I'm using the Android MediaCodec API to decode a video stored on the file system. I get an output buffer that looks legitimate (with a sensible BufferInfo.offset and size). Its format appears to be 256, which I took to be JPEG. I tried decoding it with BitmapFactory.decodeByteArray, but the result was null.

Does anyone know the correct way to ascertain the format of the output buffer? What's the correct way to start decoding the output byte arrays?

1 Answer

The MediaCodec color formats are defined by the MediaCodecInfo.CodecCapabilities class. 256 is used internally, and generally doesn't mean that you have a buffer of JPEG data. The confusion here is likely because you're looking at constants in the ImageFormat class, but those only apply to camera output. (For example, ImageFormat.NV16 is a YCbCr format, while COLOR_Format32bitARGB8888 is RGB, but both have the numeric value 16.)
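As a minimal sketch of checking the format correctly (assuming a started MediaCodec instance named `decoder`), query the decoder's negotiated output MediaFormat and compare its color format against the CodecCapabilities constants rather than ImageFormat:

```java
// Hypothetical sketch: `decoder` is a started android.media.MediaCodec.
MediaFormat outputFormat = decoder.getOutputFormat();
int colorFormat = outputFormat.getInteger(MediaFormat.KEY_COLOR_FORMAT);

// Compare against MediaCodecInfo.CodecCapabilities constants, not ImageFormat.
if (colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar
        || colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar) {
    // The output is raw (uncompressed) YUV pixel data, not a JPEG,
    // so BitmapFactory.decodeByteArray() will return null for it.
}
```

Note that a raw YUV buffer has to be converted to RGB (or wrapped via YuvImage) before it can become a Bitmap; there is no compressed image header for BitmapFactory to parse.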

Some examples of MediaCodec usage, including links to CTS tests that exercise MediaCodec, can be found here. On some devices you will not be able to decode data from the ByteBuffer output, and must instead decode to a Surface.
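A rough sketch of the Surface-based approach (assuming you already have a `MediaFormat` from MediaExtractor and a `Surface` from a SurfaceView or TextureView):

```java
// Sketch, not a complete player: configure the decoder to render into a
// Surface instead of producing ByteBuffer output. `format` and `surface`
// are assumed to come from MediaExtractor and your UI, respectively.
String mime = format.getString(MediaFormat.KEY_MIME);
MediaCodec decoder = MediaCodec.createDecoderByType(mime);
decoder.configure(format, surface, null, 0);  // non-null Surface => rendered output
decoder.start();
// With a Surface attached, call releaseOutputBuffer(index, true) for each
// frame to render it; the raw pixel data never passes through your code,
// which is why this works even on devices with opaque vendor formats.
```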

2 Comments

I actually figured this out. MediaCodec signals an "output format changed" event (dequeueOutputBuffer returns MediaCodec.INFO_OUTPUT_FORMAT_CHANGED), after which getOutputFormat() returns a new MediaFormat object carrying the color format of the output buffers. In my case it wasn't 256 (JPEG data); it was one of the constants from the MediaCodecInfo.CodecCapabilities class - developer.android.com/reference/android/media/…
Yup. If you look at EncodeDecodeTest.java on the page I linked, it gets the format at line 619, and passes it to checkFrame(). At line 966 the color format is compared against the known formats to decide how to proceed.
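The drain loop the comments describe can be sketched as follows (a hedged outline, assuming a started decoder named `decoder` and a `TIMEOUT_US` constant of your choosing; on most decoders INFO_OUTPUT_FORMAT_CHANGED arrives before the first frame):

```java
// Sketch of draining decoder output and catching the format-changed event.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
boolean done = false;
while (!done) {
    int index = decoder.dequeueOutputBuffer(info, TIMEOUT_US);
    if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        // The real color format becomes available here.
        MediaFormat newFormat = decoder.getOutputFormat();
        int colorFormat = newFormat.getInteger(MediaFormat.KEY_COLOR_FORMAT);
        // e.g. COLOR_FormatYUV420Planar or COLOR_FormatYUV420SemiPlanar
    } else if (index >= 0) {
        ByteBuffer buf = decoder.getOutputBuffer(index);  // API 21+
        // info.offset / info.size delimit the valid region of raw YUV data.
        decoder.releaseOutputBuffer(index, false);
        done = (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
    }
}
```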
