What I am trying to do: use Android's MediaCodec to encode raw PCM audio samples into a raw AAC file.
The problem I have: when I use FFmpeg to pack the generated raw AAC file into an M4A container, FFmpeg complains about missing codec parameters in the file.
Details:
Since I can't find any MediaCodec sample code for an audio encoder that writes an output AAC file, I tried to modify a video encoder sample into an audio encoder. The original code is here: source_code
I configured the audio encoder like this:
mEncoderFormat = MediaFormat.createAudioFormat("audio/mp4a-latm", (int)mAudioSampleRate, 2);
// redundant?
mEncoderFormat.setString(MediaFormat.KEY_MIME, "audio/mp4a-latm");
mEncoderFormat.setInteger(MediaFormat.KEY_AAC_PROFILE,
        MediaCodecInfo.CodecProfileLevel.AACObjectELD);
mEncoderFormat.setInteger(MediaFormat.KEY_SAMPLE_RATE, kSampleRates);
mEncoderFormat.setInteger(MediaFormat.KEY_BIT_RATE, kBitRates);
mEncoderFormat.setInteger(MediaFormat.KEY_CHANNEL_COUNT, 2);
testEncoderWithFormat("audio/mp4a-latm", mEncoderFormat);
try {
    codec.configure(
            mEncoderFormat,
            null /* surface */,
            null /* crypto */,
            MediaCodec.CONFIGURE_FLAG_ENCODE);
} catch (IllegalStateException e) {
    Log.e(TAG, "codec '" + componentName + "' failed configuration.");
    return;
}
Log.d(TAG, "  testEncoder configured with format = " + format);
Then I feed the encoder 10 ms worth of PCM samples per frame. For each input frame, the encoder produces one frame of bitstream, which I write to a FileOutputStream. The loop continues until the end of the input file.
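For context, this is roughly the shape of my feed/drain loop. It's a simplified sketch, not my exact code; pcmStream, mEncoderFormat, and mFileStream are placeholder names for my own PCM input stream, media format, and output file:

// Simplified feed/drain loop (synchronous ByteBuffer API, API 16+).
// pcmStream: InputStream of 16-bit PCM input (placeholder name)
// mFileStream: FileOutputStream for the encoded output (placeholder name)
MediaCodec codec = MediaCodec.createEncoderByType("audio/mp4a-latm");
codec.configure(mEncoderFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
codec.start();

ByteBuffer[] inputBuffers = codec.getInputBuffers();
ByteBuffer[] outputBuffers = codec.getOutputBuffers();
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
boolean inputDone = false;
boolean outputDone = false;

while (!outputDone) {
    if (!inputDone) {
        int inIndex = codec.dequeueInputBuffer(10000 /* us */);
        if (inIndex >= 0) {
            ByteBuffer inBuf = inputBuffers[inIndex];
            inBuf.clear();
            byte[] chunk = new byte[inBuf.limit()];
            int read = pcmStream.read(chunk);
            if (read < 0) {
                // no more PCM: signal end of stream to the encoder
                codec.queueInputBuffer(inIndex, 0, 0, 0,
                        MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                inputDone = true;
            } else {
                inBuf.put(chunk, 0, read);
                codec.queueInputBuffer(inIndex, 0, read, 0 /* presentationTimeUs */, 0);
            }
        }
    }

    int outIndex = codec.dequeueOutputBuffer(info, 10000 /* us */);
    if (outIndex >= 0) {
        // info.offset/info.size delimit one raw AAC access unit (no ADTS header yet)
        ByteBuffer outBuf = outputBuffers[outIndex];
        byte[] aac = new byte[info.size];
        outBuf.position(info.offset);
        outBuf.get(aac);
        mFileStream.write(aac);
        codec.releaseOutputBuffer(outIndex, false /* render */);
        if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            outputDone = true;
        }
    } else if (outIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        outputBuffers = codec.getOutputBuffers();
    }
}

codec.stop();
codec.release();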
The code runs to completion. I use 'adb pull' to copy the generated AAC file from the device to my PC, and then use FFmpeg to read it. Below is the command and the error FFmpeg spits out:
$ ffmpeg -f aac -i BlessedNoColor_nexus7_api18.aac
ffmpeg version N-45739-g04bf2e7 Copyright (c) 2000-2012 the FFmpeg developers
built on Oct 20 2012 00:20:36 with gcc 4.7.2 (GCC)
configuration: --enable-gpl --enable-version3 --disable-pthreads --enable-runtime-cpudetect --enable-avisynth --enable-bzlib --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libfreetype --enable-libgsm --enable-libmp3lame --enable-libnut --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libutvideo --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxavs --enable-libxvid --enable-zlib
libavutil 51. 76.100 / 51. 76.100
libavcodec 54. 67.100 / 54. 67.100
libavformat 54. 33.100 / 54. 33.100
libavdevice 54. 3.100 / 54. 3.100
libavfilter 3. 19.103 / 3. 19.103
libswscale 2. 1.101 / 2. 1.101
libswresample 0. 16.100 / 0. 16.100
libpostproc 52. 1.100 / 52. 1.100
[aac @ 00000000002efae0] channel element 2.0 is not allocated
[aac @ 00000000003cf520] decoding for stream 0 failed
[aac @ 00000000003cf520] Could not find codec parameters for stream 0 (Audio: aac, 0 channels, s16): unspecified sample rate
Consider increasing the value for the 'analyzeduration' and 'probesize' options
[aac @ 00000000003cf520] Estimating duration from bitrate, this may be inaccurate
BlessedNoColor_nexus7_api18.aac: could not find codec parameters
My question: what am I missing that leaves the generated AAC file without the codec parameters FFmpeg is looking for?
Any help will be deeply appreciated. It'd be great if there were a sample project that does what I'm trying to do here. If my source code can help you help me, I'll post it after some cleanup. Thanks!
Edit: Changed the title from "Elementary AAC file generated by MediaCodec missing codec parameters" to "How to generate the AAC ADTS elementary stream with Android MediaCodec"
I finally generated AAC files that are playable on both the Android device and the Windows host computer. I am posting my solution here, hoping it could help others.
First, my previous assumption that the Android MediaCodec encoder generates an elementary AAC stream was not accurate. The MediaCodec encoder generates raw AAC access units, which is why the files could not be played: the raw AAC data has to be wrapped in a playable format, such as an ADTS stream. I have changed the title of this post to reflect my new understanding. Another post asked a similar question and has an excellent answer, but a novice may not necessarily follow the brief description there; I didn't quite get it the first time I read it.
So, in order to generate an AAC bitstream that media players can handle, I started from the EncoderTest example given by fadden in his first comment, but modified it to prepend an ADTS header to each output frame (access unit) and to write the resulting stream to a file. I replaced lines 248 through 267 of the original code with the following snippet:
if (index >= 0) {
    int outBitsSize = info.size;
    int outPacketSize = outBitsSize + 7;    // 7 bytes for the ADTS header
    ByteBuffer outBuf = codecOutputBuffers[index];

    outBuf.position(info.offset);
    outBuf.limit(info.offset + outBitsSize);
    try {
        byte[] data = new byte[outPacketSize];      // space for the ADTS header included
        addADTStoPacket(data, outPacketSize);
        outBuf.get(data, 7, outBitsSize);           // copy the AAC payload after the header
        outBuf.position(info.offset);
        mFileStream.write(data, 0, outPacketSize);  // open the FileOutputStream beforehand
    } catch (IOException e) {
        Log.e(TAG, "failed writing bitstream data to file");
        e.printStackTrace();
    }

    numBytesDequeued += info.size;

    outBuf.clear();
    codec.releaseOutputBuffer(index, false /* render */);
    Log.d(TAG, "  dequeued " + outBitsSize + " bytes of output data.");
    Log.d(TAG, "  wrote " + outPacketSize + " bytes into output file.");
} else if (index == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
    codecOutputBuffers = codec.getOutputBuffers();
} else if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    // the new output format is available via codec.getOutputFormat()
}
Outside the loop, I defined the function addADTStoPacket like this:
/**
 * Add an ADTS header at the beginning of each and every AAC packet.
 * This is needed because the MediaCodec encoder generates packets of
 * raw AAC data.
 *
 * Note that packetLen must include the ADTS header itself.
 **/
private void addADTStoPacket(byte[] packet, int packetLen) {
    int profile = 2;  // AAC LC
                      // 39 = MediaCodecInfo.CodecProfileLevel.AACObjectELD
    int freqIdx = 4;  // 44.1 kHz
    int chanCfg = 2;  // CPE

    // fill in ADTS data
    packet[0] = (byte) 0xFF;
    packet[1] = (byte) 0xF9;
    packet[2] = (byte) (((profile - 1) << 6) + (freqIdx << 2) + (chanCfg >> 2));
    packet[3] = (byte) (((chanCfg & 3) << 6) + (packetLen >> 11));
    packet[4] = (byte) ((packetLen & 0x7FF) >> 3);
    packet[5] = (byte) (((packetLen & 7) << 5) + 0x1F);
    packet[6] = (byte) 0xFC;
}
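The profile, freqIdx and chanCfg values above are hard-coded for AAC LC at 44.1 kHz, stereo. If your encoder is configured differently, freqIdx has to be looked up from the MPEG-4 sampling frequency index table. A small sketch of such a lookup follows; the table values come from the MPEG-4 audio spec, but the helper name and fallback are my own, not part of the original code:

// Hypothetical helper: map a sample rate to the MPEG-4 sampling frequency index
// used in byte 2 of the ADTS header.
private static final int[] ADTS_SAMPLE_RATES = {
        96000, 88200, 64000, 48000, 44100, 32000,
        24000, 22050, 16000, 12000, 11025, 8000, 7350
};

private static int getFreqIdx(int sampleRate) {
    for (int i = 0; i < ADTS_SAMPLE_RATES.length; i++) {
        if (ADTS_SAMPLE_RATES[i] == sampleRate) {
            return i;
        }
    }
    return 4;  // fall back to 44.1 kHz if the rate is not in the table
}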
I also added code to control when to stop generating the AAC ADTS stream, but that's application specific, so I won't detail it here. With all these changes, the generated AAC files play on the Android device and on my Windows PC, and ffmpeg is happy with them.
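For completeness, once the file is a proper ADTS stream, the original goal of packing it into an M4A container works with a plain stream copy. Something like the following should do it (the output name is just an example; newer ffmpeg builds may insert the aac_adtstoasc bitstream filter automatically, in which case the -bsf:a option can be dropped):

$ ffmpeg -i BlessedNoColor_nexus7_api18.aac -c:a copy -bsf:a aac_adtstoasc BlessedNoColor_nexus7_api18.m4a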