So far I have been able to set up a MediaCodec to encode a video stream. The aim is to save my user-generated artwork into a video file.
I use Android Bitmap objects of the user artwork to push frames into the stream.
See the code snippet at the bottom of this post (it is the full code; nothing is trimmed):
MediaCodec uses ByteBuffer to deal with video/audio streams.
Bitmaps are based on int[], which, if converted to byte[], requires four times the size of the int[].
I did some research to figure out what contracts are in place for the ByteBuffer when dealing with video/audio streams in MediaCodec, but found next to nothing.
So, what are the ByteBuffer usage contracts in MediaCodec?
Does specifying the frame dimensions in the MediaFormat automatically mean that the ByteBuffers have a capacity of width * height * 4 bytes?
(I use one Bitmap object per frame.)
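For concreteness, here is the arithmetic I am worried about: the raw ARGB size of a Bitmap's pixels versus what I understand a YUV 4:2:0 frame (the color format I configure below) should take. The class and method names are just for illustration:

```java
public class FrameSizes {
    // Bytes needed to hold width*height ARGB pixels as raw bytes
    // (a Bitmap's int[] pixels are 4 bytes per pixel).
    static int argbBytes(int width, int height) {
        return width * height * 4;
    }

    // Bytes in one YUV 4:2:0 frame: a full-resolution Y plane plus
    // quarter-resolution U and V planes, i.e. 1.5 bytes per pixel.
    static int yuv420Bytes(int width, int height) {
        return width * height * 3 / 2;
    }

    public static void main(String[] args) {
        System.out.println(argbBytes(720, 1280));   // 3686400
        System.out.println(yuv420Bytes(720, 1280)); // 1382400
    }
}
```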
Thanks for any help.
(edited, code added)
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.nio.ByteBuffer;

import android.graphics.Rect;
import android.graphics.Bitmap.CompressFormat;
import android.media.MediaCodec;
import android.media.MediaCodec.BufferInfo;
import android.media.CamcorderProfile;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.util.Log;
import android.view.View;

public class VideoCaptureManager {
    private boolean running;
    private long presentationTime;

    public void start(View rootView, String saveFilePath){
        Log.e("OUT", saveFilePath);
        this.running = true;
        this.presentationTime = 0;
        this.capture(rootView, saveFilePath);
    }

    private void capture(final View rootView, String saveFilePath){
        if(rootView != null){
            rootView.setDrawingCacheEnabled(true);
            final Rect drawingRect = new Rect();
            rootView.getDrawingRect(drawingRect);
            try{
                final File file = new File(saveFilePath);
                if(file.exists()){
                    // File exists, return
                    return;
                } else {
                    File parent = file.getParentFile();
                    if(!parent.exists()){
                        parent.mkdirs();
                    }
                }
                new Thread(){
                    public void run(){
                        try{
                            DataOutputStream dos = new DataOutputStream(new FileOutputStream(file));
                            MediaCodec codec = MediaCodec.createEncoderByType("video/mp4v-es");
                            MediaFormat mediaFormat = null;
                            if(CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P)){
                                mediaFormat = MediaFormat.createVideoFormat("video/mp4v-es", 720, 1280);
                            } else {
                                mediaFormat = MediaFormat.createVideoFormat("video/mp4v-es", 480, 720);
                            }
                            mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 700000);
                            mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 10);
                            mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
                            mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
                            codec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
                            codec.start();

                            ByteBuffer[] inputBuffers = codec.getInputBuffers();
                            ByteBuffer[] outputBuffers = codec.getOutputBuffers();

                            while(VideoCaptureManager.this.running){
                                try{
                                    int inputBufferIndex = codec.dequeueInputBuffer(-2);
                                    if(inputBufferIndex >= 0){
                                        // Fill in the bitmap bytes
                                        // inputBuffers[inputBufferIndex].
                                        ByteArrayOutputStream baos = new ByteArrayOutputStream();
                                        rootView.getDrawingCache().compress(CompressFormat.JPEG, 80, baos);
                                        inputBuffers[inputBufferIndex].put(baos.toByteArray());

                                        codec.queueInputBuffer(inputBufferIndex, 0, inputBuffers[inputBufferIndex].capacity(), presentationTime, MediaCodec.BUFFER_FLAG_CODEC_CONFIG);
                                        presentationTime += 100;
                                    }

                                    BufferInfo info = new BufferInfo();
                                    int outputBufferIndex = codec.dequeueOutputBuffer(info, -2);
                                    if(outputBufferIndex >= 0){
                                        // Write the bytes to file
                                        byte[] array = outputBuffers[outputBufferIndex].array(); // THIS THROWS AN EXCEPTION. WHAT IS THE CONTRACT TO DEAL WITH ByteBuffer in this code?
                                        if(array != null){
                                            dos.write(array);
                                        }
                                        codec.releaseOutputBuffer(outputBufferIndex, false);
                                    } else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED){
                                        outputBuffers = codec.getOutputBuffers();
                                    } else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED){
                                        // Codec format has changed
                                        MediaFormat format = codec.getOutputFormat();
                                    }

                                    Thread.sleep(100);
                                }catch(Throwable th){
                                    Log.e("OUT", th.getMessage(), th);
                                }
                            }

                            codec.stop();
                            codec.release();
                            codec = null;

                            dos.flush();
                            dos.close();
                        }catch(Throwable th){
                            Log.e("OUT", th.getMessage(), th);
                        }
                    }
                }.start();
            }catch(Throwable th){
                Log.e("OUT", th.getMessage(), th);
            }
        }
    }

    public void stop(){
        this.running = false;
    }
}
The exact layout of the ByteBuffer is determined by the codec for the input format you've chosen. Not all devices support all possible input formats (e.g. some AVC encoders require planar 420 YUV, others require semi-planar). Older versions of Android (<= API 17) didn't really provide a portable way to software-generate video frames for MediaCodec.
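The runtime check can be sketched in plain Java. The constant values below match the documented MediaCodecInfo.CodecCapabilities constants; on a device the supported array would come from the encoder's advertised capabilities (via getCapabilitiesForType(mime).colorFormats). The class and method names here are illustrative, not part of the Android API:

```java
public class ColorFormatPicker {
    // Values of the documented MediaCodecInfo.CodecCapabilities constants:
    static final int COLOR_FormatYUV420Planar = 19;
    static final int COLOR_FormatYUV420SemiPlanar = 21;

    // Pick a recognized 4:2:0 format from a codec's advertised list,
    // preferring whichever appears first; returns -1 if neither is there.
    static int pick(int[] supported) {
        for (int format : supported) {
            if (format == COLOR_FormatYUV420Planar
                    || format == COLOR_FormatYUV420SemiPlanar) {
                return format;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        // e.g. an encoder that only advertises semi-planar input:
        System.out.println(pick(new int[] {COLOR_FormatYUV420SemiPlanar})); // 21
    }
}
```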
In Android 4.3 (API 18), you have two options. First, MediaCodec now accepts input from a Surface, which means anything you can draw with OpenGL ES can be recorded as a movie. See, for example, the EncodeAndMuxTest sample.
Second, you still have the option of using software-generated YUV 420 buffers, but now they're more likely to work because there are CTS tests that exercise them. You still have to do runtime detection of planar or semi-planar, but there's really only two layouts. See the buffer-to-buffer variants of the EncodeDecodeTest for an example.
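As a rough sketch of how those two layouts differ (plain Java; class and method names are illustrative, and a real converter also needs the RGB-to-YUV color math), both hold the same Y, U, and V samples but order the chroma bytes differently:

```java
public class Yuv420Layouts {
    // Planar (I420-style): all Y samples, then all U, then all V.
    static byte[] packPlanar(byte[] y, byte[] u, byte[] v) {
        byte[] out = new byte[y.length + u.length + v.length];
        System.arraycopy(y, 0, out, 0, y.length);
        System.arraycopy(u, 0, out, y.length, u.length);
        System.arraycopy(v, 0, out, y.length + u.length, v.length);
        return out;
    }

    // Semi-planar (NV12-style): all Y samples, then interleaved U/V pairs.
    static byte[] packSemiPlanar(byte[] y, byte[] u, byte[] v) {
        byte[] out = new byte[y.length + u.length + v.length];
        System.arraycopy(y, 0, out, 0, y.length);
        for (int i = 0; i < u.length; i++) {
            out[y.length + 2 * i]     = u[i];
            out[y.length + 2 * i + 1] = v[i];
        }
        return out;
    }

    public static void main(String[] args) {
        // A toy 4x2 frame: 8 Y samples, 2 U, 2 V (one U/V pair per 2x2 block).
        byte[] y = {1, 2, 3, 4, 5, 6, 7, 8};
        byte[] u = {20, 21};
        byte[] v = {30, 31};
        System.out.println(java.util.Arrays.toString(packPlanar(y, u, v)));
        // [1, 2, 3, 4, 5, 6, 7, 8, 20, 21, 30, 31]
        System.out.println(java.util.Arrays.toString(packSemiPlanar(y, u, v)));
        // [1, 2, 3, 4, 5, 6, 7, 8, 20, 30, 21, 31]
    }
}
```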