In my application I need to display video frames received from a server in an Android app. The server sends video data at 50 frames per second, encoded in WebM (i.e. using libvpx to encode and decode the images). After decoding with libvpx I get YUV data, which I can display over the image layout.
The current implementation is roughly this: the JNI / native C++ code converts the YUV data to RGB data, and on the Android framework side I call
public Bitmap createImgae(byte[] bits, int width, int height, int scan) {
    Bitmap bitmap = null;
    System.out.println("video: creating bitmap");
    //try {
    bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    bitmap.copyPixelsFromBuffer(ByteBuffer.wrap(bits));
    //} catch (OutOfMemoryError ex) {
    //}
    System.out.println("video: bitmap created");
    return bitmap;
}
to create the bitmap, and then display it in an ImageView with the following code:
img = createImgae(imgRaw, imgInfo[0], imgInfo[1], 1);
if (img != null && !img.isRecycled()) {
    iv.setImageBitmap(img);
    //img.recycle();
    img = null;
    System.out.println("video: image displayed");
}
My query is: overall this takes approximately 40 ms per frame. Is there any way to optimize it?
1 -- Is there any way to display YUV data directly in an ImageView?
2 -- Is there any other way to create an image (Bitmap) from RGB data?
3 -- I believe I am creating a new image every time; I suppose I should create the bitmap only once and just supply a new buffer whenever a frame is received.
Please share your views.
The following code should solve your problem, and it may take less time on YUV data because the YuvImage class is provided by the Android SDK (note that YuvImage only accepts ImageFormat.NV21 or ImageFormat.YUY2, so your decoded frames need to be in one of those layouts).
You can try this:
ByteArrayOutputStream out = new ByteArrayOutputStream();
YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, width, height, null);
yuvImage.compressToJpeg(new Rect(0, 0, width, height), 50, out);
byte[] imageBytes = out.toByteArray();
Bitmap image = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);
iv.setImageBitmap(image);
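Keep in mind that this compresses each frame to JPEG and then decodes it again with BitmapFactory, so there is a full encode/decode round trip per frame; at 50 frames per second that overhead may be noticeable, so it is worth measuring it against the direct conversion below.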
or, doing the YUV-to-ARGB conversion yourself:
void yourFunction(byte[] data, int mWidth, int mHeight) {
    int[] mIntArray = new int[mWidth * mHeight];

    // Decode the YUV data into an array of packed ARGB pixels
    decodeYUV420SP(mIntArray, data, mWidth, mHeight);

    // Create the bitmap from the pixel array
    Bitmap bmp = Bitmap.createBitmap(mIntArray, mWidth, mHeight, Bitmap.Config.ARGB_8888);

    // Show the bitmap in the ImageView
    iv.setImageBitmap(bmp);
}
// Converts an NV21 (YUV420SP) frame into an array of packed ARGB pixels,
// which is the layout Bitmap.createBitmap(int[], ...) expects.
static public void decodeYUV420SP(int[] argb, byte[] yuv420sp, int width,
        int height) {
    final int frameSize = width * height;

    for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0)
                y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }

            // Fixed-point YUV -> RGB conversion; each channel ends up in [0, 262143]
            int y1192 = 1192 * y;
            int r = (y1192 + 1634 * v);
            int g = (y1192 - 833 * v - 400 * u);
            int b = (y1192 + 2066 * u);

            if (r < 0)
                r = 0;
            else if (r > 262143)
                r = 262143;
            if (g < 0)
                g = 0;
            else if (g > 262143)
                g = 262143;
            if (b < 0)
                b = 0;
            else if (b > 262143)
                b = 262143;

            // Pack as ARGB, shifting each 18-bit channel down to 8 bits
            argb[yp] = 0xff000000 | ((r << 6) & 0xff0000)
                    | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
        }
    }
}
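Regarding your third point: yes, you can create the Bitmap once and simply refill it for every frame instead of allocating a new one each time. Below is a minimal sketch of that idea, assuming your native code keeps handing you a width * height * 4 byte ARGB buffer as in your createImgae(); mFrameBitmap and showFrame are just placeholder names.

private Bitmap mFrameBitmap; // allocated once, reused for every frame

void showFrame(byte[] argbBytes, int width, int height) {
    // Allocate the bitmap only on the first frame (or if the frame size changes)
    if (mFrameBitmap == null
            || mFrameBitmap.getWidth() != width
            || mFrameBitmap.getHeight() != height) {
        mFrameBitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    }

    // Copy the new frame into the existing bitmap instead of creating a new one
    mFrameBitmap.copyPixelsFromBuffer(ByteBuffer.wrap(argbBytes));

    // The ImageView keeps a reference to the same bitmap, so just ask it to redraw
    iv.setImageBitmap(mFrameBitmap);
    iv.invalidate();
}

This avoids allocating (and later garbage-collecting) a full-size Bitmap for every frame, which may account for part of the 40 ms you are seeing.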