I am building an Android application where an ExoPlayer plays a video onto the surface of a SurfaceView, and I am investigating whether it is possible to dynamically blur the playing video.
Blurring techniques that involve first generating a bitmap of the view will not work, since the surface part of a `SurfaceView` is composited in a separate layer and does not appear in bitmap captures of the view hierarchy.
Older versions of Android had built-in blur effects for surfaces and views (e.g. Surface.FX_SURFACE_BLUR), but these seem to have been deprecated in newer APIs.
Can anyone share some insight on how a surface can be dynamically blurred? Thank you.
There are lots of questions on StackOverflow with small bits and pieces of what needs to be done. I'll go over the method I used and hopefully it will be useful to somebody.
If this were a static blur of a single video frame, it would be sufficient to play the video in a `TextureView`, grab a frame with its `getBitmap()` method, and blur the resulting `Bitmap` with a tool such as RenderScript. However, `getBitmap()` runs on the main UI thread, and hence lags behind the video whose frames it is trying to copy.
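For completeness, the static version is straightforward. Here is a minimal sketch, assuming API level 17+ for `ScriptIntrinsicBlur`; the class and method names are mine:

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.renderscript.Allocation;
import android.renderscript.Element;
import android.renderscript.RenderScript;
import android.renderscript.ScriptIntrinsicBlur;
import android.view.TextureView;

public final class StaticFrameBlur {

    // Copies the current video frame out of the TextureView and blurs it.
    // getBitmap() runs on the main UI thread, which is exactly why this
    // approach cannot keep up with live playback.
    public static Bitmap blurCurrentFrame(Context context, TextureView textureView) {
        Bitmap frame = textureView.getBitmap();
        RenderScript rs = RenderScript.create(context);
        Allocation in = Allocation.createFromBitmap(rs, frame);
        Allocation out = Allocation.createTyped(rs, in.getType());
        ScriptIntrinsicBlur blur = ScriptIntrinsicBlur.create(rs, Element.U8_4(rs));
        blur.setRadius(25f); // 25 is the maximum radius ScriptIntrinsicBlur accepts
        blur.setInput(in);
        blur.forEach(out);
        out.copyTo(frame);
        rs.destroy();
        return frame;
    }
}
```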
To perform a blur for every frame, the best approach seems to be to use a `GLSurfaceView` with a custom renderer. I used the code available in VidEffects, pointed to from this answer, as a good starting point.
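The essence of that approach is to have ExoPlayer decode into an OpenGL external texture rather than a plain surface, so every frame passes through your own shader. Below is a skeleton of such a renderer; the names are mine, not VidEffects', and attaching the output assumes ExoPlayer's `setVideoSurface()`:

```java
import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.view.Surface;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

// Minimal renderer skeleton: the video decodes into an external OES texture,
// and every frame is redrawn through our own shader program.
public class BlurVideoRenderer implements GLSurfaceView.Renderer {

    public interface SurfaceListener { void onSurfaceReady(Surface surface); }

    private final SurfaceListener listener;
    private SurfaceTexture surfaceTexture;

    public BlurVideoRenderer(SurfaceListener listener) { this.listener = listener; }

    @Override public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
        surfaceTexture = new SurfaceTexture(tex[0]);
        // Hand the Surface to the player, e.g. player.setVideoSurface(surface)
        listener.onSurfaceReady(new Surface(surfaceTexture));
        // ...compile and link the blur shader program here...
    }

    @Override public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
    }

    @Override public void onDrawFrame(GL10 gl) {
        surfaceTexture.updateTexImage(); // latch the newest video frame
        // ...draw a full-screen quad sampling the OES texture with the blur shader...
    }
}
```

To avoid redrawing continuously, you can set the `GLSurfaceView` to `RENDERMODE_WHEN_DIRTY` and call `requestRender()` from a `SurfaceTexture.OnFrameAvailableListener`, so the view only redraws when a new video frame arrives.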
Blurs with large radii can be very computationally intensive, which is why I first approached the blur with two separate fragment shaders (one to blur horizontally, and one to blur the result vertically). I actually ended up using only one fragment shader that applies a 7x7 Gaussian kernel. A very important thing to keep in mind if your `GLSurfaceView` is large is to call `setFixedSize()` on the `GLSurfaceView`'s `SurfaceHolder` to make its resolution lower than that of the screen. The result does not look very pixelated, since it is blurred anyway, but the performance increase is very significant.
The blur I made managed to play at 24fps on most devices, with `setFixedSize()` pinning the resolution to 100x70.
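For reference, that is a single call on the holder, made before the surface is created (assuming `glSurfaceView` is your `GLSurfaceView`):

```java
// Render at 100x70; the compositor scales the result up to the view's bounds,
// and the blur hides the pixelation.
glSurfaceView.getHolder().setFixedSize(100, 70);
```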