Android doesn't support many media file formats (like .mkv files) by default, but players like MXPlayer and MoboPlayer enable you to play such files. How do these applications work? Are there any programming tutorials or articles explaining how it is done?
Android does not support all media formats by default. Hence, to support an unsupported format, applications like MoboPlayer perform the standard media player tasks themselves using a combination of Java and native code via JNI.
To understand more, let us see what a media player does to play a media file and how these tasks are performed by MoboPlayer-like apps on Android.
Typical tasks of a Media Player
A player needs to perform the following tasks: (a) demux the container format and extract the video/audio streams, (b) decode the video and the audio, and (c) display the video and play the audio. Let us consider how these three areas can be handled in a MoboPlayer-like player.
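As a rough sketch of how the Java/native split can look (the package, class, and method names below are hypothetical, purely for illustration): the Java layer loads a native library built against FFmpeg and hands it the file path and a Surface, and the native side runs the demux/decode/render loop.

```c
#include <jni.h>

/* Hypothetical JNI entry point. The Java side would declare a matching
 * "private native int nativeOpen(String path, Surface surface)" in a class
 * named com.example.player.NativePlayer and call System.loadLibrary(). */
JNIEXPORT jint JNICALL
Java_com_example_player_NativePlayer_nativeOpen(JNIEnv *env, jobject thiz,
                                                jstring jpath, jobject jsurface)
{
    const char *path = (*env)->GetStringUTFChars(env, jpath, NULL);

    /* ... open the file with libavformat, set up decoders with libavcodec,
     * wrap jsurface in an ANativeWindow for rendering (see the sections
     * below), and run the demux/decode/render loop on a worker thread ... */

    (*env)->ReleaseStringUTFChars(env, jpath, path);
    return 0; /* 0 = success in this sketch */
}
```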
File format Demux
Android does not support all container formats by default. For example, ASF (the container format of WMV files) is not supported. Hence a player needs to have its own demuxer to do the job. libavformat (part of the FFmpeg project) is a widely used open-source demuxing library.
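For illustration, a minimal demux loop with libavformat could look like the sketch below (error handling trimmed, and the exact calls vary slightly across FFmpeg versions):

```c
#include <libavformat/avformat.h>

/* Minimal sketch: open a container with libavformat and pull out
 * compressed packets, separating video packets by stream index. */
int demux_file(const char *path)
{
    AVFormatContext *fmt = NULL;
    int video_stream = -1;

    av_register_all();                                    /* needed on older FFmpeg releases */
    if (avformat_open_input(&fmt, path, NULL, NULL) < 0)  /* probes the container format */
        return -1;
    if (avformat_find_stream_info(fmt, NULL) < 0)
        return -1;

    /* Locate the first video stream in the container. */
    for (unsigned i = 0; i < fmt->nb_streams; i++)
        if (fmt->streams[i]->codecpar->codec_type == AVMEDIA_TYPE_VIDEO)
            video_stream = (int)i;

    AVPacket pkt;
    while (av_read_frame(fmt, &pkt) >= 0) {    /* one demuxed packet at a time */
        if (pkt.stream_index == video_stream) {
            /* hand the compressed packet to the video decoder (next section) */
        }
        av_packet_unref(&pkt);
    }

    avformat_close_input(&fmt);
    return 0;
}
```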
Video/Audio decode
libavcodec (also part of the FFmpeg project) is a widely used open-source decoder library that decodes the demuxed streams and produces uncompressed raw output frames. Hence it is often used by players like MoboPlayer.
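A corresponding decode step with libavcodec, sketched here against the send/receive API of newer FFmpeg releases (older releases used avcodec_decode_video2/avcodec_decode_audio4 instead):

```c
#include <libavcodec/avcodec.h>

/* Minimal sketch: decode one compressed packet into raw frames.
 * 'dec' is an AVCodecContext already opened with avcodec_open2() for the
 * codec found during demuxing; 'frame' was allocated with av_frame_alloc(). */
int decode_packet(AVCodecContext *dec, AVPacket *pkt, AVFrame *frame)
{
    if (avcodec_send_packet(dec, pkt) < 0)    /* feed compressed data in */
        return -1;

    /* One packet can yield zero or more decoded frames. */
    while (avcodec_receive_frame(dec, frame) == 0) {
        /* 'frame' now holds uncompressed data, e.g. YUV420P video or
         * PCM audio samples, ready for display or playout. */
    }
    return 0;
}
```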
Video display
There are two options to display the video: players use either SurfaceView buffers or OpenGL-accelerated buffers. Android allows the Java Surface object (an abstraction of the underlying SurfaceFlinger surface) to be converted to an internal native object called NativeWindow (ANativeWindow in the NDK), which can then be accessed by native code.
From ICS (Android 4.0) onwards, Android also provides access to OpenGL-accelerated buffers (SurfaceTexture and TextureView). These can be used to display the video as well.
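For illustration, the native side can wrap the Surface handed down from Java in an ANativeWindow and copy decoded pixels into it. The sketch below assumes the decoded frame has already been converted to RGBA (for example with libswscale):

```c
#include <jni.h>
#include <stdint.h>
#include <string.h>
#include <android/native_window.h>
#include <android/native_window_jni.h>

/* Minimal sketch: copy one RGBA frame into the Surface's buffer.
 * 'jsurface' is the android.view.Surface passed down through JNI. */
void render_frame(JNIEnv *env, jobject jsurface,
                  const uint8_t *rgba, int width, int height)
{
    ANativeWindow *win = ANativeWindow_fromSurface(env, jsurface);
    ANativeWindow_setBuffersGeometry(win, width, height,
                                     WINDOW_FORMAT_RGBA_8888);

    ANativeWindow_Buffer buf;
    if (ANativeWindow_lock(win, &buf, NULL) == 0) {
        /* Copy line by line because the window stride may be padded. */
        for (int y = 0; y < height; y++)
            memcpy((uint8_t *)buf.bits + y * buf.stride * 4,
                   rgba + y * width * 4,
                   width * 4);
        ANativeWindow_unlockAndPost(win);   /* queue the buffer for display */
    }
    ANativeWindow_release(win);
}
```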
Audio playout
Players can directly interact with the AudioTrack object provided at the Java level to play the decoded audio samples.
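Since the decoded PCM samples are produced in native code while AudioTrack lives at the Java level, one common pattern is to push each buffer up to an AudioTrack through JNI. The sketch below assumes the Java side has already created the AudioTrack in MODE_STREAM, called play(), and passed the object down:

```c
#include <jni.h>
#include <stdint.h>

/* Minimal sketch: write decoded 16-bit PCM samples into a Java
 * android.media.AudioTrack via its write(byte[], int, int) method.
 * 'audio_track' is a reference to an AudioTrack created on the Java
 * side with AudioTrack.MODE_STREAM that is already play()-ing. */
void play_pcm(JNIEnv *env, jobject audio_track,
              const uint8_t *pcm, int num_bytes)
{
    jclass cls = (*env)->GetObjectClass(env, audio_track);
    jmethodID write = (*env)->GetMethodID(env, cls, "write", "([BII)I");

    jbyteArray jbuf = (*env)->NewByteArray(env, num_bytes);
    (*env)->SetByteArrayRegion(env, jbuf, 0, num_bytes, (const jbyte *)pcm);

    /* AudioTrack.write() blocks until the samples are queued for playout. */
    (*env)->CallIntMethod(env, audio_track, write, jbuf, 0, num_bytes);

    (*env)->DeleteLocalRef(env, jbuf);
    (*env)->DeleteLocalRef(env, cls);
}
```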