I want to stream a video recording from my Android phone to a network media server.
The first problem is that when setting the MediaRecorder output to a socket, the stream is missing some mdat size headers. This can be fixed by preprocessing the stream locally and inserting the missing data, so that the result is a valid output stream.
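Roughly, the recorder is wired up like this (a simplified sketch; the server host, port, and error handling are placeholders, not my exact code):

    import android.media.MediaRecorder;
    import android.os.ParcelFileDescriptor;

    import java.io.IOException;
    import java.net.Socket;

    public class SocketRecorder {
        // Placeholder endpoint for the media server.
        private static final String SERVER_HOST = "192.168.1.10";
        private static final int SERVER_PORT = 1234;

        public MediaRecorder startStreaming() throws IOException {
            // Hand the socket's file descriptor straight to MediaRecorder.
            Socket socket = new Socket(SERVER_HOST, SERVER_PORT);
            ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

            MediaRecorder recorder = new MediaRecorder();
            recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
            recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
            // The socket descriptor is not seekable, so MediaRecorder never
            // goes back to patch the size fields -- hence the broken headers
            // that have to be fixed up before forwarding the stream.
            // (A real recorder also needs camera permission and a preview surface.)
            recorder.setOutputFile(pfd.getFileDescriptor());
            recorder.prepare();
            recorder.start();
            return recorder;
        }
    }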
The question is how to proceed from there.
How can I go about output that stream as an RTMP stream?
First, let's unpack your question. As you've surmised, RTMP isn't currently supported by Android. You can use a few third-party libraries to add support, but these may not be full implementations, or they may have undesirable side effects and bugs that cause them to fall short of your needs.
The common alternative in this case is to use RTSP. It provides a comparable session format that has its own RFC, and its packet structure when combined with RTP is very similar (sans some details) to your desired protocol. You could perform the necessary fixups here to transmute RTP/RTSP into RTMP, but as mentioned, such effort is currently outside the development scope of your application.
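To make that comparison concrete, here is a minimal sketch of wrapping an encoded payload in the fixed 12-byte RTP header from RFC 3550. The field values (payload type, SSRC) are arbitrary examples chosen for illustration and are not tied to any particular library:

    import java.nio.ByteBuffer;

    public class RtpPacketizer {
        private static final int RTP_VERSION = 2;

        /**
         * Wraps one payload (e.g. an H.264 NAL unit) in the fixed 12-byte
         * RTP header defined by RFC 3550. Fragmenting large NAL units
         * (RFC 6184 FU-A) is deliberately left out of this sketch.
         */
        public static byte[] buildPacket(byte[] payload, int payloadType,
                                         int sequenceNumber, long timestamp,
                                         int ssrc, boolean marker) {
            // ByteBuffer defaults to big-endian, i.e. network byte order.
            ByteBuffer buf = ByteBuffer.allocate(12 + payload.length);
            buf.put((byte) (RTP_VERSION << 6));                 // V=2, P=0, X=0, CC=0
            buf.put((byte) ((marker ? 0x80 : 0) | (payloadType & 0x7F)));
            buf.putShort((short) sequenceNumber);               // increments per packet
            buf.putInt((int) timestamp);                        // media clock (90 kHz for video)
            buf.putInt(ssrc);                                   // stream identifier
            buf.put(payload);
            return buf.array();
        }
    }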
So, let's assume you would like to use RTMP (invalidating this thread) and that the above-linked library does not meet your needs.
You could, for example, follow this tutorial for recording and playback using Livu, Wowza, and Adobe Flash Player, and talk with the Livu developer(s) about licensing their client. Or, you could use this client library and its full Android recorder example to build your client.
To summarize:

- RTMP isn't natively supported on Android; the third-party libraries that add it may be incomplete or buggy.
- RTSP/RTP is the closer fit if you can change protocols.
- If it must be RTMP, build on an existing client: Livu with Wowza, or the client library and Android recorder example linked above.
Best of luck with your application. I admit that I have a less than comprehensive understanding of all of these libraries, but these appear to be the standard solutions in this space at the time of this writing.
According to the OP, who has since walked through the RTMP library set: in short, more work is needed. Other answers, and improvements upon these examples, are what's needed here.