Video Capturing + Uploading + Processing + Streaming back - .NET & C#

amazedsaint · Jul 29, 2010 · Viewed 9.3k times

We are trying to find out which technologies/libraries are available in the .NET stack (even wrappers on top of 3rd-party DLLs) that'll help us build an app that can

  • 1 - Capture an image from a user's video device
  • 2 - Upload it in real time to a server
  • 3 - Process the video (on the server) - e.g. adding a watermark to the video
  • 4 - Stream it back to the user/other users

Preferably, the time delay/latency between step 2 and step 4 should be minimal.

The first requirement (capturing) seems pretty straightforward. The challenge is identifying a suitable way to do the upload, the processing, and the streaming back. Any suggestions or ideas?

I recently came across the FFmpeg library, and it has a C# wrapper. Can FFmpeg be used for the processing side?
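For the watermarking step, something along these lines is roughly what I have in mind - simply shelling out to the ffmpeg binary rather than going through the wrapper (this assumes a reasonably recent ffmpeg build is installed on the server; WatermarkJob and the paths are just placeholders):

    using System;
    using System.Diagnostics;

    // Rough idea for the watermarking step: shell out to an ffmpeg binary on the
    // server and overlay a PNG onto the uploaded clip. Paths are placeholders.
    class WatermarkJob
    {
        public static void Run(string inputPath, string watermarkPath, string outputPath)
        {
            var psi = new ProcessStartInfo
            {
                FileName = "ffmpeg",
                // overlay=10:10 places the watermark 10 px from the top-left corner
                Arguments = string.Format(
                    "-i \"{0}\" -i \"{1}\" -filter_complex \"overlay=10:10\" -codec:a copy \"{2}\"",
                    inputPath, watermarkPath, outputPath),
                UseShellExecute = false,
                RedirectStandardError = true // ffmpeg writes its progress/log to stderr
            };

            using (var process = Process.Start(psi))
            {
                string log = process.StandardError.ReadToEnd();
                process.WaitForExit();
                if (process.ExitCode != 0)
                    throw new InvalidOperationException("ffmpeg failed: " + log);
            }
        }
    }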

Answer

Henrik · Aug 5, 2010

I would go about it this way:

  1. Use Silverlight or Flash to capture the video camera input, e.g. as detailed here.
  2. You can send the byte-stream over a socket that your server is listening to.
  3. On the receiving end, use the socket-accepting program as a router with a number of listening workers connected to it. Between the router and the workers, use a message bus, e.g. AMQP with RabbitMQ. Send asynchronous messages (e.g. with Reactive Extensions) carrying the encoded stream to the Rabbit node, which can then either forward all messages belonging to one conversation/user-session to a single worker, or interleave them across the available workers. Here's the manual. As the video is encoded, it is streamed asynchronously back over the message bus. According to Intel's tests the bus itself should hold up at high throughput, but they had to use the interleaved TCP channel mode (they tested on a gigabit LAN). Other users here have suggested FFlib; you might also look into having the workers convert to WebM, but if FFlib works, that is probably a lot easier. Each worker publishes the next encoded video piece over AMQP, and a server-side program, e.g. the router I mentioned above, starts sending it to the client (see no. 4). A rough sketch of the router/publish step is shown after this list.
  4. Have a client program, e.g. Silverlight/Flash, connect (for example over the same socket you opened for client->server data, or over HTTP) and read the byte stream with a decoder. Render the output.
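A very rough sketch of the router piece from nos. 2-3, using the RabbitMQ .NET client (RabbitMQ.Client). The queue name, port and chunk size are arbitrary choices, and a real router would have to handle multiple clients and tie chunks to a session:

    using System;
    using System.Net;
    using System.Net.Sockets;
    using RabbitMQ.Client;

    // Sketch of the router: accept the client's raw byte stream on a TCP socket
    // and republish fixed-size chunks onto a RabbitMQ queue for the encoding workers.
    class VideoRouter
    {
        public static void Run()
        {
            var factory = new ConnectionFactory { HostName = "localhost" };
            using (var connection = factory.CreateConnection())
            using (var channel = connection.CreateModel())
            {
                // queue: "video-chunks", non-durable, non-exclusive, no auto-delete
                channel.QueueDeclare("video-chunks", false, false, false, null);

                var listener = new TcpListener(IPAddress.Any, 9000);
                listener.Start();

                using (var client = listener.AcceptTcpClient())
                using (var stream = client.GetStream())
                {
                    var buffer = new byte[64 * 1024];
                    int read;
                    while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        var chunk = new byte[read];
                        Array.Copy(buffer, chunk, read);
                        // Publish the raw chunk to the default exchange; workers
                        // consume it, watermark/encode it, and publish the result
                        // on another queue for streaming back (see no. 4).
                        channel.BasicPublish("", "video-chunks", null, chunk);
                    }
                }
                listener.Stop();
            }
        }
    }

The workers consume from that queue, run the encode/watermark step, and publish their output on a second queue, which the router (or another server-side program) reads and pushes back to the client as in no. 4.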