Still pretty new to OpenCV/C++, so please bear with me :)
I am currently trying to find a good (and ideally easy) way to stream my camera frames in real time (or close to it) from my OpenCV application, so that I can open my browser, type in the IP and see the image.
So far I have the server done using winsock2 (if anyone has a good cross-platform alternative and can tell me what's different, I'd be quite glad) and can connect to it by entering the IP in my browser.
Socket-/Server-Code:
// socket / server setup using WinSock2 (link against ws2_32.lib)
#include <winsock2.h>
#include <stdio.h>
#include <string.h>

long rc;
SOCKET acceptSocket;
SOCKADDR_IN addr;
WSADATA wsa;

// initialize Winsock
rc = WSAStartup(MAKEWORD(2, 0), &wsa);
if (rc != 0)
{
    printf("Error: startWinsock, Errorcode: %d\n", rc);
    return 1;
}
else
{
    printf("Winsock initialized!\n");
}

// create the listening socket
acceptSocket = socket(AF_INET, SOCK_STREAM, 0);
if (acceptSocket == INVALID_SOCKET)
{
    printf("Error: socket creation failed, Errorcode: %d\n", WSAGetLastError());
    return 1;
}
else
{
    printf("Socket successfully created!\n");
}

// bind to port 8080 on all local interfaces
memset(&addr, 0, sizeof(SOCKADDR_IN));
addr.sin_family = AF_INET;
addr.sin_port = htons(8080);
addr.sin_addr.s_addr = ADDR_ANY;
rc = bind(acceptSocket, (SOCKADDR*)&addr, sizeof(SOCKADDR_IN));
if (rc == SOCKET_ERROR)
{
    printf("Error: bind, Errorcode: %d\n", WSAGetLastError());
    return 1;
}
else
{
    printf("Socket bound to port 8080\n");
}

// put the socket into listening mode
rc = listen(acceptSocket, 10);
if (rc == SOCKET_ERROR)
{
    printf("Error: listen, Errorcode: %d\n", WSAGetLastError());
    return 1;
}
else
{
    printf("acceptSocket is now in listen mode...\n");
}

// block until the browser connects
SOCKET connectedSocket = accept(acceptSocket, NULL, NULL);
if (connectedSocket == INVALID_SOCKET)
{
    printf("Error: accept, Errorcode: %d\n", WSAGetLastError());
    return 1;
}
else
{
    printf("New connection accepted!\n");
}
As for the sending part, so far I have tried taking the camera frame, saving it as a JPG and reloading that JPG:
char filename[128];
frame_count++;
if (frame_count % 50 == 0)
{
    // save the current (undistorted) frame, reload it and send the raw pixel data
    sprintf(filename, "frame_%06d.jpg", index);
    imwrite(filename, camera1_undist);
    Mat image = imread(filename, CV_LOAD_IMAGE_COLOR);
    send(connectedSocket, (const char *) image.data, image.total() * image.elemSize(), 0);
    frame_count = 0;
    index++;
}
Questions:
1) The image isn't shown as an image but as raw characters (hex/ASCII? mostly black question marks and the like). How do I have to convert/change what I send so the browser actually shows the image?
2) I read about MJPEG and found a way to save the output, but I have no idea how to use that output file any further at the moment. How do I use it without MJPEG-Streamer (as that is Linux-only)?
Thanks
I got it done myself now; I was making a few mistakes here and there.
The most important parts to get the image to show up were
a) to send an HTTP header before the actual image, so the browser knows what is going to be sent
b) to use imencode to compress the frame into a buffer and send that buffer instead of the raw image data (sketched below)
I also don't save the frame to disk anymore but encode it straight from my camera input, which shortens the code as well.
To get it done as an MJPEG stream I just had to send another header first, one that tells the "client" that multiple images are going to be sent, separated by a certain boundary:
sprintf(head, "HTTP/1.1 200 OK\r\nContent-Type: multipart/x-mixed-replace;boundary=informs\r\n\r\n");
send(socket,head,strlen(head), 0);
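For the per-frame part, this is roughly what it looks like; treat it as a sketch rather than my exact code (the function name sendFrame, the JPEG quality value and the Content-Length line are just how I'd write it down here, the "informs" boundary matches the header above):

// Sketch: encode one frame with imencode and send it as one part of the
// multipart/x-mixed-replace stream. Assumes the "boundary=informs" header
// from above has already been sent once on connectedSocket.
#include <winsock2.h>
#include <opencv2/opencv.hpp>
#include <vector>
#include <cstdio>
#include <cstring>

bool sendFrame(SOCKET connectedSocket, const cv::Mat& frame)
{
    // compress the frame to JPEG in memory instead of writing it to disk
    std::vector<uchar> buf;
    std::vector<int> params;
    params.push_back(CV_IMWRITE_JPEG_QUALITY);
    params.push_back(80); // quality value is a guess, tune as needed
    if (!cv::imencode(".jpg", frame, buf, params))
        return false;

    // per-part header: boundary, content type and length of this JPEG
    char partHead[256];
    sprintf(partHead,
            "--informs\r\n"
            "Content-Type: image/jpeg\r\n"
            "Content-Length: %d\r\n\r\n",
            (int)buf.size());

    if (send(connectedSocket, partHead, (int)strlen(partHead), 0) == SOCKET_ERROR)
        return false;
    if (send(connectedSocket, (const char*)buf.data(), (int)buf.size(), 0) == SOCKET_ERROR)
        return false;
    // trailing CRLF before the next boundary
    return send(connectedSocket, "\r\n", 2, 0) != SOCKET_ERROR;
}

Calling this for every frame (or every Nth frame, like in my original loop) keeps the image in the browser updating; the initial multipart header is only sent once per connection.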
This helped a little, especially with the header part: http://nakkaya.com/2011/03/23/streaming-opencv-video-over-the-network-using-mjpeg/ (although it irritated me at first because I had never used or even seen Clojure before).
Also this was really helpful: http://answers.opencv.org/question/6976/display-iplimage-in-webbrowsers/