--EDIT--
My initial question wasn't well understood, so allow me to rephrase.
I am working on an image processing application for Android.
Let's say I send an image from Android to some server.
What I want to know is how to process this image with OpenCV (C/C++) on the server and return the results to the mobile device.
Look into setting up a web service if you're just trying to offload the processing to a server and send back some processed data. There are plenty of examples and sample setups out there based on the server environment (OS, speed, bandwidth needs, etc.) that should help you get started. You would then set up the OpenCV environment on the server and perform all of your processing through those libraries. We would need more information on what type of image processing you hope to accomplish to help you further, but again there are lots of examples for OpenCV and great documentation as well. The Android side will depend on how you set up the web service; based on that choice, there are different solutions available for easily interfacing with your server.
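To make the web-service idea concrete, here is a minimal sketch of what the server side could look like. You asked about OpenCV in C/C++; the sketch below uses Python with Flask and the cv2 bindings purely for brevity, and the /process route and the Canny edge-detection step are placeholders for whatever processing you actually need.

```python
# Minimal sketch: a Flask endpoint that accepts an image upload,
# runs an OpenCV operation on it, and returns the processed image.
# Flask, the /process route, and the Canny step are illustrative choices.
import io

import cv2
import numpy as np
from flask import Flask, request, send_file

app = Flask(__name__)

@app.route('/process', methods=['POST'])
def process():
    # The Android client POSTs the image as multipart form data under "image".
    file_bytes = np.frombuffer(request.files['image'].read(), np.uint8)
    img = cv2.imdecode(file_bytes, cv2.IMREAD_COLOR)

    # Placeholder processing step: edge detection.
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)

    # Encode the result back to PNG and return it in the response body.
    ok, buf = cv2.imencode('.png', edges)
    return send_file(io.BytesIO(buf.tobytes()), mimetype='image/png')

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
```

On the Android side you would POST the image with whatever HTTP client you already use and decode the response bytes back into a bitmap.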
Related
I am working on a face recognition project using Flask as my web server, running on an Ubuntu 14.04 machine. I am using OpenCV 2.4.9 as my image processing library, with code written in Python 2.7. I would like to be able to access a client's webcam through their browser to capture an image or frame from the webcam stream and send it back to the server to be processed. Is there an easy way using Python to obtain access to the client's webcam, or is it possible to use JavaScript in conjunction with my current code?
I'll assume that you are more interested in architectural decisions for your application than specific implementation details. You will need both a client side and a server side for this application.
The client side is an HTML page with JavaScript that captures images from the webcam. There are many resources on the internet about this topic. This article explains how it works with some examples. I would recommend using a JavaScript library like this one.
The next thing to decide is how the client application and the server side transfer image data. If you would like to stream webcam video to the server, do some computation, and stream data back to the client application, WebSockets are your friend. This tutorial describes how to set up a Flask application for WebSockets.
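If you go the WebSocket route, the rough server-side shape could look like the sketch below. It assumes the Flask-SocketIO extension and a client that emits each frame as a base64-encoded JPEG under a 'frame' event; the event names and payload format are my assumptions, not something the tutorial fixes.

```python
# Rough sketch of the WebSocket approach, assuming Flask-SocketIO.
# The 'frame'/'result' event names and base64 payload are illustrative.
import base64

import cv2
import numpy as np
from flask import Flask
from flask_socketio import SocketIO, emit

app = Flask(__name__)
socketio = SocketIO(app)

@socketio.on('frame')
def handle_frame(data):
    # The client emits a base64-encoded JPEG frame.
    raw = np.frombuffer(base64.b64decode(data), np.uint8)
    img = cv2.imdecode(raw, cv2.IMREAD_COLOR)
    # ... run your OpenCV processing on img here ...
    emit('result', {'width': int(img.shape[1]), 'height': int(img.shape[0])})

if __name__ == '__main__':
    socketio.run(app)
```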
A much easier approach is to POST image data to the server, do some computation, and respond to the client. The downside of this approach is that it's not suitable for continuous video processing, since you would flood your server with requests, but it works well for processing single video frames.
The last thing to decide is how much processing is done to images on the server side. If you will be doing some extensive computation that takes a long time, I would recommend Celery for background tasks. However, this would change the architecture considerably.
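For what it's worth, the Celery task itself stays small; the broker URL and the task body in this sketch are placeholders, not part of any particular setup.

```python
# Sketch of a Celery task for heavy per-image work; the Redis broker URL
# and the processing body are placeholders.
from celery import Celery

celery_app = Celery('image_tasks', broker='redis://localhost:6379/0')

@celery_app.task
def process_image(path):
    import cv2
    img = cv2.imread(path)
    # ... long-running OpenCV work here ...
    out_path = path + '.out.png'
    cv2.imwrite(out_path, img)
    return out_path
```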
For a proof of concept, I would recommend the following: take a single image with the webcam, POST it to the server, do a quick computation on the image, and respond with what you have computed.
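On the server side, the receiving end of that proof of concept could look like the route below. It assumes the browser POSTs the canvas contents as a base64 data URL in a form field named "image", and that the Haar cascade file is available next to the script; both are my assumptions.

```python
# Sketch of the proof-of-concept endpoint: one webcam frame POSTed as a
# base64 data URL, quick face detection, JSON back. The field name and
# cascade path are assumptions.
import base64

import cv2
import numpy as np
from flask import Flask, request, jsonify

app = Flask(__name__)
face_cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')

@app.route('/detect', methods=['POST'])
def detect():
    # Strip the "data:image/png;base64," prefix the browser adds.
    data_url = request.form['image']
    raw = base64.b64decode(data_url.split(',', 1)[1])
    img = cv2.imdecode(np.frombuffer(raw, np.uint8), cv2.IMREAD_GRAYSCALE)

    faces = face_cascade.detectMultiScale(img, 1.3, 5)
    return jsonify(faces=[[int(x), int(y), int(w), int(h)]
                          for (x, y, w, h) in faces])
```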
Good luck.
I have seen several examples of native-to-browser WebRTC applications, for example streaming video files stored on a server to one or more browsers, but is it possible to do the reverse? I.e. streaming the webcam from the browser to a server written in C, C++, Java or another language?
It is possible.
WebRTC uses open standards to stream content over the network. You can find all the details in the following RFCs:
http://tools.ietf.org/wg/rtcweb/
If you want to write your own native application that will receive (and even send) WebRTC media, you can either get the WebRTC native code from here: http://www.webrtc.org/webrtc-native-code-package and build it into your solution, or alternatively use one of the existing SDKs that provide this functionality (depending on which platform you want your native application to run on).
If you want to connect WebRTC to existing hardware like a SIP desk phone, you will need to have some sort of a gateway that will have one leg that will communicate with WebRTC on the browser and the other leg that will communicate with your SIP phone.
There are a lot of commercial solutions already out there, but eventually it all comes down to what your needs are.
I have been trying to solve this for 2 weeks and I have not been able to reach a solution.
Here is what I am trying to do:
I need a web application in which users can upload a video; the video is going to be transformed using OpenCV's Python API. Since I have the Python API for OpenCV, I decided to create the web app using Django. Everything is fine up to that point.
The problem is that the video transformation is a very long process, so I was trying to implement some real-time capabilities in order to show the user the video as it is transformed; in other words, I transform a frame and show it to the user immediately. I am trying to do this with CoffeeScript and socket.io following some examples; however, I haven't been successful.
My question is: what would be the right approach to add real-time capabilities to a Django application?
I'd recommend using a non-Django service to handle the WebSockets. Setting up WebSockets properly is tricky on both the client and server side. Look at pusher.com for a free/cheap solution that will just work and save you a whole lot of hassle.
The initial request to start rendering should kick off the long-lived process, and return with an ID which is used to listen to the websocket for updates.
Once you have your websockets set up, you can send messages to the client about each finished frame. Personally I wouldn't try to push the whole frame down the websocket, but rather just send a message saying the frame is done with a URL to get the frame. Then normal HTTP with its caching and browser niceties moves the big data.
You're definitely not choosing the easy path. The easy path is to have your long-lived render task update the render state in the database, and have the client poll that. Extra server load, but a lot simpler.
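For reference, the easy path from the last paragraph is only a few lines of Django. The RenderJob model, its field names, and the view are all invented here just to show the shape of it.

```python
# Sketch of the "easy path": the render task updates a row in the database
# and the browser polls a JSON status view. Model and field names are invented.

# yourapp/models.py
from django.db import models

class RenderJob(models.Model):
    frames_done = models.IntegerField(default=0)
    frames_total = models.IntegerField(default=0)
    finished = models.BooleanField(default=False)

# yourapp/views.py
from django.http import JsonResponse
from .models import RenderJob

def render_status(request, job_id):
    # The long-lived render task calls
    #   RenderJob.objects.filter(pk=job_id).update(frames_done=n)
    # after each frame; the client polls this view every second or so.
    job = RenderJob.objects.get(pk=job_id)
    return JsonResponse({
        'frames_done': job.frames_done,
        'frames_total': job.frames_total,
        'finished': job.finished,
    })
```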
Django itself really is focused on doing one kind of web interface, which is following the HTTP Request/Response pattern. To maintain a persistent connection with clients, which socket.io really makes dead simple, you need to diverge a bit from a normal Django installation.
This article discusses the issue of doing real-time with Django, with the help of Orbited and Twisted. It's rather old, and it relies on Comet, which is not the preferred way of doing real-time these days.
You might benefit a lot by going for Socket.io on the client, and something like Tornado (wiki) + a Tornado client for Socket.io. But if you really want to stick with Django for the web development (which Tornado also provides), you would need to make the two work together internally, each handling its particular use case.
Finally, this other article discusses how to make Django work with gevent (a coroutine-based networking library for Python) and Socket.io, which might well be your best option otherwise.
Don't hesitate to post questions/comments as they pop up!
I want to create a service that streams live traffic video to either a client browser or a client processor (which will actually process the video). I want real video, not just images that update periodically. Assume I know the basic concepts of web design (both front and back end). But assume I know nothing about streaming media.
Can someone point me in the direction to get started?
I need information concerning software, frameworks (especially if they're compatible with Ruby on Rails), encoders, converters, protocols, ... - thanks!
What about something like tokbox?
http://www.tokbox.com/
I haven't personally used it. However, I have visited an online video podcast that uses this technology. Good quality streaming.
So we have a server with some address, port, and IP. We are developing that server, so we can implement whatever we need on it. What are standard/best practices for data transfer speed management between a C++ Windows client app and a C++ server?
My main question is how to find out how much data can be uploaded/downloaded from/to the client over his low-speed network to my relatively fast server. (I need it to set the bit rate of his live audio/video stream.)
My attempt at explaining number 3:
We do not care how fast our server is; it is always faster than needed. We care about the client trying to stream his media out to our server. He streams encoded (via ffmpeg) live video data to our server. But he has, say, ADSL with 500 kb/s of outgoing traffic. He also uses ICQ or whatever, so he actually has less than 500 kb/s available. And he wants to stream live video! So we need to set up our ffmpeg to encode video with respect to the bit rate the user can provide. We develop both the server side and the client side. We need a way of finding out how much the user can currently upload per second (the value can change dynamically over time).
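One simple way to get a current figure is an upload probe: the client pushes a fixed-size block of random bytes to the server, times it, and uses the measured rate (minus a safety margin) as the ffmpeg bitrate; repeating the probe periodically gives you the dynamic value. The sketch below shows the idea in Python against an assumed /probe endpoint and block size; the same logic ports directly to a C++ client.

```python
# Sketch of a client-side upload probe: POST a fixed-size block of random
# bytes to the server, time it, and derive a usable video bitrate.
# The /probe URL, the 256 KB block size, and the 0.8 safety margin are
# assumptions, not a standard.
import os
import time

import requests

PROBE_URL = 'http://example.com/probe'   # server just reads and discards the body
PROBE_BYTES = 256 * 1024

def measure_upload_kbps():
    payload = os.urandom(PROBE_BYTES)
    start = time.time()
    requests.post(PROBE_URL, data=payload)
    elapsed = time.time() - start
    return (PROBE_BYTES * 8 / 1000.0) / elapsed   # kilobits per second

def pick_video_bitrate():
    # Leave 20% headroom for audio, other traffic (ICQ etc.) and overhead.
    return int(measure_upload_kbps() * 0.8)

print('suggested video bitrate: %d kbit/s' % pick_video_bitrate())
```

On the server side the probe handler only needs to read the request body and return 200; you could run the same probe in the opposite direction to estimate download speed.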
Check this CodeProject article.
It's .NET, but you can try to figure out the technique from there.
I found what I wanted: "thrulay, network capacity tester", a C++ library for tracking available bandwidth in real time on clients. There is also "Spruce", which is likewise open source. It is built using some Linux code, but I use the Boost library, so it will be easy to rewrite.
Off-topic: I want to report that there is a group of people on SO downvoting all questions on this topic. I do not know why they are so angry, but they definitely exist.