Cross-platform method for streaming encrypted video over the internet through a server - C++

I want to stream encrypted video (with audio) captured from a webcam from one computer to another over the internet, with a server in between to forward the video. The program to display the encrypted video will be written in C++ and must be cross-platform; this program will have other functions, so I can't use an existing program. The program that streams the video from the webcam must run on Linux; for it, I could either use an existing program or write my own in C++.

Try VLC + an encrypted VPN. It's a stable, cross-platform solution.

Related

Can I use a video file as video source for WebRTC native code?

I have downloaded the latest WebRTC native code and tested the peerconnection example. In this example, video can only come from devices configured on the system (it looks for devices in /dev/videoX).
I am wondering if it is possible to read the video from a file on my local machine and pass its frames as VideoFrame objects in WebRTC to peer A, which is the peerconnection_client from the example. This video would then be passed on to peer B, a web client in a browser.
Basically: My video source should be a video file.
If you are willing to move away from Google's reference implementation, the streamer example in libdatachannel shows how to stream a video file to a browser.
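For a rough idea of the shape of that approach, here is a minimal sketch modelled on libdatachannel's media examples: let something like ffmpeg read the video file and emit RTP packets on a local UDP port, then forward those packets into a SendOnly H.264 track. Signaling (exchanging the SDP with the browser) is omitted, the port number and payload type are assumptions, and class names may differ slightly between libdatachannel versions.

    #include <rtc/rtc.hpp>
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <cstddef>
    #include <memory>

    int main() {
        // Peer connection towards the browser (signaling not shown).
        auto pc = std::make_shared<rtc::PeerConnection>();

        // SendOnly H.264 track; 96 is an assumed dynamic payload type.
        rtc::Description::Video media("video", rtc::Description::Direction::SendOnly);
        media.addH264Codec(96);
        media.addSSRC(42, "video-send");
        auto track = pc->addTrack(media);

        pc->onLocalDescription([](rtc::Description sdp) {
            // Hand this SDP to the browser through your signaling channel.
        });
        pc->setLocalDescription();

        // Read RTP produced locally, e.g. by:
        //   ffmpeg -re -i video.mp4 -an -c:v libx264 -f rtp rtp://127.0.0.1:6000
        int sock = socket(AF_INET, SOCK_DGRAM, 0);
        sockaddr_in addr{};
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = inet_addr("127.0.0.1");
        addr.sin_port = htons(6000);                       // assumed local RTP port
        bind(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));

        char buffer[2048];
        while (true) {
            ssize_t len = recv(sock, buffer, sizeof(buffer), 0);
            // (libdatachannel's media example also rewrites the RTP SSRC here
            //  to match the one declared in the SDP.)
            if (len > 0 && track->isOpen())
                track->send(reinterpret_cast<const std::byte*>(buffer),
                            static_cast<size_t>(len));
        }
    }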

Recording audio output from application

I'm currently building a project in C++ using Visual Studio on Windows 8. The application captures video from a camera and triggers some virtual animations in real time, with sounds played along with the animations.
The user has the option to record the experience as video and sound. I am already able to record video; now I want to create an audio track of the sounds being played by the application, so that I can later merge the video and audio files.
So, what is the best way to record the audio output of an application on Windows?
Let me stress that I do NOT want to record audio from any input devices (such as a microphone), only from the application itself.
There is no built-in way to record the output of a single application. If you generate the audio yourself, you make a copy for recording purposes, mix it if you have multiple sources, and then use one of the APIs to produce a file, depending on your preferences: writing a WAV file directly, Windows Media audio files (ASF/WMA), DirectShow, Media Foundation, or third-party libraries.
The actual playback data is mixed by the system and sent on for playback. You can, however, enable loopback recording to capture the fully mixed output (not just that of a specific application, though) as if it were a capture from a real-time audio input device.
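For the loopback route, a minimal WASAPI sketch looks roughly like the following (error handling and WAV writing omitted; note that this captures the full system mix on the default render device, not just your own application):

    #include <windows.h>
    #include <mmdeviceapi.h>
    #include <audioclient.h>
    #include <vector>

    int main() {
        CoInitialize(nullptr);

        IMMDeviceEnumerator* enumerator = nullptr;
        CoCreateInstance(__uuidof(MMDeviceEnumerator), nullptr, CLSCTX_ALL,
                         __uuidof(IMMDeviceEnumerator), (void**)&enumerator);

        IMMDevice* device = nullptr;
        enumerator->GetDefaultAudioEndpoint(eRender, eConsole, &device); // render device

        IAudioClient* client = nullptr;
        device->Activate(__uuidof(IAudioClient), CLSCTX_ALL, nullptr, (void**)&client);

        WAVEFORMATEX* format = nullptr;
        client->GetMixFormat(&format);

        // AUDCLNT_STREAMFLAGS_LOOPBACK turns the render endpoint into a capture source.
        client->Initialize(AUDCLNT_SHAREMODE_SHARED, AUDCLNT_STREAMFLAGS_LOOPBACK,
                           10000000 /* 1 s buffer, in 100-ns units */, 0, format, nullptr);

        IAudioCaptureClient* capture = nullptr;
        client->GetService(__uuidof(IAudioCaptureClient), (void**)&capture);
        client->Start();

        std::vector<BYTE> recorded;
        for (int i = 0; i < 1000; ++i) {                   // capture for roughly 10 seconds
            UINT32 frames = 0;
            capture->GetNextPacketSize(&frames);
            while (frames != 0) {
                BYTE* data = nullptr;
                DWORD flags = 0;
                capture->GetBuffer(&data, &frames, &flags, nullptr, nullptr);
                recorded.insert(recorded.end(), data, data + frames * format->nBlockAlign);
                capture->ReleaseBuffer(frames);
                capture->GetNextPacketSize(&frames);
            }
            Sleep(10);
        }

        client->Stop();
        // 'recorded' now holds raw PCM in the mix format; write it to a WAV file
        // or feed it to an encoder of your choice.
        CoTaskMemFree(format);
        capture->Release(); client->Release(); device->Release(); enumerator->Release();
        CoUninitialize();
    }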

Audio Streaming C++ Server / Client

I am currently working on a server-client multimedia player. This player can act as a server to stream an MP3 file (or WMA, WAV, Ogg, FLAC, ...) over the network to another player (the client).
I first worked on basic client-server network communication that sends and receives bytes. But I have a problem: the audio encoding. I need a tool to encode the audio data so that I can send a small part of it over the network and let the client play it before the next part arrives.
I saw a few tools on the internet such as the BASS library and Live555... I used to work with PortAudio for student projects, but I hate it.
So basically, I need a tool to encode audio data (server side), send it over the LAN (which I can already do), and decode the data to play it (client side).
Do you have any ideas about how to do this? Which tool could be useful in this case?
PS: I am trying to use the Qt library for the network interface (it is efficient, and it works on Windows, Linux, and Mac)... Is there any audio streaming tool included in the Qt library?
You can try FFmpeg. It can convert almost anything to anything (so it claims), and it is a widely used open-source library.
We use it in our application mainly for decoding video/audio streams.
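As a rough idea of what the encoding side could look like with FFmpeg's libraries, here is a minimal sketch that turns raw PCM into AAC packets, which you could then push over your Qt socket in small chunks (most error handling omitted; the channel-layout calls assume FFmpeg 5.1 or newer):

    extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libavutil/channel_layout.h>
    }

    int main() {
        const AVCodec* codec = avcodec_find_encoder(AV_CODEC_ID_AAC);
        AVCodecContext* ctx = avcodec_alloc_context3(codec);
        ctx->sample_rate = 44100;
        ctx->sample_fmt  = AV_SAMPLE_FMT_FLTP;             // native AAC encoder wants planar float
        ctx->bit_rate    = 128000;
        av_channel_layout_default(&ctx->ch_layout, 2);     // stereo
        if (avcodec_open2(ctx, codec, nullptr) < 0)
            return 1;

        AVFrame* frame = av_frame_alloc();
        frame->nb_samples = ctx->frame_size;
        frame->format     = ctx->sample_fmt;
        av_channel_layout_copy(&frame->ch_layout, &ctx->ch_layout);
        av_frame_get_buffer(frame, 0);

        AVPacket* pkt = av_packet_alloc();

        // Fill frame->data[0] / frame->data[1] with one frame of PCM samples here,
        // then repeat send/receive for every frame of the source audio.
        avcodec_send_frame(ctx, frame);
        while (avcodec_receive_packet(ctx, pkt) == 0) {
            // pkt->data / pkt->size is a compressed chunk ready to send over the network.
            av_packet_unref(pkt);
        }

        avcodec_send_frame(ctx, nullptr);                  // flush the encoder
        while (avcodec_receive_packet(ctx, pkt) == 0)
            av_packet_unref(pkt);

        av_frame_free(&frame);
        av_packet_free(&pkt);
        avcodec_free_context(&ctx);
    }

On the client side you would do the reverse with avcodec_send_packet / avcodec_receive_frame and hand the decoded samples to Qt's audio output classes (QAudioOutput/QAudioSink).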

How to create a video streaming HTTP server?

I'm using C++ and the POCO libraries. I'm trying to implement a video streaming HTTP server.
Initially I used Poco::StreamCopier.
But the client fails to stream; instead, it downloads the video.
How can I make the server send a streaming response so that the client can play the video in the browser instead of downloading it?
While not within POCO, you could use FFmpeg. It has streaming servers for a number of video protocols and is written in C (for which you could write POCO-like adapters).
http://ffmpeg.org/ffmpeg.html#rtp
http://ffmpeg.org/ffmpeg.html#toc-Protocols
http://git.videolan.org/?p=ffmpeg.git;a=tree
And it has a pretty liberal license:
http://ffmpeg.org/legal.html
You need to research which video encoding and container are right for streaming -- not all video files can be streamed.
Without using anything to decode the video at the other end, but simply over HTTP, you can use the MIME type "Content-Type: multipart/x-mixed-replace; boundary=..." and send a series of JPEG images.
This is actually called M-JPEG over HTTP. See: http://en.wikipedia.org/wiki/Motion_JPEG
The browser will replace each image as it receives it, which makes it look like video. It's probably the easiest way to stream video to a browser, and many IP webcams support it natively.
However, it's not bandwidth-friendly by any means, since it has to send a whole JPEG file for each frame. If you're going to use this over the internet it will work, but it will use more bandwidth than other methods.
That said, it is natively supported in most browsers now, and it sounds like that is what you're after.
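Since the question is about POCO: below is a minimal sketch of an M-JPEG handler using Poco::Net::HTTPServer. The getNextJpegFrame() function is a placeholder for wherever your JPEG-encoded frames actually come from, and the port, boundary string, and frame rate are arbitrary choices:

    #include <Poco/Net/HTTPRequestHandler.h>
    #include <Poco/Net/HTTPRequestHandlerFactory.h>
    #include <Poco/Net/HTTPServer.h>
    #include <Poco/Net/HTTPServerParams.h>
    #include <Poco/Net/HTTPServerRequest.h>
    #include <Poco/Net/HTTPServerResponse.h>
    #include <Poco/Net/ServerSocket.h>
    #include <chrono>
    #include <iostream>
    #include <thread>
    #include <vector>

    // Placeholder: return the next frame of your video, already encoded as JPEG.
    std::vector<char> getNextJpegFrame() { return {}; }

    class MjpegHandler : public Poco::Net::HTTPRequestHandler {
    public:
        void handleRequest(Poco::Net::HTTPServerRequest&,
                           Poco::Net::HTTPServerResponse& response) override {
            response.setContentType("multipart/x-mixed-replace; boundary=frame");
            std::ostream& out = response.send();            // connection stays open
            while (out.good()) {
                std::vector<char> jpeg = getNextJpegFrame();
                out << "--frame\r\n"
                    << "Content-Type: image/jpeg\r\n"
                    << "Content-Length: " << jpeg.size() << "\r\n\r\n";
                if (!jpeg.empty())
                    out.write(jpeg.data(), jpeg.size());
                out << "\r\n";
                out.flush();
                std::this_thread::sleep_for(std::chrono::milliseconds(66)); // ~15 fps
            }
        }
    };

    class MjpegFactory : public Poco::Net::HTTPRequestHandlerFactory {
    public:
        Poco::Net::HTTPRequestHandler*
        createRequestHandler(const Poco::Net::HTTPServerRequest&) override {
            return new MjpegHandler;
        }
    };

    int main() {
        Poco::Net::ServerSocket socket(8080);
        Poco::Net::HTTPServer server(new MjpegFactory, socket,
                                     new Poco::Net::HTTPServerParams);
        server.start();
        std::cout << "Open http://localhost:8080/ in a browser; press Enter to quit.\n";
        std::cin.get();
        server.stop();
    }

The browser requests the URL once, and the handler keeps pushing frames; each new JPEG replaces the previous one, exactly as described above.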

How can I stream video from my application to the web?

I have an application that grabs video from multiple webcams, does some image processing, and displays the result on the screen. I'd like to be able to stream the video output onto the web - preferably to some kind of distribution service rather than connecting to clients directly myself.
So my questions are:
Do such streaming distribution services exist? I'm thinking of something like ShoutCAST relays, but for video. I'm aware of ustream.tv, but I think they just take a direct webcam connection rather than allow you to send any stream.
If so, is there a standard protocol for doing this?
If so, is there a free library implementation of this protocol for Win32?
Ideally I'd just like to throw a frame of video in DIB format at a SendToServer(bitmap) function, and have it compress, send, and distribute it for me ;)
Take a look at VideoLAN Client (VLC for short) as a means of streaming video.
As for distribution sites, I don't know how well it works with ustream.tv and similar new services.
ustream.tv works by using Adobe Flash's support for reading input from a webcam. To fake it out, you need a fake webcam driver. Looking at the ustream.tv site, they point to an application called WebCamMax that allows effects and splicing in video. It works by creating a pseudo-webcam that mixes video from one or more cameras along with other sources. Since that app can do it, your own code could do it too, although you'll probably need to write a Windows driver to get it all working correctly.