RTSP client for Windows to play a stream from an RTSP camera server - MFC

I'm writing an RTSP client program for Windows using MFC, but I don't know how to start. Can you give me an example, documentation, a source code link, etc.?
Thanks a lot.
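A minimal sketch of one possible starting point: FFmpeg's libavformat can open RTSP URLs directly. The URL below is a placeholder, and the decoding and MFC drawing are left out:

    extern "C" {
    #include <libavformat/avformat.h>
    }

    int main() {
        avformat_network_init();

        // Placeholder URL: point this at your camera's RTSP endpoint.
        AVFormatContext *ctx = nullptr;
        if (avformat_open_input(&ctx, "rtsp://192.168.1.10/stream",
                                nullptr, nullptr) < 0)
            return 1;
        avformat_find_stream_info(ctx, nullptr);

        // Read the incoming packets; decode them with libavcodec and
        // draw into your MFC view (e.g. in CView::OnDraw) from here.
        AVPacket pkt;
        while (av_read_frame(ctx, &pkt) >= 0) {
            av_packet_unref(&pkt);
        }

        avformat_close_input(&ctx);
        return 0;
    }

live555 is another widely used RTSP client library if you prefer to work at the protocol level; its testRTSPClient example is a common reference.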

Related

GStreamer WebRTC pipeline problem for an open-source camera

Hello everyone,
I am trying to implement low-latency video streaming using WebRTC. I write my code in C++ (WebSocket, etc.) and use only the WebRTC signalling server, which is written in Python (ref1).
When I use a webcam, I have no problem streaming video to the client; however, when I try to use a FLIR camera, I run into a lot of problems during implementation.
There are a few questions I would like to clear up. I hope you can give me some recommendations.
Is there a specific data type that I should pipe into webrtc as a source? I would just like to know what kind of data I should send as the source to webrtc.
I tried sending an image to check whether my WebRTC implementation works properly (without the webcam), and it gives me the error "Pipeline is empty". What can cause this problem? This is actually the main reason I would like to understand the expected data type, so I know what exactly I should pipe into webrtc.
ref1: https://github.com/centricular/gstwebrtc-demos/tree/master/signalling
P.S.:
The client and the Jetson Nano are on the same network.
The signalling server is running on the Jetson Nano.
By running gst-inspect-1.0 webrtcbin you will find that both the source and sink capabilities of this plugin are just application/x-rtp.
Therefore, if you want to consume webrtcbin's source pads, you will need to pipe them into some sort of RTP depayloader, such as rtph264depay for video and rtpopusdepay for audio. On the sending side, the stream must likewise be RTP-payloaded (e.g. with rtph264pay) before it reaches webrtcbin, as sketched below.
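A minimal sketch of the sending side built from C++, assuming GStreamer 1.x with the webrtc plugin from gst-plugins-bad; videotestsrc stands in for the camera, and the SDP/ICE signalling on webrtcbin is omitted:

    #include <gst/gst.h>

    int main(int argc, char *argv[]) {
        gst_init(&argc, &argv);

        GError *err = nullptr;
        // Encode to H.264, payload into RTP, then hand application/x-rtp
        // buffers to webrtcbin, which is all it will accept.
        GstElement *pipeline = gst_parse_launch(
            "videotestsrc is-live=true ! videoconvert ! x264enc tune=zerolatency "
            "! rtph264pay ! application/x-rtp,media=video,encoding-name=H264,payload=96 "
            "! webrtcbin name=sendrecv",
            &err);
        if (!pipeline) {
            g_printerr("Failed to build pipeline: %s\n", err->message);
            g_clear_error(&err);
            return 1;
        }

        gst_element_set_state(pipeline, GST_STATE_PLAYING);
        // The offer/answer and ICE exchange on webrtcbin is omitted here;
        // the gstwebrtc-demos referenced above show it in full.
        g_main_loop_run(g_main_loop_new(nullptr, FALSE));
        return 0;
    }

The essential part is rtph264pay with the application/x-rtp caps: webrtcbin will refuse to link raw or merely encoded video, which is a common cause of empty-pipeline errors.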

Can I use a video file as a video source for WebRTC native code?

I have downloaded the latest WebRTC native code and tested the peerconnection example. In this example, video can only come from devices configured on the system (it looks for devices in /dev/videoX).
I am wondering whether it is possible to get the video from a video file on my local machine and pass its frames as VideoFrame objects to WebRTC at peer A, which is the peerconnection_client of the example. This video would then be passed to peer B, a web client in a browser.
Basically: my video source should be a video file.
If you are willing to move away from Google's reference implementation, the streamer example of libdatachannel shows how to stream a video file to a browser.
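A rough sketch of the libdatachannel approach, assuming a pre-encoded H.264 file; the API names follow recent libdatachannel releases but should be checked against the streamer example, and readNextH264Sample is a hypothetical helper:

    #include <rtc/rtc.hpp>
    #include <memory>

    int main() {
        auto pc = std::make_shared<rtc::PeerConnection>();

        // Declare a send-only H.264 video track in the SDP.
        rtc::Description::Video media("video",
                                      rtc::Description::Direction::SendOnly);
        media.addH264Codec(96);
        auto track = pc->addTrack(media);

        pc->onLocalDescription([](rtc::Description sdp) {
            // Forward the description to the browser over your
            // signalling channel (WebSocket, HTTP, ...).
        });

        track->onOpen([track]() {
            // Hypothetical helper: read one encoded H.264 sample from
            // the file, then pace and send it through the track.
            // auto sample = readNextH264Sample();
            // track->send(sample.data(), sample.size());
        });

        // ICE candidate exchange and the main loop are omitted.
        return 0;
    }

The streamer example also shows how to timestamp and packetize the samples, which is the part that is easiest to get wrong.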

Send play/pause commands in C++

So I'm trying to control my computer with an IR remote control. I get the IR code from an Arduino Nano, which sends it to my computer over the serial port.
Depending on the code received, I want to send a play, pause, next, or similar command to Spotify, Windows Media Player, or any other app.
I know there are keyboard shortcuts for this, so my question is: how do I reproduce the commands that these keyboard shortcuts send to all the programs, for example to stop the music?
Thanks in advance for any help! :)
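On Windows, one way to do this is to synthesize the same media-key events a keyboard would send, using SendInput with the VK_MEDIA_* virtual-key codes. A minimal sketch, with the serial-port reading left out:

    #include <windows.h>

    // Press and release one virtual key, e.g. VK_MEDIA_PLAY_PAUSE,
    // VK_MEDIA_NEXT_TRACK, VK_MEDIA_PREV_TRACK, VK_MEDIA_STOP.
    void sendMediaKey(WORD vk) {
        INPUT input[2] = {};
        input[0].type = INPUT_KEYBOARD;
        input[0].ki.wVk = vk;                   // key down
        input[1].type = INPUT_KEYBOARD;
        input[1].ki.wVk = vk;
        input[1].ki.dwFlags = KEYEVENTF_KEYUP;  // key up
        SendInput(2, input, sizeof(INPUT));
    }

    int main() {
        // In the real program, map each decoded IR code to a key here.
        sendMediaKey(VK_MEDIA_PLAY_PAUSE);
        return 0;
    }

Because these are global media keys, the foreground application does not matter; Spotify and Windows Media Player both listen for them.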

WebRTC: stream a webcam from the browser to a C/C++ native application

I am having some trouble with the WebRTC API (most particularly RTCPeerConnection).
I have successfully managed to make a video call between two browsers: I know how to get the webcam stream with getUserMedia, how to contact a STUN server and react to the 'onicecandidate' event, and how to create the offer (and the answer on the other peer) and send the SDP.
I use WebSockets as a signalling channel.
What I need to do is process the video stream with C/C++ algorithms, so I am looking for a way to handle an RTCPeerConnection and receive a call in C/C++.
I have been trying to build and test Google's libjingle library (I haven't succeeded yet, though; I'm on Arch Linux). But even if I succeed, I don't see how to reuse the code for my own case.
What I have done so far:
I understand how STUN servers, ICE candidates, and SDP sessions work, and how to create and process them in JavaScript.
I managed to make peer-to-peer calls between two browsers (and even between a PC and my Android phone, which worked perfectly).
I managed to use libwebsockets to create a simple signalling server, in which I successfully receive the browser's ICE candidates and SDP messages.
What I am looking for:
A way to receive/parse/process in C/C++ what the browser sends, i.e. ICE candidates, SDP, and the offer.
A way to create the answer (the browser will always be the initiator) and receive/process the webcam stream.
What I have tried:
I have tried playing the webcam in an HTML5 <video> element, periodically (~33 ms) drawing the frame to a <canvas>, calling getImageData(), and sending the array of (R,G,B,alpha) values over a plain WebSocket connection. But even for a 100x100 px grayscale frame (hence 10 kB), I can only achieve ~7 fps with a ~600 kb/s upload stream. This is why I want to use RTCPeerConnection, which works over UDP.
My constraints:
I need to run the native app in C or C++ because my image/video-processing algorithms are implemented in C++ (I have seen a lot of Node.js-based servers, but I can't use those: there is no way to call my algorithms).
I'd like to be able to run at roughly 30 fps so that it is relatively fluid for the user.
I can't use Flash or Silverlight: I need to stay with HTML5/JavaScript on the client side.
Conclusion:
Where I fall short is everything that deals with ICE candidates, SDP sessions, and contacting the STUN server in C/C++, because unlike JavaScript there are no events ('onicecandidate', 'onaddstream', etc.).
Thank you in advance for your help!
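For what it's worth, Google's native library does expose these as events: they are virtual methods on webrtc::PeerConnectionObserver. A minimal sketch of the mapping, with signatures written from memory (newer source trees prefer OnTrack over OnAddStream, and the include path varies by revision):

    #include <api/peer_connection_interface.h>

    class Observer : public webrtc::PeerConnectionObserver {
     public:
      // Equivalent of the JS 'onicecandidate' event: serialize the
      // candidate and push it to the browser over the signalling channel.
      void OnIceCandidate(
          const webrtc::IceCandidateInterface* candidate) override {}

      // Equivalent of 'onaddstream': attach a video sink here to get
      // decoded frames for the C++ processing algorithms.
      void OnAddStream(
          rtc::scoped_refptr<webrtc::MediaStreamInterface> stream) override {}

      // Remaining required callbacks, stubbed out for brevity.
      void OnSignalingChange(
          webrtc::PeerConnectionInterface::SignalingState) override {}
      void OnDataChannel(
          rtc::scoped_refptr<webrtc::DataChannelInterface>) override {}
      void OnIceGatheringChange(
          webrtc::PeerConnectionInterface::IceGatheringState) override {}
    };

You subclass the observer, override the callbacks you care about, and pass it in when creating the peer connection with the factory's CreatePeerConnection call.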

Cross-platform method for streaming encrypted video over the internet through a server

I want to stream encrypted video (with audio) captured from a webcam from one computer to another over the internet, with a server in between to forward the video. The program that displays the encrypted video will be written in C++ and must be cross-platform; this program will have other functions, so I can't use an existing program. The program that streams the video from the webcam must run on Linux; I could use an already available program or write my own in C++.
Try VLC plus an encrypted VPN. It's a stable, cross-platform solution.
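If the display program needs to stay C++, the same VLC stack is available as a library. A minimal sketch of the receiving side with libVLC 3, where the stream URL is a placeholder and the encryption is assumed to come from the VPN tunnel:

    #include <vlc/vlc.h>
    #include <cstdio>

    int main() {
        libvlc_instance_t *vlc = libvlc_new(0, nullptr);

        // Placeholder: a stream forwarded by the middle server and
        // reached through the encrypted VPN.
        libvlc_media_t *media =
            libvlc_media_new_location(vlc, "rtsp://server.example/webcam");
        libvlc_media_player_t *player =
            libvlc_media_player_new_from_media(media);
        libvlc_media_release(media);

        libvlc_media_player_play(player);
        std::getchar();  // keep playing until a key is pressed

        libvlc_media_player_stop(player);
        libvlc_media_player_release(player);
        libvlc_release(vlc);
        return 0;
    }

This keeps the player embeddable in a larger cross-platform C++ program, which a standalone VLC process would not.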