How to create an RTSP server using live555 on a Mac - C++

I am trying to find a method in the live555 library that takes in a
video file and turns it into an RTSP stream. Could someone please
tell me about this method?
#include "BasicUsageEnvironment.hh"

int main()
{
    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
}
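There is no single method that does this; with live555 you compose an RTSPServer, a ServerMediaSession and a file-based subsession, then run the event loop. Here is a minimal sketch modeled on live555's testOnDemandRTSPServer example (the file name "test.264", the stream name "stream" and port 8554 are assumptions):

#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main()
{
    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
    UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

    RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554);
    ServerMediaSession* sms = ServerMediaSession::createNew(*env, "stream");
    // Serve an H.264 elementary-stream file on demand
    sms->addSubsession(
        H264VideoFileServerMediaSubsession::createNew(*env, "test.264", false));
    rtspServer->addServerMediaSession(sms);

    char* url = rtspServer->rtspURL(sms);
    *env << "Play this stream using the URL: " << url << "\n";
    delete[] url;

    env->taskScheduler().doEventLoop(); // does not return
}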

Related

OpenCV cannot connect to video stream - lack of some codec?

I use the IPCamera application on my Android phone to output (share) video from its camera to the LAN. I can access it in a browser on my PC - that works fine.
However, I want OpenCV to capture this video stream from the IP address by typing
VideoCapture cap("http://admin:admin#192.168.0.11:8081/?action=stream?dummy=param.mjpg");
while( cap.isOpened() )
{
Mat frame;
if ( ! cap.read(frame) )
break;
cout << "Connected!!";
imshow("lalala",frame);
int k = waitKey(10);
if ( k==27 )
break;
}
and I got an error.
The actual codec used by the phone is MJPEG (I read that in the application on my phone). I don't know whether OpenCV supports it: is the problem that the mobile application uses some unusual codec, that my PC lacks it, or that my C++/OpenCV code is wrong?
On a PC, OpenCV can capture the video stream from your mobile phone.
Make sure you are using the right connection string; this is what it looks like for an RTSP stream in my case:
VideoCapture capture("rtsp://USER:PASS@xxx.xxx.xxx.xxx/axis-media/media.amp?camera=2");
Probably you don't have FFMPEG installed correctly. You need to reinstall OpenCV: first install FFMPEG, and OpenCV after that.
In OpenCV 3.0.0 and 3.1, try adding
#include <opencv2/videoio.hpp>
#include <opencv2/imgcodecs.hpp>
Some tips on how to install FFMPEG, plus a sample in C++ on Debian Linux: see the linked "Code and tips and tricks" post.
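If you are not sure whether your OpenCV build has FFMPEG support at all, a quick check (a minimal sketch; cv::getBuildInformation() exists from OpenCV 2.4 on) is to print the build configuration:

#include <opencv2/core/core.hpp>
#include <iostream>

int main()
{
    // Look for a line such as "FFMPEG: YES" in the printed configuration
    std::cout << cv::getBuildInformation() << std::endl;
    return 0;
}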

Camera connected, but nothing happened - OpenCV, IP Webcam Android

Here's the deal: I'm trying to use my S3 as a webcam via the IP Webcam app for Android, creating an IP webcam within the software. Usually the address is http://192.168.1.XX:8080/greet.html (maybe the last two digits change). The webpage gives me options and info like this:
"Here is the list of IP Webcam service URLs:
http://192.168.1.XX:8080/video is the MJPEG URL."
The code I'm using is simply this:

#include "opencv2/highgui/highgui.hpp"
#include "opencv2/imgproc/imgproc.hpp"

using namespace cv;

int main()
{
    // connect to an IP cam (might need an additional dummy param like '?type=mjpeg' at the end)
    VideoCapture cap("http://192.168.1.XX:8080/video.mjpg");
    while (cap.isOpened())
    {
        Mat frame;
        if (!cap.read(frame))
            break;
        imshow("lalala", frame);
        int k = waitKey(10);
        if (k == 27)
            break;
    }
    return 0;
}
So the IP Webcam app recognizes a connection, but there's no image whatsoever... and then it says:
warning: Error opening file <../../modules/highgui/src/cap_ffmpeg_imp
Cannot open the web cam
Process returned -1 <0xFFFFFFF>   execution time: 37.259 s
Press any key to continue.
I am using:
Windows 7 Professional
Open CV 2.4.4
Codeblocks 13.12
USB 2.0 webcam 640x480 at 30fps, 50 Hz and all standard.
Try connecting with another Android video-streaming application.
I use Smart WebCam.
Open it with
cap.open("http://192.168.1.13:8080/?x.mjpg");

Live555 to stream live video and audio in one RTSP stream

I have been able to stream video on its own using live555, and likewise audio on its own.
But I want the video and the audio playing in the same VLC session. My video is H264 encoded and my audio is AAC encoded. What do I need to do to pass these packets into a FramedSource?
Which MediaSubsession/DeviceSource do I override, given that this is not a fixed file but live video/live audio?
Thanks in advance!
In order to stream video/H264 and audio/MPEG4-GENERIC in the same RTSP unicast session, you should do something like:
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main()
{
    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
    BasicUsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);
    RTSPServer* rtspServer = RTSPServer::createNew(*env);
    ServerMediaSession* sms = ServerMediaSession::createNew(*env);
    // One subsession per elementary stream; both end up in the same SDP
    sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(*env, "test.264", false));
    sms->addSubsession(ADTSAudioFileServerMediaSubsession::createNew(*env, "test.aac", false));
    rtspServer->addServerMediaSession(sms);
    env->taskScheduler().doEventLoop(); // enter the event loop, or the server exits immediately
}
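For the live (non-file) case asked about above, the usual approach is to subclass OnDemandServerMediaSubsession and override createNewStreamSource() and createNewRTPSink(). A rough sketch for the H.264 side; LiveH264Source is a hypothetical FramedSource you would implement to deliver NAL units from your encoder:

class LiveH264Subsession : public OnDemandServerMediaSubsession
{
public:
    static LiveH264Subsession* createNew(UsageEnvironment& env)
    {
        return new LiveH264Subsession(env);
    }

protected:
    LiveH264Subsession(UsageEnvironment& env)
        : OnDemandServerMediaSubsession(env, True /*reuseFirstSource*/) {}

    // Called to create the stream source; wrap the live source in a
    // discrete framer so the sink receives one NAL unit at a time
    virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                                unsigned& estBitrate)
    {
        estBitrate = 500; // kbps, a rough estimate
        FramedSource* liveSource = LiveH264Source::createNew(envir()); // hypothetical
        return H264VideoStreamDiscreteFramer::createNew(envir(), liveSource);
    }

    virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                      unsigned char rtpPayloadTypeIfDynamic,
                                      FramedSource* /*inputSource*/)
    {
        return H264VideoRTPSink::createNew(envir(), rtpGroupsock,
                                           rtpPayloadTypeIfDynamic);
    }
};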

Set rtsp_flags to listen in FFmpeg C code

I am trying to create a client-server application to stream and then receive video over RTSP using the FFmpeg libraries. I am done with the client part, which streams the video, and I can receive the video in ffplay using the following command:
ffplay -rtsp_flags listen rtsp://127.0.0.1:8556/live.sdp
My problem is that I want to receive the video in C code, and I need to set the rtsp_flags option there. Can anyone please help?
P.S. I cannot use ffserver because I am working on Windows, and ffserver is not available for Windows as far as I know.
You need to add the option when opening the stream:
AVDictionary *d = NULL;                     // "create" an empty dictionary
av_dict_set(&d, "rtsp_flags", "listen", 0); // add an entry

// open rtsp (ifcx is an AVFormatContext*, sFileInput is the rtsp:// URL)
if (avformat_open_input(&ifcx, sFileInput, NULL, &d) != 0) {
    printf("ERROR: Cannot open input file\n");
    return EXIT_FAILURE;
}
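Once the listening input is open, the usual demux loop applies. A minimal sketch of the continuation (error handling trimmed; ifcx is the context opened above):

// Read the stream description, then pull packets as the client pushes them
if (avformat_find_stream_info(ifcx, NULL) < 0) {
    printf("ERROR: Cannot find stream info\n");
    return EXIT_FAILURE;
}

AVPacket pkt;
while (av_read_frame(ifcx, &pkt) >= 0) {
    // pkt.stream_index identifies which stream the packet belongs to;
    // decode it or forward it as needed
    av_free_packet(&pkt);
}
avformat_close_input(&ifcx);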

How to use FFMPEG to play an H.264 stream from NAL units that are stored as video AVPackets

I am writing a client-server system that uses the FFMPEG library to parse an H.264 stream into NAL units on the server side, then uses channel coding to send them over the network to the client side, where my application must be able to play the video.
The question is how to play the received AVPackets (NAL units) in my application as a video stream.
I found this tutorial helpful and used it as the base for both the server and the client side.
Some sample code or resources related to playing video not from a file, but from data inside the program, using the FFMPEG library would be very helpful.
I am sure that the received information is sufficient to play the video, because I tried saving the received data as a .h264 or .mp4 file and it can be played by VLC.
From what I understand of your question, you have the AVPackets and want to play a video. In reality this is two problems: 1. decoding your packets, and 2. playing the video.
For decoding your packets with FFmpeg, you should take a look at the documentation for AVPacket, AVCodecContext and avcodec_decode_video2 to get some ideas; the general idea is that you want to do something (just wrote this in the browser, take it with a grain of salt) along the lines of:
extern "C" {
#include <libavcodec/avcodec.h>
}
#include <vector>

//the decoder context; set this up appropriately based on your video
//(see the above links for the documentation)
AVCodecContext *decoder_context;
std::vector<AVPacket> packets; //assume this has your packets
...
AVFrame *decoded_frame = av_frame_alloc();
int ret = -1;
int got_frame = 0;
for (AVPacket& packet : packets)
{
    //avcodec_decode_video2 unrefs the frame itself before reusing it
    ret = avcodec_decode_video2(decoder_context, decoded_frame, &got_frame, &packet);
    if (ret < 0) {
        //had an error decoding the current packet
        break;
    }
    if (got_frame)
    {
        //send to whatever video player queue you're using/do whatever with the frame
        ...
    }
    av_free_packet(&packet);
}
av_frame_free(&decoded_frame);
It's a pretty rough sketch, but that's the general idea for your problem of decoding the AVPackets. As for playing the video, you have many options, and the right one likely depends on your clients. What you're asking is a pretty large problem; I'd advise familiarizing yourself with the FFmpeg documentation and the examples provided on the FFmpeg site. Hope that makes sense.
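For the playback half, one concrete option is to convert each decoded AVFrame to BGR with libswscale and display it with OpenCV's imshow. A minimal sketch, assuming the decoder_context and decoded_frame from the loop above (creating the SwsContext once per frame is wasteful; cache it in real code):

extern "C" {
#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>
}
#include <opencv2/highgui/highgui.hpp>

void show_frame(AVCodecContext* decoder_context, AVFrame* decoded_frame)
{
    // Convert from the decoder's pixel format (usually YUV420P) to BGR24
    SwsContext* sws = sws_getContext(
        decoder_context->width, decoder_context->height, decoder_context->pix_fmt,
        decoder_context->width, decoder_context->height, AV_PIX_FMT_BGR24,
        SWS_BILINEAR, NULL, NULL, NULL);

    cv::Mat image(decoder_context->height, decoder_context->width, CV_8UC3);
    uint8_t* dst_data[1] = { image.data };
    int dst_linesize[1] = { static_cast<int>(image.step) };
    sws_scale(sws, decoded_frame->data, decoded_frame->linesize,
              0, decoder_context->height, dst_data, dst_linesize);
    sws_freeContext(sws);

    cv::imshow("decoded", image);
    cv::waitKey(1); // give highgui a chance to draw the window
}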