This should be a classic, simple error, but I can't find the answer anywhere..
I am using libvlc from a VS2010 C++ project. I followed these steps to record from a webcam and stream it over RTSP:
1)
inst = libvlc_new(1, myargs);
where myargs just contains the plugin path
2)
libvlc_vlm_add_broadcast(inst, "mybroadcast", "dshow://", "#transcode{vcodec=h264,vb=0,scale=0,acodec=mp4a,ab=128,channels=2,samplerate=44100}:rtp{sdp=rtsp://:5544/}", 0, NULL, TRUE, 0);
3)
libvlc_vlm_play_media(inst, "mybroadcast");
4) Sleep for a while. Since libvlc does its work on separate threads, I can be sure the sleep will not interfere with the stream.
The error log says:
live555 debug: connection timeout
live555 error: Failed to connect with
rtsp://192.168.1.100:5544
Where am I going wrong?
Please don't point me to the Doxygen documentation; I have already read it thousands of times and it really doesn't contain the answer. There was a link about streaming options on the VLC developer wiki, but it is now broken.
I am asking for help, please.
I found the solution: the URL was malformed. A "/" was needed at the end of the URL.
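Putting the fix together, here is a minimal sketch of the whole sequence. This is illustrative only: the plugin path and the sleep duration are placeholders, and the trailing "/" in the sdp URL is the crucial part.
#include <windows.h>
#include <vlc/vlc.h>

const char* myargs[] = { "--plugin-path=./plugins" }; // placeholder path
libvlc_instance_t* inst = libvlc_new(1, myargs);

// Note the trailing "/" after the port -- its absence was the cause
// of the "Failed to connect" error above.
libvlc_vlm_add_broadcast(inst, "mybroadcast", "dshow://",
    "#transcode{vcodec=h264,vb=0,scale=0,acodec=mp4a,ab=128,channels=2,samplerate=44100}"
    ":rtp{sdp=rtsp://:5544/}",
    0, NULL, TRUE, 0);

libvlc_vlm_play_media(inst, "mybroadcast");
Sleep(60000); // let the broadcast run; libvlc streams on its own threads

libvlc_vlm_stop_media(inst, "mybroadcast");
libvlc_vlm_release(inst);
libvlc_release(inst);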
I'm working on a C++ project that depends on a Wi-Fi RAK5206 electronic board. I'm using the ffmpeg library to obtain the video and audio stream, and I have an issue where I can start and stop the stream four times, but when I try to start it a fifth time I get an error. The error description is "Invalid data found when processing input" and it happens when I call the avformat_open_input function; afterwards I need to restart the electronic board, reconnect to the Wi-Fi, etc.
I figured out with the Wireshark application that VLC works, and that it sends some BYE packets when TEARDOWN is called. I wonder if the error is related to them, because my application isn't sending any. How can I set things up to force ffmpeg to send BYE packets?
I found some declarations in the rtpenc.h file for options to set, and tried them when connecting, but obviously without success.
The code that I used for setting options and opening input:
AVDictionary* stream_opts = NULL;
av_dict_set(&stream_opts, "rtpflags", "send_bye", 0);
avformat_open_input(&format_ctx, url.c_str(), NULL, &stream_opts);
Make sure you are calling the av_write_trailer function from your application.
If not, please debug and check it.
/* Write the trailer, if any. The trailer must be written before you
* close the CodecContexts open when you wrote the header; otherwise
* av_write_trailer() may try to use memory that was freed on
* av_codec_close(). */
av_write_trailer(oc);
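For completeness, the teardown order matters here. A rough sketch of a safe shutdown, using the modern API names (avcodec_free_context replaces the old avcodec_close mentioned in the comment), assuming oc is your output AVFormatContext and enc_ctx your encoder context:
av_write_trailer(oc);              // finalize the stream; for RTP this is where RTCP BYE is sent (if enabled)
avcodec_free_context(&enc_ctx);    // only after the trailer has been written
if (!(oc->oformat->flags & AVFMT_NOFILE))
    avio_closep(&oc->pb);          // close the underlying I/O context
avformat_free_context(oc);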
Call-flow snippet from the ffmpeg source:
av_write_trailer ->
....
ret = s->oformat->write_trailer(s);
} else {
s->oformat->write_trailer(s);
}
...
.write_trailer = rtp_write_trailer ->
...
if (s1->pb && (s->flags & FF_RTP_FLAG_SEND_BYE))
rtcp_send_sr(s1, ff_ntp_time(), 1);
I resolved the issue by adding flag 16 (binary: 10000) to the AVFormatContext object's flags field.
formatCtx->flags = formatCtx->flags | 16;
According to rtpenc.h:
#define FF_RTP_FLAG_SEND_BYE 16
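Applied to the snippet from the question, the fix amounts to setting that bit on a pre-allocated context before opening the input. The constant is duplicated here because rtpenc.h is an internal ffmpeg header and is not installed:
#define FF_RTP_FLAG_SEND_BYE 16 // copied from ffmpeg's internal rtpenc.h

AVFormatContext* format_ctx = avformat_alloc_context();
format_ctx->flags |= FF_RTP_FLAG_SEND_BYE; // ask the RTP code to send RTCP BYE on teardown

AVDictionary* stream_opts = NULL;
if (avformat_open_input(&format_ctx, url.c_str(), NULL, &stream_opts) != 0) {
    // "Invalid data found when processing input" would land here
}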
Live555MediaServer can be used to stream video files as RTSP streams. I have two VLC clients that connect to the server, A and B, and I want to see the exact same video stream in both. Here is the problem: I connect A, and after 10 seconds I connect B. When B connects, the video it shows starts over from the beginning, while A keeps streaming as before.
I would like the 2 concurrent streams to be synchronized.
The live555 doc says that setting reuseFirstSource to True should work, so I tried setting reuseSource to True at DynamicRTSPServer.cpp:121, but it didn't work: when I connect with client B, the video restarts from the beginning.
Boolean const reuseSource = True;
I expect to see the 2 concurrent streams synchronized even if one starts with a delay with respect to the other one.
I finally found a workaround, and the reason for this 'bug'.
Quick answer: set the if condition at line 67 to false, i.e.
if (smsExists && isFirstLookupInSession) {
becomes
if (false) {
Explanation: every time a new session starts, the isFirstLookupInSession variable is set to true and the session is removed and recreated.
I wrote to the live555 support list, and Finlayson told me, and I quote:
“LIVE555 Media Server” code was always intended to work this way, and was intended to be a ‘stand-alone appliance’ that does not have its code modified (e.g., by changing the value of “reuseFirstSource”).
Thus the only solution for creating an RTSP server with Live555 is to write your own server, starting from the testProgs examples (see the sketch below).
The workaround proposed here can produce unwanted behavior, but for a simple RTSP server with multiple streams it's fine.
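If you go the custom-server route, a minimal sketch modeled on testOnDemandRTSPServer, with reuseFirstSource enabled, looks like this (the port, stream name, and file name are placeholders):
#include <liveMedia.hh>
#include <BasicUsageEnvironment.hh>

int main() {
    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
    UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

    RTSPServer* server = RTSPServer::createNew(*env, 8554);
    if (server == NULL) return 1;

    // True => all clients share one input source, so a client that
    // connects later joins the ongoing stream instead of restarting it.
    Boolean reuseFirstSource = True;

    ServerMediaSession* sms = ServerMediaSession::createNew(*env, "stream", "stream", "My stream");
    sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(*env, "test.264", reuseFirstSource));
    server->addServerMediaSession(sms);

    env->taskScheduler().doEventLoop(); // never returns
    return 0;
}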
Update: I am using a sricam SP019 IP (wireless) camera.
I have been able to find the RTSP URL for my camera, "rtsp://IP_ADDRESS:554/onvif1", and have also managed to play it in VLC and in the onvifer Android app.
The app also provided the following info -
- Encoding: H264
- Transport Protocol: RTP/RTSP/TCP
- RTP packets received: some non-zero number
- RTP packets lost: 0
- RTSP port: 554
However, I still keep getting the error shown below.
===========================================
I am currently working on a project that requires me to interface with an IP camera (company name: sricam) using OpenCV 3.3.1.
Already tried:
I have posted in the OpenCV forum (here) but have not received any reply yet. I also tried all the options in this thread, but keep getting the same error related to the GStreamer library.
My question:
It would be extremely helpful if someone could just point me in the right direction, as a minimum.
Thanks!
When it comes to the camera URL, there should be a default value in the documentation (but it might have been changed during camera configuration). I guess it would be best to start looking there.
Did you try looking at this page?
https://www.ispyconnect.com/man.aspx?n=Sricam
Try it like this.
It worked for me (OS X, sricam SP005):
import os
import cv2

os.environ["OPENCV_FFMPEG_CAPTURE_OPTIONS"] = "rtsp_transport;udp"
vcap = cv2.VideoCapture("rtsp://[IP_CAM_ADD]", cv2.CAP_FFMPEG)
Hope this is helpful to somebody.
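Since the original question is about a C++ project with OpenCV, here is a rough C++ sketch equivalent to that snippet. The camera address is a placeholder, and note that the OPENCV_FFMPEG_CAPTURE_OPTIONS environment variable is only honored by newer OpenCV builds:
#include <cstdlib>
#include <opencv2/opencv.hpp>

int main() {
    // Same idea as the Python version: force FFmpeg's RTSP demuxer onto UDP.
    setenv("OPENCV_FFMPEG_CAPTURE_OPTIONS", "rtsp_transport;udp", 1); // POSIX; use _putenv_s on Windows

    cv::VideoCapture vcap("rtsp://IP_CAM_ADD:554/onvif1", cv::CAP_FFMPEG); // placeholder address
    if (!vcap.isOpened()) return 1;

    cv::Mat frame;
    while (vcap.read(frame)) {
        cv::imshow("camera", frame);
        if (cv::waitKey(1) == 27) break; // Esc quits
    }
    return 0;
}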
I have searched all around and cannot find any examples or tutorials on how to access a webcam using ffmpeg in C++. Any sample code or any help pointing me to some documentation would be greatly appreciated.
Thanks in advance.
I have been working on this for months now. Your first "issue" is that ffmpeg (libavcodec and other ffmpeg libs) does NOT access web cams, or any other device.
For a basic USB webcam or audio/video capture card, you first need driver software to access that device. For Linux, these drivers fall under the Video4Linux (V4L2, as it is known) category, which are modules that are part of most distros. If you are working with MS Windows, then you need to get an SDK that allows you to access the device. MS may have something for accessing generic devices (but from my experience, they are not very capable, if they work at all). If you've made it this far, then you now have raw frames (video and/or audio).
THEN you get to the ffmpeg part: libavcodec, which takes the raw frames (audio and/or video) and encodes them into streams, which ffmpeg can then mux into your final container.
I have searched, but have found very few examples of all of these, and most are piece-meal.
If you don't need to actually code this yourself, the command-line ffmpeg, as well as vlc, can access these devices, capture, save to files, and even stream.
That's the best I can do for now.
ken
For windows use dshow
For Linux (like Ubuntu) use Video4Linux (V4L2).
FFmpeg can take input from V4L2 and process it.
To find the USB video path, type: ls /dev/video*
E.g.: /dev/video(n) where n = 0 / 1 / 2 …
AVInputFormat – struct which holds the information about the input device format / media device format.
av_find_input_format("v4l2") [Linux]
avformat_open_input(&AVFormatContext, "/dev/video(n)", AVInputFormat, NULL)
If the return value is != 0, there is an error.
Now you have accessed the camera using FFmpeg and can continue the operation.
sample code is below.
int CaptureCam()
{
    avdevice_register_all(); // register device (v4l2, dshow, ...) input formats
    avcodec_register_all();
    av_register_all();

    const char *dev_name = "/dev/video0"; // here mine is video0; it may vary

    AVInputFormat *inputFormat = av_find_input_format("v4l2");

    AVDictionary *options = NULL;
    av_dict_set(&options, "framerate", "20", 0);

    AVFormatContext *pAVFormatContext = NULL;

    // open the video source, passing the options we just set
    if (avformat_open_input(&pAVFormatContext, dev_name, inputFormat, &options) != 0)
    {
        cout << "\nOops, couldn't open video source\n\n";
        return -1;
    }

    cout << "\n Success !";
    return 0;
} // end function
Note: the header file <libavdevice/avdevice.h> must be included.
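To "continue the operation" mentioned above, the usual next step is a packet-reading loop. A rough sketch, assuming pAVFormatContext was opened as in CaptureCam (on very old FFmpeg versions, av_free_packet replaces av_packet_unref):
AVPacket packet;

// Pull raw packets from the camera; each one belongs to a stream
// (packet.stream_index) and can be handed to a decoder.
while (av_read_frame(pAVFormatContext, &packet) >= 0)
{
    // ... decode or forward the packet here ...
    av_packet_unref(&packet);
}

avformat_close_input(&pAVFormatContext);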
This doesn't really answer the question, as I don't have a pure ffmpeg solution for you. However, I personally use Qt for webcam access; it is C++ and has a much better API for accomplishing this. It does add a very large dependency to your code, however.
It definitely depends on the webcam: for example, at work we use IP cameras that deliver a stream of JPEG data over the network. USB will be different.
You can look at the DirectShow samples, e.g. PlayCap (they include AmCap and DVCap samples too). Once you have a DirectShow input device (chances are whatever device you have provides this natively), you can hook it up to ffmpeg via the dshow input device, as sketched below.
And having spent 5 minutes browsing the ffmpeg site to get those links, I see this...
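For the libavformat side of that hookup, opening a DirectShow device looks roughly like this. The device name is an assumption; list the real ones with ffmpeg -list_devices true -f dshow -i dummy:
extern "C" {
#include <libavdevice/avdevice.h>
#include <libavformat/avformat.h>
}

avdevice_register_all(); // registers the dshow input device

AVInputFormat* inputFormat = av_find_input_format("dshow");
AVFormatContext* ctx = NULL;

// "video=..." must match the name DirectShow reports for your camera.
if (avformat_open_input(&ctx, "video=Integrated Camera", inputFormat, NULL) != 0) {
    // device not found or busy
}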
I'm using OSMF's Strobe Media Playback player to try to play files from AWS CloudFront/S3.
The bucket is called ct.recorder. The CloudFront distribution is called 1dm7svtk8jb00c.cloudfront.net, and its origin is ct.recorder.
The video within the bucket is called vid_test001
I've tried initializing the player with rtmp://s34osaecrafusl.cloudfront.net/cfx/st/vid_test001
But that doesn't work.
I get Connection attempt rejected by FMS server. Connection failed.
I've also tried it with .flv at the end, but that doesn't work either.
Am I not linking to the file properly, or is it my player?
Well, I had an entire answer written up speculating that it was related to bucket permissions, and now I'm scratching that answer and posting this instead. :)
$ rtmpdump -r rtmp://s34osaecrafusl.cloudfront.net/cfx/st/vid_test001.flv -o testfile.flv
RTMPDump v2.4
(c) 2010 Andrej Stepanchuk, Howard Chu, The Flvstreamer Team; license: GPL
Connecting ...
WARNING: HandShake: client signature does not match!
INFO: Connected...
Starting download at: 0.000 kB
INFO: Metadata:
INFO: duration 13.82
INFO: videocodecid 2.00
INFO: audiocodecid 6.00
INFO: canSeekToEnd FALSE
INFO: createdby AMS 5
INFO: creationdate Tue Dec 03 13:41:46 2013
1190.238 kB / 13.82 sec (100.0%)
Download complete
This actually works for me... both with and without the .flv on the end, and the resulting file is a 7-second video of a guy looking at a webcam.
Using "smplayer" for Windows, I can connect to CloudFront with the rtmp:// URL and stream the video, but it only works without the .flv on the end, using:
MPlayer Redxii-SVN-r36243-4.6.3 (C) 2000-2013 MPlayer Team
Custom build by Redxii, http://smplayer.sourceforge.net
Compiled against FFmpeg version N-52798-gf5846dc
Build date: Sun May 5 23:51:25 EDT 2013
This doesn't quite answer your question of why it isn't working, except to say that your player seems to be lying to you with "Connection attempt rejected by FMS server" because, at least from here, it's all good, except for this part, and I don't know what it means:
WARNING: HandShake: client signature does not match!
However, that could just be a distraction.
It looks as if it's going to be your player... so trying other players would be worthwhile.
It is, of course, possible that there's a regional issue involving the particular edge location inside CloudFront that you access from your location, which could be significantly different from the one I'm hitting, since it's geographically distributed... but if another player works where you are, then you may have the answer you're looking for. Firing up Wireshark and analyzing the protocol exchange could be an interesting exercise as well.
Afterthought: the extra slash in your path could also be confusing something, since an RTMP URL apparently consists of two distinct components, "application"/"stream_name", and the point of delineation may be ambiguous at some level to some component in the chain. If CloudFront thinks the "application" is "cfx" and the stream is "st/vid_test001", but the client assumes the "application" is "cfx/st" with stream name "vid_test001", there could be some potential for interoperability trouble. This is wild speculation, but perhaps worth experimenting with, too.
The embed parameter urlIncludesFMSApplicationInstance needs to be set to true, so the player knows the URL includes the application instance ("cfx/st") and splits the application name and stream name correctly.