I have a question about streaming the output of libx264. My scenario: I am capturing video from a webcam, encoding it with x264, and streaming the data to Flash, muxed as FLV. For muxing, I am using output/flv_bitstream.h, included in the libx264 bundle. The only modification I made to the muxer is that instead of fwrite() I am using send() to transfer the data over a socket. The encoding library is working fine: if I save the output (even muxed), VLC is able to play it. But when it comes to transferring the data over the socket, neither VLC nor Flash cooperates. The weird thing is that if I send the data to VLC through the socket, it waits until the transmission ends and then plays the video from its buffer. What I need is to play a live stream.
I also tried reading an FLV file and sending it to VLC or Flash tag by tag, and that works fine.
Any suggestions?
Implement a simple HTTP server and respond to incoming requests with:
"HTTP/1.0 200 OK\r\n"
"Pragma: no-cache\r\n"
"Content-Type: video/x-flv\r\n"
"\r\n"
Each response should then be followed by the raw FLV bit-stream.
This should enable live consumption of the content using e.g. VLC, Flowplayer, and others.
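For example, a minimal blocking, single-client responder could look like this (a sketch using POSIX sockets; get_next_flv_chunk() is a made-up placeholder for whatever your muxer produces in place of its fwrite() calls):

#include <sys/socket.h>
#include <netinet/in.h>
#include <unistd.h>
#include <cstdint>
#include <cstring>

extern size_t get_next_flv_chunk(uint8_t* buf, size_t cap); // your muxer output

int main() {
    int srv = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr{};
    addr.sin_family      = AF_INET;
    addr.sin_addr.s_addr = INADDR_ANY;
    addr.sin_port        = htons(8080);         // port is arbitrary
    bind(srv, (sockaddr*)&addr, sizeof(addr));
    listen(srv, 1);

    int client = accept(srv, nullptr, nullptr); // wait for the player
    char req[1024];
    recv(client, req, sizeof(req), 0);          // read (and ignore) the GET request

    static const char hdr[] =
        "HTTP/1.0 200 OK\r\n"
        "Pragma: no-cache\r\n"
        "Content-Type: video/x-flv\r\n"
        "\r\n";
    send(client, hdr, sizeof(hdr) - 1, 0);

    // Stream the FLV file header once, then every muxed tag as it is produced.
    uint8_t buf[65536];
    for (;;) {
        size_t n = get_next_flv_chunk(buf, sizeof(buf));
        if (n == 0 || send(client, buf, n, 0) <= 0) break; // done, or client gone
    }
    close(client);
    close(srv);
    return 0;
}

Because the response carries no Content-Length, the client simply reads until the connection closes, which is exactly what a live stream needs.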
Also, consider using 'url_open_dyn_buf'/'url_close_dyn_buf' rather than 'fwrite', see ffserver for reference.
Nadav at Sophin
I have live audio and video data that I can send as either an Ogg or a Matroska (WebM) stream. I also have dynamic metadata that will be sent from the server to the client and should be correlated with the A/V streams; for example, the exact server time when the audio packet was recorded.
I attempted to hack this with Kate encoding, but that appears to send all the subtitle data at once at the beginning, not dynamically as it happens. If you could tell me how to send well-correlated dynamic subtitle data, then that's a viable solution as well.
I need to stream WebM video to a browser from my video server.
The video server (C++) receives VP8-encoded frame packets of the webcam or screen from the client, with .ivf headers like <4_bytes_data_size><8_bytes_pts><vp8_encoded_data>. I also send 4 bytes with the total packet duration before the rest of the data, so the server knows the presentation timestamp, size, and duration of each frame.
The question is: which headers should I use for the frames so that the browser can play the stream in the <video> tag? Is there perhaps a standard for WebM real-time streaming that I should implement?
PS: AFAIK, WebM consists of EBML markup. If the same markup is what the <video> tag uses to parse the stream, could someone explain to me what the minimal set of EBML elements for video playback is (no audio, just video)?
The <video> tag does not support IVF. The minimum WebM requirement is whatever the minimum is to package your stream as WebM.
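Roughly, that packaging looks like the sketch below (element IDs are from the Matroska/WebM spec; the helper names are made up). Each EBML element is an ID, a variable-length (VINT) size, then a payload; for live streaming, the Segment and Cluster sizes are typically written as "unknown" because the total length is not known in advance:

#include <cstdint>
#include <vector>

std::vector<uint8_t> out; // the byte stream sent to the browser

void put_id(uint32_t id) {              // IDs are written as-is, MSB first
    bool started = false;
    for (int s = 24; s >= 0; s -= 8) {
        uint8_t b = (uint8_t)(id >> s);
        if (b || started || s == 0) { out.push_back(b); started = true; }
    }
}

void put_size(uint64_t size) {          // 1-byte VINT (1xxxxxxx), size < 127
    out.push_back((uint8_t)(0x80 | size));
}

void put_unknown_size() {               // "unknown": all size bits set
    out.push_back(0x01);
    for (int i = 0; i < 7; ++i) out.push_back(0xFF);
}

// Minimal element order for video-only playback:
//   EBML header (0x1A45DFA3), with DocType (0x4282) = "webm"
//   Segment (0x18538067), written with *unknown* size for live use
//     Info (0x1549A966), with TimecodeScale (0x2AD7B1), e.g. 1000000 (= ms)
//     Tracks (0x1654AE6B), with one TrackEntry (0xAE):
//       TrackNumber (0xD7) = 1, TrackType (0x83) = 1 (video),
//       CodecID (0x86) = "V_VP8", Video (0xE0) with
//       PixelWidth (0xB0) and PixelHeight (0xBA)
//   then one Cluster (0x1F43B675) per group of frames, containing:
//     Timecode (0xE7), the cluster's base timestamp
//     one SimpleBlock (0xA3) per frame: track number as a VINT (0x81),
//       a 16-bit timecode relative to the cluster, a flags byte
//       (0x80 = keyframe), then the raw VP8 frame

In other words, you drop the IVF framing entirely and wrap each VP8 frame in a SimpleBlock; the size/pts/duration fields you already send give you everything the Cluster timecodes need.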
I have written an application which triggers an IP camera to stream its data (MPEG-4) over RTP. This works fine so far: I set up and start the stream with the corresponding RTSP commands (DESCRIBE, SETUP, and PLAY).
While streaming, I receive the usual Sender Reports and send my own Receiver Reports; everything works fine here.
Now, with the application mentioned above, I do NOT read the stream. I have a separate piece of hardware which just logs everything going over the Ethernet (a little like Wireshark). When the whole streaming session is finished, I can download those logs from the hardware and extract data from them.
So what I have then is a logfile with all the data from the RTP stream as raw data.
My question now is: how do I write this appropriately into an MPEG-4 file? I know this is a very broad question and I don't expect a step-by-step tutorial, but I am a bit overwhelmed and don't know where to start. If I just memcpy all the payload from the RTP messages sequentially into an MPEG-4 file, it doesn't work. I am also a bit confused by SDP and related things.
Well, maybe someone has a link or some help for me?
You should first read RFC 3016, which describes the RTP payload format for MPEG-4 streams; then you'll know how to extract MPEG-4 frames from the RTP stream.
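The first step is just peeling the RTP header (RFC 3550) off every logged packet. A sketch, assuming you have already extracted the UDP payloads from your logs (the struct and function names are made up):

#include <cstdint>
#include <cstddef>

struct RtpPayload {
    const uint8_t* data;   // start of the MPEG-4 payload
    size_t size;
    uint32_t timestamp;    // packets sharing a timestamp form one frame (VOP)
    bool marker;           // set on the last packet of a frame
};

bool parse_rtp(const uint8_t* pkt, size_t len, RtpPayload& out) {
    if (len < 12 || (pkt[0] >> 6) != 2) return false;  // RTP version must be 2
    size_t offset = 12 + 4 * (size_t)(pkt[0] & 0x0F);  // fixed header + CSRCs
    if (pkt[0] & 0x10) {                               // header extension?
        if (len < offset + 4) return false;
        offset += 4 + 4 * (size_t)((pkt[offset + 2] << 8) | pkt[offset + 3]);
    }
    size_t end = len;
    if (pkt[0] & 0x20) {                               // padding present?
        if (pkt[len - 1] >= end) return false;
        end -= pkt[len - 1];
    }
    if (end <= offset) return false;
    out.marker    = (pkt[1] & 0x80) != 0;
    out.timestamp = ((uint32_t)pkt[4] << 24) | ((uint32_t)pkt[5] << 16)
                  | ((uint32_t)pkt[6] << 8)  |  (uint32_t)pkt[7];
    out.data = pkt + offset;
    out.size = end - offset;
    return true;
}

Concatenate the payloads of all packets sharing a timestamp (in sequence-number order) to reconstruct each frame. The decoder configuration (the VOL/VOS headers) is carried as the hex "config" parameter on the fmtp line of the SDP and must be prepended once at the start of the file, which is why the SDP matters here.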
I ended up switching from MPEG-4 to H.264; it was actually a little easier to write a video file that way. For H.264, this answer covers it pretty well:
How to process raw UDP packets so that they can be decoded by a decoder filter in a directshow source filter
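The gist for H.264 (RFC 6184) is to turn each RTP payload back into an Annex B byte stream by restoring start codes. A sketch covering the two payload types most cameras use, single NAL units and FU-A fragments:

#include <cstdint>
#include <cstddef>
#include <vector>

void rtp_h264_to_annexb(const uint8_t* pl, size_t n, std::vector<uint8_t>& es) {
    static const uint8_t start_code[4] = {0, 0, 0, 1};
    if (n < 2) return;
    uint8_t nal_type = pl[0] & 0x1F;
    if (nal_type >= 1 && nal_type <= 23) {       // single NAL unit packet
        es.insert(es.end(), start_code, start_code + 4);
        es.insert(es.end(), pl, pl + n);
    } else if (nal_type == 28) {                 // FU-A fragment
        if (pl[1] & 0x80) {                      // start bit: first fragment
            es.insert(es.end(), start_code, start_code + 4);
            // rebuild the NAL header from the FU indicator and FU header
            es.push_back((uint8_t)((pl[0] & 0xE0) | (pl[1] & 0x1F)));
        }
        es.insert(es.end(), pl + 2, pl + n);     // append fragment payload
    }                                            // (STAP-A etc. omitted)
}

The SPS and PPS usually travel out of band, Base64-coded in the SDP's sprop-parameter-sets attribute, and have to be written (each behind a start code) at the head of the file before the first frame.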
Totally new to this! As the title says, I'm trying to serve a stream from OpenCV through Live555 using H.264 that is captured from a webcam.
I've tried something like:
#define LOCALADDRESS "rtsp://localhost:8081" // Address media is served
#define FOURCCCODEC CV_FOURCC('H','2','6','4') // H.264 codec
#define FPS 25 // Frame rate things run at
#define VIDEOWIDTH 640 // Frame dimensions (example values)
#define VIDEOHEIGHT 480
m_writer = cvCreateVideoWriter(LOCALADDRESS, FOURCCCODEC, FPS, cvSize(VIDEOWIDTH, VIDEOHEIGHT));
since reading an RTSP stream is done similarly:
CvCapture *capture = cvCreateFileCapture(LOCALADDRESS);
which doesn't work, so I'm turning to Live555. How do I feed a CvCapture, encoded in H.264, to be served by Live555? There doesn't seem to be a straightforward way to pipe the bytestream from one to the other, or perhaps I'm missing something.
There really isn't a straight-forward way I know of; certainly nothing that will happen in anything less than a few hundred lines of code.
I'm assuming you want to use an on-demand RTSP server (this is where the server's just sitting there, waiting for a client to connect, and then it starts streaming when the client establishes a connection and makes a request)? If so, this item in the Live555 FAQ applies.
However, Live555 is a weird (possibly misguided?) library, so it's unfortunately a bit more complicated than that. Live555 uses a single thread of operation with an event loop, so what you'll have to do is shove your raw bytestream into a buffer or queue, and then, in your subsession class for streaming H.264, check whether there's data available in the queue and, if so, pass it along; if not, schedule another check a few milliseconds later. You'll also need to strip off the NALU start codes (the 00 00 00 01 prefixes) before you pass the units along to Live555.
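A rough sketch of that pattern, assuming a FramedSource subclass that pulls H.264 NAL units from a queue filled by your capture/encode thread (H264QueueSource and g_nalQueue are made-up names; fTo, fMaxSize, fFrameSize, fPresentationTime and the scheduler calls are Live555 API):

#include <liveMedia.hh>
#include <sys/time.h>
#include <cstring>
#include <deque>
#include <vector>

extern std::deque<std::vector<uint8_t>> g_nalQueue; // filled by the encoder thread

class H264QueueSource : public FramedSource {
public:
    static H264QueueSource* createNew(UsageEnvironment& env) {
        return new H264QueueSource(env);
    }
protected:
    H264QueueSource(UsageEnvironment& env) : FramedSource(env) {}
private:
    void doGetNextFrame() override {
        if (g_nalQueue.empty()) {       // no data yet: poll again in ~5 ms
            nextTask() = envir().taskScheduler().scheduleDelayedTask(
                5000, (TaskFunc*)pollQueue, this);
            return;
        }
        std::vector<uint8_t>& nal = g_nalQueue.front();
        // Start codes (00 00 00 01) must already be stripped from 'nal';
        // real code should also set fNumTruncatedBytes when truncating.
        fFrameSize = nal.size() > fMaxSize ? fMaxSize : (unsigned)nal.size();
        memcpy(fTo, nal.data(), fFrameSize);
        g_nalQueue.pop_front();
        gettimeofday(&fPresentationTime, nullptr);
        FramedSource::afterGetting(this); // deliver the frame downstream
    }
    static void pollQueue(void* clientData) {
        ((H264QueueSource*)clientData)->doGetNextFrame();
    }
};

Note that access to the queue itself still needs to be thread-safe; a cleaner alternative to polling is the event-trigger mechanism (taskScheduler().triggerEvent()) demonstrated in the DeviceSource.cpp example that ships with Live555.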
We are interested in saving an H.264 stream and replaying it. Has anyone had experience saving H.264 using WinPcap and replaying it? We were able to save and replay H.263, but the same logic does not work for H.264.
We also tried the rtpdump tool to save the H.264 stream, but we were unable to replay it in that format.
Thanks in advance.
An H.264 stream is usually sent as a Transport Stream (TS). If you want to save it to a file, you need to demux it and then remux it into a format suitable for file storage, for example MP4.
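If FFmpeg's libraries are available to you, that demux/remux step follows the standard libavformat remuxing pattern; a sketch with minimal error handling (the codec data is copied, nothing is re-encoded):

extern "C" {
#include <libavformat/avformat.h>
}

int remux_ts_to_mp4(const char* in_name, const char* out_name) {
    AVFormatContext* in = nullptr;
    if (avformat_open_input(&in, in_name, nullptr, nullptr) < 0) return -1;
    if (avformat_find_stream_info(in, nullptr) < 0) return -1;

    AVFormatContext* out = nullptr;
    avformat_alloc_output_context2(&out, nullptr, nullptr, out_name);
    if (!out) return -1;

    for (unsigned i = 0; i < in->nb_streams; ++i) {   // mirror every stream
        AVStream* s = avformat_new_stream(out, nullptr);
        avcodec_parameters_copy(s->codecpar, in->streams[i]->codecpar);
        s->codecpar->codec_tag = 0;
    }
    if (avio_open(&out->pb, out_name, AVIO_FLAG_WRITE) < 0) return -1;
    if (avformat_write_header(out, nullptr) < 0) return -1;

    AVPacket* pkt = av_packet_alloc();
    while (av_read_frame(in, pkt) >= 0) {
        // Convert timestamps between the two containers' time bases.
        av_packet_rescale_ts(pkt, in->streams[pkt->stream_index]->time_base,
                             out->streams[pkt->stream_index]->time_base);
        av_interleaved_write_frame(out, pkt);
        av_packet_unref(pkt);
    }
    av_packet_free(&pkt);
    av_write_trailer(out);
    avio_closep(&out->pb);
    avformat_free_context(out);
    avformat_close_input(&in);
    return 0;
}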
You will probably need to disable B-frames in your encoder; saving an RTP H.264 stream didn't work for me with B-frames enabled.
I also advise using a low keyint value, because the dump will only be decodable from the first keyframe onward.
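In libx264 terms that is (the field names are from x264.h; the surrounding setup is illustrative):

#include <x264.h>

x264_t* open_encoder(int width, int height) {
    x264_param_t param;
    x264_param_default_preset(&param, "veryfast", "zerolatency");
    param.i_width      = width;
    param.i_height     = height;
    param.i_bframe     = 0;    // no B-frames, so packets arrive in decode order
    param.i_keyint_max = 25;   // a keyframe at least once per second at 25 fps
    return x264_encoder_open(&param);
}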
You can use VLC to save the incoming stream with this command:
vlc -I rc rtp://@:4444 :sout=#std{access=file,mux=mp4,dst=output.mp4} :ipv4
Replace 4444 with the port number.