GStreamer: send 16-bit raw video over RTP - C++

I have a 16-bit greyscale video stream from an LWIR (thermal) camera, and I want to forward the stream over RTP without any compression.
The GStreamer caps are: video/x-raw,format=GRAY16_LE,width=640,height=520,framerate=9/1
But I can't find any plugin that can transmit this data over RTP.
https://gstreamer.freedesktop.org/documentation/rtp/index.html?gi-language=c
Does anyone have an idea?
Thanks, Martin

Check the specification for uncompressed video data over RTP:
https://www.rfc-editor.org/rfc/rfc4175
As you will notice, your specific format is not covered by that specification.
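One workaround that is sometimes used: GRAY16_LE and UYVY are both 16 bits per pixel, so you can relabel the caps with capssetter so that rtpvrawpay (the RFC 4175 payloader) accepts the buffers unchanged, and relabel them back after rtpvrawdepay on the receiver. A hedged C++ sketch of the sender; v4l2src and the host/port are placeholder assumptions:

```cpp
#include <gst/gst.h>

int main(int argc, char* argv[]) {
    gst_init(&argc, &argv);

    // GRAY16_LE and UYVY are both 16 bpp, so the buffers are
    // byte-compatible; capssetter only relabels them so that
    // rtpvrawpay accepts the stream. Source, host and port are
    // placeholders.
    GError* err = nullptr;
    GstElement* pipeline = gst_parse_launch(
        "v4l2src ! "
        "video/x-raw,format=GRAY16_LE,width=640,height=520,framerate=9/1 ! "
        "capssetter replace=true "
        "caps=\"video/x-raw,format=UYVY,width=640,height=520,framerate=9/1\" ! "
        "rtpvrawpay ! udpsink host=192.168.0.2 port=5000",
        &err);
    if (!pipeline) {
        g_printerr("Pipeline error: %s\n", err->message);
        g_clear_error(&err);
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    GstBus* bus = gst_element_get_bus(pipeline);
    GstMessage* msg = gst_bus_timed_pop_filtered(
        bus, GST_CLOCK_TIME_NONE,
        (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
    if (msg) gst_message_unref(msg);
    gst_object_unref(bus);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}
```

On the receiving side, the inverse capssetter after rtpvrawdepay restores GRAY16_LE.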

Related

Is it possible to use libx264 to convert H264 raw data to an image (PNG/JPEG) without ffmpeg?

I receive video data via RTP/RTSP/SIP; the data is H.264-encoded and sent by an IP camera. I would like to convert H.264 keyframe data into a picture and analyze whether it contains faces. I don't want to use a huge library like FFmpeg; can I do it with just libx264 and OpenCV? How?
Thanks.
No, that's not possible. x264 cannot decode (it is an H.264 encoder only), and it cannot encode JPEG/PNG either. FFmpeg is what you need. If it is too large, do a custom compile including only the features you need, and link statically so unused functions are stripped out.
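If FFmpeg's size is the concern, the decode step itself is small. A hedged sketch with libavcodec's send/receive API (newer than the avcodec_decode_video2 of that era), assuming Annex-B input that includes SPS/PPS; error handling is abbreviated:

```cpp
extern "C" {
#include <libavcodec/avcodec.h>
}

// Decodes one Annex-B H.264 access unit (e.g. a keyframe) into an
// AVFrame. A real program would keep the codec context alive across
// frames; it is recreated here only for brevity.
AVFrame* decode_h264(const uint8_t* data, int size) {
    const AVCodec* codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    AVCodecContext* ctx = avcodec_alloc_context3(codec);
    avcodec_open2(ctx, codec, nullptr);

    AVPacket* pkt = av_packet_alloc();
    pkt->data = const_cast<uint8_t*>(data);
    pkt->size = size;

    AVFrame* frame = av_frame_alloc();
    int ret = avcodec_send_packet(ctx, pkt);
    if (ret >= 0) ret = avcodec_receive_frame(ctx, frame);

    av_packet_free(&pkt);
    avcodec_free_context(&ctx);
    if (ret < 0) { av_frame_free(&frame); return nullptr; }
    return frame; // YUV; convert with sws_scale before OpenCV analysis
}
```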

Why can't my app decode the RTSP stream?

I use live555 to receive RTP video frames (encoded in H.264); live555 opens my local .sdp file to receive the frame data. I can see DummySink::afterGettingFrame being called ceaselessly. If fReceiveBuffer in DummySink is correct, why can't FFmpeg decode the frame? Is my code wrong?
Here is my code snippet:
http://paste.ubuntu.com/12529740/
The function avcodec_decode_video2 always fails; its return value is less than zero.
Does fReceiveBuffer contain exactly one video frame?
Here is my FFmpeg init code that opens the video decoder:
http://paste.ubuntu.com/12529760/
I read the H.264 documentation again and found that an I-frame (IDR) needs the SPS/PPS, each preceded by a 0x00000001 start code, inserted before it so that the decoder can decode the frame correctly. Here are related solutions:
FFmpeg can't decode H264 stream/frame data
Decoding h264 frames from RTP stream
My app now works fine; it decodes the frames and converts them to an OSD image for display on screen.
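For reference, a minimal sketch of the fix described above: prepend the SPS and PPS NAL units, each preceded by an Annex-B 0x00000001 start code, to the IDR data before handing it to the decoder. The buffer names are illustrative; with live555 the SPS/PPS typically come from the SDP's sprop-parameter-sets attribute.

```cpp
#include <cstdint>
#include <vector>

static const uint8_t kStartCode[4] = {0x00, 0x00, 0x00, 0x01};

// Builds start-code + SPS + start-code + PPS + start-code + IDR, which
// FFmpeg can decode as a self-contained Annex-B access unit.
std::vector<uint8_t> build_decodable_idr(
    const uint8_t* sps, size_t spsLen,
    const uint8_t* pps, size_t ppsLen,
    const uint8_t* frame, size_t frameLen) {
  std::vector<uint8_t> out;
  out.reserve(12 + spsLen + ppsLen + frameLen);
  auto append = [&out](const uint8_t* p, size_t n) {
    out.insert(out.end(), kStartCode, kStartCode + 4);
    out.insert(out.end(), p, p + n);
  };
  append(sps, spsLen);      // sequence parameter set
  append(pps, ppsLen);      // picture parameter set
  append(frame, frameLen);  // the IDR NAL unit from fReceiveBuffer
  return out;
}
```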

FFmpeg: get JPEG data buffer from MJPEG stream?

I'm using FFmpeg to decode a video stream from an IP camera. I have example code that can decode a video stream with any codec into YUV frames.
But my case is special, as I'll describe below.
The IP camera stream is MJPEG, and I want to decode it with FFmpeg, but I don't want to decode the frames into YUV; I want to get the frames in JPEG format and save those JPEG buffers into image files (*.jpg).
So far I can do this by converting the YUV frame (after decoding) to JPEG, but that performs badly. Since the video stream is MJPEG, I think I can get the JPEG data before it is decoded to YUV, but I don't know how to do it.
Can someone help me?
Many thanks,
T&T
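The idea in the question is sound: for an MJPEG stream, each demuxed AVPacket payload is already a complete JPEG image, so it can be written straight to a .jpg file with no decoding at all. A hedged libavformat sketch; the RTSP URL is a placeholder:

```cpp
extern "C" {
#include <libavformat/avformat.h>
}
#include <cstdio>

int main() {
    avformat_network_init();

    // Placeholder URL for the camera's MJPEG stream.
    AVFormatContext* fmt = nullptr;
    if (avformat_open_input(&fmt, "rtsp://camera/mjpeg", nullptr, nullptr) < 0)
        return 1;
    avformat_find_stream_info(fmt, nullptr);

    int video = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1,
                                    nullptr, 0);

    AVPacket* pkt = av_packet_alloc();
    int saved = 0;
    while (saved < 10 && av_read_frame(fmt, pkt) >= 0) {
        if (pkt->stream_index == video) {
            // For MJPEG, the packet payload is already a JPEG image.
            char name[32];
            std::snprintf(name, sizeof(name), "frame%03d.jpg", saved++);
            FILE* f = std::fopen(name, "wb");
            std::fwrite(pkt->data, 1, pkt->size, f);
            std::fclose(f);
        }
        av_packet_unref(pkt);
    }
    av_packet_free(&pkt);
    avformat_close_input(&fmt);
    return 0;
}
```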

How to use live555 for streaming media forwarding

I use the live555 H.264 stream client to get frame packets from an IP camera; I use FFmpeg to decode the buffers and OpenCV to analyze the frames. (The pipeline is based on the testRTSPClient sample; I decode the H.264 frame buffer in DummySink::afterGettingFrame() with FFmpeg.)
Now I want to stream the frames to another (remote) client in on-demand mode in real time. The frames may have analysis results added (bounding boxes, text, etc.). How can I use live555 to achieve this?
Well, your best bet is to re-encode the resulting frames (with bounding boxes etc.) and pass them to an RTSPServer process, which will allow clients to connect via an RTSP URL and stream the encoded data to any compatible RTSP client. There is a good reference in the FAQ on how to do this, http://www.live555.com/liveMedia/faq.html#liveInput, which walks you through the steps and provides example source code that you can modify for your needs.
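As a rough illustration of that server side, here is a minimal live555 skeleton. FrameQueueSource is hypothetical: a placeholder for your own FramedSource subclass (the DeviceSource pattern from the FAQ linked above) that delivers the re-encoded H.264 NAL units.

```cpp
#include <liveMedia.hh>
#include <BasicUsageEnvironment.hh>

// On-demand subsession that serves live, re-encoded H.264 frames.
class LiveH264Subsession : public OnDemandServerMediaSubsession {
public:
  static LiveH264Subsession* createNew(UsageEnvironment& env) {
    return new LiveH264Subsession(env);
  }
protected:
  LiveH264Subsession(UsageEnvironment& env)
      : OnDemandServerMediaSubsession(env, True /*reuseFirstSource*/) {}

  FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                      unsigned& estBitrate) override {
    estBitrate = 2000; // kbps, illustrative
    // FrameQueueSource is a hypothetical DeviceSource-style class
    // that hands out your encoder's NAL units.
    FramedSource* src = FrameQueueSource::createNew(envir());
    return H264VideoStreamDiscreteFramer::createNew(envir(), src);
  }

  RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                            unsigned char rtpPayloadTypeIfDynamic,
                            FramedSource* /*inputSource*/) override {
    return H264VideoRTPSink::createNew(envir(), rtpGroupsock,
                                       rtpPayloadTypeIfDynamic);
  }
};

int main() {
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  RTSPServer* server = RTSPServer::createNew(*env, 8554);
  if (server == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    return 1;
  }

  ServerMediaSession* sms = ServerMediaSession::createNew(
      *env, "analysis", "analysis", "Re-encoded stream with overlays");
  sms->addSubsession(LiveH264Subsession::createNew(*env));
  server->addServerMediaSession(sms);

  char* url = server->rtspURL(sms);
  *env << "Play this stream at " << url << "\n";
  delete[] url;

  env->taskScheduler().doEventLoop(); // does not return
  return 0;
}
```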

FFMpeg encoding RGB images to H264

I'm developing a DirectShow filter that has two input pins (one for audio, one for video). I'm using libavcodec/libavformat/libavutil from FFmpeg to encode the video to H.264 and the audio to AAC, and to mux/stream the result using RTP. So far I have been able to encode video and audio correctly using libavcodec, but now I see that FFmpeg seems to support RTP muxing too. Unfortunately, I can't find any example code showing how to perform H.264 encoding and RTP muxing. Does anybody know of good samples?
Try checking out the code in HandBrake, specifically the file muxmp4.c, which was a gem I found when working with FFmpeg/RTP. Be sure to use av_interleaved_write_frame() and the extradata fields correctly; those were some key differences I remember for RTP.
Still, I had some stability issues with RTP/RTSP in FFmpeg (I'm sure it's getting better). I had much better luck with live555, and you can look at the code in VLC and MPlayer for good examples of how to use it.
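A hedged sketch of the RTP-muxing half with libavformat, following the advice above: copy the encoder's extradata into the stream via avcodec_parameters_from_context(), then push packets with av_interleaved_write_frame(). Here `enc` is assumed to be an already-opened H.264 AVCodecContext, and the destination address is a placeholder.

```cpp
extern "C" {
#include <libavformat/avformat.h>
}

AVFormatContext* open_rtp_output(AVCodecContext* enc) {
    AVFormatContext* oc = nullptr;
    avformat_alloc_output_context2(&oc, nullptr, "rtp",
                                   "rtp://203.0.113.1:5004");

    AVStream* st = avformat_new_stream(oc, nullptr);
    avcodec_parameters_from_context(st->codecpar, enc); // carries extradata
    st->time_base = enc->time_base;

    if (!(oc->oformat->flags & AVFMT_NOFILE))
        avio_open(&oc->pb, oc->url, AVIO_FLAG_WRITE);
    avformat_write_header(oc, nullptr);

    // Receivers need an SDP describing the session:
    char sdp[2048];
    av_sdp_create(&oc, 1, sdp, sizeof(sdp));
    // ... hand `sdp` to the client (e.g. write it to a .sdp file)

    return oc;
}

// Per encoded packet:
//   av_packet_rescale_ts(pkt, enc->time_base, st->time_base);
//   av_interleaved_write_frame(oc, pkt);
```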