I will have a raw H.264 stream in a pipe, and I have to serve it over RTSP. How can I do this with Python and GStreamer? Currently I am able to play the stream with ffmpeg:
ffmpeg = subprocess.Popen(
    ["ffmpeg", "-i", "-", "-f", "rawvideo", "-vcodec", "rawvideo",
     "-pix_fmt", "rgb24", "pipe:1"],
    stdin=f, stdout=subprocess.PIPE)
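One way to serve the stream is GStreamer's RTSP server bindings (gst-rtsp-server). Below is a minimal sketch; the source path `/tmp/video.h264` and the mount point `/stream` are assumptions, neither comes from the question.

```python
# Minimal RTSP server sketch using the GstRtspServer Python bindings.
# Assumptions: the H.264 bytestream is readable at /tmp/video.h264,
# and /stream is an arbitrary mount point.

def make_launch(source_path="/tmp/video.h264"):
    """Build the gst-launch-style pipeline description for the media factory."""
    return ("( filesrc location=%s ! h264parse ! rtph264pay name=pay0 pt=96 )"
            % source_path)

def main():
    import gi
    gi.require_version("Gst", "1.0")
    gi.require_version("GstRtspServer", "1.0")
    from gi.repository import Gst, GstRtspServer, GLib

    Gst.init(None)
    server = GstRtspServer.RTSPServer()        # listens on port 8554 by default
    factory = GstRtspServer.RTSPMediaFactory()
    factory.set_launch(make_launch())
    factory.set_shared(True)                   # one pipeline shared by all clients
    server.get_mount_points().add_factory("/stream", factory)
    server.attach(None)
    GLib.MainLoop().run()                      # serves rtsp://<host>:8554/stream
```

Calling `main()` starts the server. For a true named pipe rather than a regular file, `filesrc` may fail to preroll; swapping in `fdsrc fd=...` or an `appsrc` fed from Python is a common variation.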
I'm using the FFmpeg libraries for live streaming over RTMP. I want to know how to specify my choice of audio and video codec for a particular format in avformat_alloc_output_context2.
In Detail:
The following command works perfectly for me.
ffmpeg -re -stream_loop -1 -i ~/Downloads/Microsoft_Surface.mp4 -vcodec copy -c:a aac -b:a 160k -ar 44100 -strict -2 -f flv -flvflags no_duration_filesize rtmp://192.168.1.7/live/surface
In the output, I have set my audio codec to be aac and copied the video codec from input, which is H264.
I want to emulate this in the library, but don't know how to.
avformat_alloc_output_context2(&_ctx, NULL, "flv", NULL);
The code above sets the oformat audio codec to ADPCM_SWF and the video codec to FLV1. How do I change those to AAC and H264?
So far I have used av_guess_format to construct an AVOutputFormat, but it accepts only the format as input, and I don't know where to specify the audio and video codecs.
AVOutputFormat* output_format = av_guess_format("flv", NULL, NULL);
I also tried giving a filename to avformat_alloc_output_context2, with the rest of the parameters NULL.
AVOutputFormat* output_format = av_guess_format(NULL, "flv_acc_sample.flv", NULL);
This file has AAC audio and H264 video, but FFmpeg still loads the oformat with the ADPCM_SWF audio and FLV1 video codecs.
I searched Stack Overflow for similar questions, but could not find the solution I was looking for.
Any hint or guidance is hugely appreciated. Thank you.
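For what it's worth, the codec fields on AVOutputFormat are only the muxer's defaults; the codecs actually written come from the streams you create with avformat_new_stream after allocating the context, by setting each stream's codecpar->codec_id (e.g. AV_CODEC_ID_AAC and AV_CODEC_ID_H264), so there is no need to change oformat itself. As a cross-check outside the C API, the working CLI invocation above can be assembled from explicit codec choices; `build_flv_command` below is a hypothetical helper, and the paths and URL are placeholders.

```python
# Sketch: reproduce the working ffmpeg CLI invocation from explicit codec
# choices. build_flv_command is a hypothetical helper; the input path and
# RTMP URL passed to it are placeholders.

def build_flv_command(input_path, rtmp_url, vcodec="copy", acodec="aac",
                      abitrate="160k", arate="44100"):
    """Assemble the argument list for an FLV/RTMP push with chosen codecs."""
    return [
        "ffmpeg",
        "-re", "-stream_loop", "-1",   # loop the input in real time
        "-i", input_path,
        "-vcodec", vcodec,             # copy the H.264 video as-is
        "-c:a", acodec,                # encode the audio to AAC
        "-b:a", abitrate,
        "-ar", arate,
        "-f", "flv",
        "-flvflags", "no_duration_filesize",
        rtmp_url,
    ]
```

The list can be handed straight to `subprocess.run`; in the C API the analogous decision point is the codec_id you assign per stream, not the oformat defaults.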
I’m developing an app that needs to clone an MP4 video file with all of its streams using the FFmpeg C++ API, and I have successfully made it work based on the FFmpeg remuxing example.
This works great for video and audio streams, but when the video includes a data stream (actually a QuickTime time code, according to MediaInfo) I get this error:
Output #0, mp4, to 'C:\Users\user\Desktop\shortOut.mp4':
Stream #0:0: Video: hevc (Main 10) (hev1 / 0x31766568), yuv420p10le(tv,progressive), 3840x2160 [SAR 1:1 DAR 16:9], q=2-31, 1208 kb/s
Stream #0:1: Audio: mp3 (mp4a / 0x6134706D), 48000 Hz, stereo, s16p, 32s
Stream #0:2: Data: none (tmcd / 0x64636D74), 0 kb/s
[mp4 @ 0000000071edf600] Could not find tag for codec none in stream #2, codec not currently supported in container
I’ve found this happens in the call to avformat_write_header().
It makes sense that if FFmpeg doesn’t know the codec it can’t write it into the header, but I found that using the ffmpeg command line I can make it work perfectly by stream-copying, something like:
ffmpeg -i input.mp4 -c:v copy -c:a copy -c:d copy output.mp4
I have been analyzing the ffmpeg.c implementation to try to understand how it does a stream copy, but following the huge pipeline has been very painful.
What would be the proper way to remux a data stream of this type with the FFmpeg C++ API? Any tips or pointers?
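Before diving deeper into ffmpeg.c, it may help to pin down the CLI behavior being emulated: a commonly suggested form for carrying data and unknown streams through a remux is explicit `-map 0` plus `-copy_unknown`. The sketch below just assembles that invocation; `build_remux_command` is a hypothetical helper and the paths are placeholders.

```python
# Sketch: CLI reference point for a full remux that keeps every stream,
# including the tmcd data track. Helper name and paths are placeholders.

def build_remux_command(input_path, output_path):
    """Assemble an ffmpeg argument list that stream-copies all streams."""
    return [
        "ffmpeg",
        "-i", input_path,
        "-map", "0",        # map every input stream, including data streams
        "-c", "copy",       # stream-copy all of them, no re-encode
        "-copy_unknown",    # do not drop streams whose codec is unknown
        output_path,
    ]
```

Comparing what this command produces against the API output (e.g. with ffprobe) narrows down which step the remuxing-example port is missing.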
I can open an RTSP video stream using VideoCapture and read the frames.
This is the URL of the RTSP stream:
rtsp://username:password@xxx.xxx.xxx.xxx:554/Streaming/Channels/102
Now I want to send/write the images/Mats back out to an output stream (an RTMP stream over the LAN).
I have already set up the RTMP server (NGINX), and as a test I downloaded FFmpeg; when I run the following command (in CMD) it works well, successfully reading the stream and writing it to the NGINX server.
ffmpeg -i rtsp://username:password@xxx.xxx.xxx.xxx:554/Streaming/Channels/102 -vcodec copy -acodec copy -f flv rtmp://xxx.xxx.xxx.xxx:1395/mylive/test
And now if I put this rtmp://xxx.xxx.xxx.xxx:1395/mylive/test URL into the video tag of an HTML page, the video can be opened.
So the question is: how can I push the images over RTMP (to NGINX) from my code after processing?
Any suggestions? Thanks in advance!
Edit:
I used the VideoWriter class in the following ways, but none of them worked:
VideoWriter writer = new VideoWriter();
writer.open("rtmp://xxx.xxx.xxx.xxx:1395/mylive/test", CAP_FFMPEG, 0, 25,
        new Size(640, 480));
// writer.open("ffmpeg -vcodec copy -acodec copy -f flv " +
//         "rtmp://xxx.xxx.xxx.xxx:1395/mylive/test", CAP_FFMPEG, 0, 25,
//         new Size(640, 480));
// or without CAP_FFMPEG
if (!writer.isOpened()) {
    System.out.println("open error");
}
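A common alternative to VideoWriter is to pipe raw frames into an ffmpeg child process that pushes FLV to the NGINX RTMP endpoint, the same subprocess pattern used elsewhere in this document. A Python sketch follows; the frame size, rate, encoder settings, and URL are placeholder assumptions.

```python
import subprocess

# Sketch: push processed frames to an RTMP (NGINX) endpoint by piping raw
# BGR frames into an ffmpeg child process. Size, rate, encoder settings,
# and the URL are placeholder assumptions.

def build_push_command(rtmp_url, width=640, height=480, fps=25):
    """Assemble the ffmpeg argument list for an stdin-fed RTMP push."""
    return [
        "ffmpeg",
        "-f", "rawvideo",                 # stdin carries raw, headerless frames
        "-pix_fmt", "bgr24",              # the default OpenCV Mat byte layout
        "-s", "%dx%d" % (width, height),
        "-r", str(fps),
        "-i", "-",                        # read those frames from stdin
        "-c:v", "libx264",                # encode to H.264 for FLV/RTMP
        "-preset", "veryfast",
        "-f", "flv",
        rtmp_url,
    ]

def start_rtmp_pusher(rtmp_url, **kwargs):
    """Launch ffmpeg; write each frame's bytes to .stdin afterwards."""
    return subprocess.Popen(build_push_command(rtmp_url, **kwargs),
                            stdin=subprocess.PIPE)
```

After `pusher = start_rtmp_pusher(...)`, each processed frame (a height x width x 3 uint8 image) is sent with `pusher.stdin.write(frame.tobytes())`; the same pattern maps to Java by handing the identical argument list to ProcessBuilder.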
I have been able to stream video with live555 on its own, and audio with live555 on its own.
But I want the video and audio to play together in the same VLC session. My video is H264-encoded and my audio is AAC-encoded. What do I need to do to pass these packets into a FramedSource?
Which MediaSubsession/DeviceSource do I override, given that this is live video/audio rather than a fixed file?
Thanks in advance!
In order to stream video/H264 and audio/MPEG4-GENERIC in the same RTSP unicast session, you should do something like:
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"
int main()
{
    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
    BasicUsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);
    RTSPServer* rtspServer = RTSPServer::createNew(*env);
    ServerMediaSession* sms = ServerMediaSession::createNew(*env);
    sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(*env, "test.264", false));
    sms->addSubsession(ADTSAudioFileServerMediaSubsession::createNew(*env, "test.aac", false));
    rtspServer->addServerMediaSession(sms);
    env->taskScheduler().doEventLoop(); // does not return; serves clients forever
    return 0;
}
I encode H.264 data with libavcodec, for example:
while (1) {
...
avcodec_encode_video(pEnc->pCtx, OutBuf, ENC_OUTSIZE, pEnc->pYUVFrame);
...
}
If I directly save the OutBuf data as a .264 file, it can't be played by a player. Now I want to save OutBuf as an MP4 file. Does anyone know how to do this with the FFmpeg libraries? Thanks.
You use avformat_write_header, av_interleaved_write_frame, avformat_write_trailer, and friends.
Their usage is shown in FFmpeg's muxing example.
See a similar topic: Raw H264 frames in mpegts container using libavcodec, which also covers writing to a file (different container, same API).
See also links from answer here: FFMpeg encoding RGB images to H264
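If the goal is simply a playable MP4 from the raw bytestream, the same muxing can also be done from the CLI without re-encoding; the sketch below only builds that command. The helper name and paths are placeholders, and the frame rate is an assumption, since a raw H.264 bytestream carries no timestamps.

```python
# Sketch: wrap a raw Annex-B H.264 bytestream in an MP4 container via the
# ffmpeg CLI, no re-encode. Helper name, paths, and fps are placeholders.

def build_mp4_wrap_command(h264_path, mp4_path, fps=25):
    """Assemble an ffmpeg argument list that boxes raw H.264 into MP4."""
    return [
        "ffmpeg",
        "-f", "h264",       # treat the input as a raw Annex-B bytestream
        "-r", str(fps),     # assumed rate; raw H.264 has no timestamps
        "-i", h264_path,
        "-c:v", "copy",     # just repackage the packets, no re-encode
        mp4_path,
    ]
```

Running the command with `subprocess.run` produces the MP4; doing the equivalent in C means feeding the encoder's packets through the avformat_write_header / av_interleaved_write_frame / avformat_write_trailer sequence named above.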