I have been able to stream video on its own using live555, and likewise audio on its own using live555.
But I want the video and audio to play in the same VLC session. My video is H.264 encoded and my audio is AAC encoded. What do I need to do to pass these packets into a FramedSource?
Which MediaSubsession/DeviceSource do I override, given that this is not a fixed file but live video/live audio?
Thanks in advance!
In order to stream video/H264 and audio/MPEG4-GENERIC in the same RTSP unicast session, you should do something like:
#include "liveMedia.hh"
#include "BasicUsageEnvironment.hh"

int main()
{
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  BasicUsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

  RTSPServer* rtspServer = RTSPServer::createNew(*env);

  // One ServerMediaSession holding two subsessions: video and audio.
  ServerMediaSession* sms = ServerMediaSession::createNew(*env);
  sms->addSubsession(H264VideoFileServerMediaSubsession::createNew(*env, "test.264", false));
  sms->addSubsession(ADTSAudioFileServerMediaSubsession::createNew(*env, "test.aac", false));
  rtspServer->addServerMediaSession(sms);

  env->taskScheduler().doEventLoop(); // the server runs inside this loop; it does not return
}
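Since your input is live rather than a file, the file-based subsessions above only illustrate the session layout; for live H.264 you would subclass OnDemandServerMediaSubsession yourself and plug in your own FramedSource. Below is a minimal sketch for the video side; LiveH264Source is a hypothetical name for your device source (a FramedSource subclass that delivers one H.264 NAL unit per doGetNextFrame() call):
class LiveH264Subsession : public OnDemandServerMediaSubsession
{
public:
  static LiveH264Subsession* createNew(UsageEnvironment& env)
  {
    return new LiveH264Subsession(env);
  }

protected:
  LiveH264Subsession(UsageEnvironment& env)
    : OnDemandServerMediaSubsession(env, True /*reuseFirstSource*/) {}

  virtual FramedSource* createNewStreamSource(unsigned /*clientSessionId*/,
                                              unsigned& estBitrate)
  {
    estBitrate = 500; // kbps, a rough estimate used by the RTCP layer
    // The discrete framer is used because the source delivers whole NAL
    // units rather than a raw byte stream with start codes.
    return H264VideoStreamDiscreteFramer::createNew(
        envir(), LiveH264Source::createNew(envir()));
  }

  virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock,
                                    unsigned char rtpPayloadTypeIfDynamic,
                                    FramedSource* /*inputSource*/)
  {
    return H264VideoRTPSink::createNew(envir(), rtpGroupsock,
                                       rtpPayloadTypeIfDynamic);
  }
};
You would then add LiveH264Subsession::createNew(*env) to the ServerMediaSession instead of the file subsession, and handle audio the same way with a second subsession whose sink is an MPEG4GenericRTPSink for AAC.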
I have to get a processed video from a GStreamer pipe, compress it with H.264 or H.265, and then write it to storage. There are some constraints in this project that must be handled:
The saved video must be playable by any standard video player such as VLC, Windows Media Player, KMPlayer, and so on.
If for any reason the destination file does not close properly (such as during a power outage), the entire file should not be lost, and the saved video should be playable up to the point where the problem occurred.
My solution to this project, under these constraints, is an OpenCV writer with a GStreamer pipeline, as follows:
...
std::string gstPipe("appsrc ! videoconvert ! omxh264enc ! "
                    "splitmuxsink muxer=matroskamux "
                    "max-size-time=50000000000 location="
                    "/file/path/save%d.mkv");
cv::Size frameSize(frameWidth, frameHeight);
bool result = videoWriter.open(gstPipe, cv::CAP_GSTREAMER, 0, fps, frameSize);
This solution splits the video stream into multiple files, but I need to save the whole video in one file.
Does anyone have a better solution to offer?
Thank you very much in advance for your help.
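Not a complete answer, but if the hard requirement is a single file, one option is to drop splitmuxsink and write one continuous Matroska file: an MKV that is cut off abruptly (for example by a power outage) is generally still playable up to the failure point, because the container is written incrementally. A sketch under that assumption, keeping the same encoder element as in the question:
// One continuous MKV instead of split segments. h264parse is inserted so
// matroskamux receives properly framed H.264. The file path and the
// availability of omxh264enc are assumptions carried over from above.
std::string gstPipe("appsrc ! videoconvert ! omxh264enc ! h264parse ! "
                    "matroskamux ! filesink location=/file/path/save.mkv");
cv::Size frameSize(frameWidth, frameHeight);
bool result = videoWriter.open(gstPipe, cv::CAP_GSTREAMER, 0, fps, frameSize);
The trade-off is that the seek index Matroska normally writes at finalization is lost on a crash, so seeking in the recovered file may be slow or unavailable, but playback from the start should still work in players such as VLC.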
I am writing because I could not find the answer in previous topics. I am using live555 to stream live video (H.264) and audio (G.723), which are being recorded by a web camera. The video part is already done and works perfectly, but I have no clue about the audio task.
From what I have read, I have to create a ServerMediaSession to which I should add two subsessions: one for the video and one for the audio. For the video part I created a subclass of OnDemandServerMediaSubsession, a subclass of FramedSource, and the Encoder class, but for the audio aspect I do not know which classes I should base the implementation on.
The web camera records and delivers audio frames in G.723 format separately from the video. I would say the audio is raw, as when I try to play it in VLC it says that it could not find any start code; so I suppose what the web cam records is the raw audio stream.
I was wondering if someone could give me a hint.
For an audio stream, your override of OnDemandServerMediaSubsession::createNewRTPSink should create a SimpleRTPSink.
Something like:
RTPSink* YourAudioMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock,
                                                    unsigned char rtpPayloadTypeIfDynamic,
                                                    FramedSource* inputSource)
{
  return SimpleRTPSink::createNew(envir(), rtpGroupsock,
                                  4,          // static RTP payload type for G.723 (RFC 3551)
                                  frequency,  // RTP timestamp frequency; 8000 Hz for G.723
                                  "audio",    // SDP media type
                                  "G723",     // RTP payload format name
                                  channels);  // number of channels; G.723 is mono
}
The frequency and the number of channels should come from the inputSource.
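The matching createNewStreamSource override can usually return your device source directly, since SimpleRTPSink packetizes complete frames as they arrive. A minimal sketch, where G723DeviceSource stands in for your own FramedSource subclass that reads complete G.723 frames from the camera:
FramedSource* YourAudioMediaSubsession::createNewStreamSource(unsigned /*clientSessionId*/,
                                                              unsigned& estBitrate)
{
  estBitrate = 6; // kbps; G.723.1 runs at 5.3 or 6.3 kbit/s
  // G723DeviceSource is a hypothetical FramedSource subclass delivering one
  // complete G.723 frame per doGetNextFrame() call.
  return G723DeviceSource::createNew(envir());
}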
I am trying to find a method in the live555 library that takes in a video file and turns it into an RTSP stream. Could someone please tell me about this method?
#include "BasicUsageEnvironment.hh"

int main()
{
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
}
I am writing a client-server system that uses the FFmpeg library to parse an H.264 stream into NAL units on the server side, then uses channel coding to send them over the network to the client side, where my application must be able to play the video.
The question is how to play the received AVPackets (NAL units) in my application as a video stream.
I found this tutorial helpful and used it as the base for both the server and the client side.
Some sample code or resources related to playing video not from a file, but from data inside the program using the FFmpeg library, would be very helpful.
I am sure that the received information is sufficient to play the video, because I tried saving the received data as a .h264 or .mp4 file and it can be played by VLC.
From what I understand of your question, you have the AVPackets and want to play a video. In reality this is two problems: (1) decoding your packets, and (2) playing the video.
For decoding your packets with FFmpeg, you should take a look at the documentation for AVPacket, AVCodecContext and avcodec_decode_video2 to get some ideas; the general idea is that you want to do something (just wrote this in the browser, take it with a grain of salt) along the lines of:
// The decoder context; set this up appropriately for your video. See the
// links above for the documentation.
AVCodecContext *decoder_context;
std::vector<AVPacket> packets; // assume this holds your packets
...
AVFrame *decoded_frame = av_frame_alloc();
int ret = -1;
int got_frame = 0;
for (AVPacket& packet : packets) // by reference, so av_free_packet frees the real packet
{
    avcodec_get_frame_defaults(decoded_frame);
    ret = avcodec_decode_video2(decoder_context, decoded_frame, &got_frame, &packet);
    if (ret < 0) {
        // had an error decoding the current packet
        break;
    }
    if (got_frame)
    {
        // send to whatever video player queue you're using/do whatever with the frame
        ...
    }
    got_frame = 0;
    av_free_packet(&packet);
}
It's a pretty rough sketch, but that's the general idea for your problem of decoding the AVPackets. As for your problem of playing the video, you have many options, which will likely depend more on your clients. What you're asking about is a pretty large problem; I'd advise familiarizing yourself with the FFmpeg documentation and the examples provided on the FFmpeg site. Hope that makes sense.
I'm developing an application to send video over RTP to a client that can play only H.263 (1996) and H.263+ (1998).
To do this I've encoded the video using libav, following these steps (this is only part of the code):
av_register_all();
avformat_network_init();
Fmt = av_guess_format("rtp", NULL, NULL);
...
st = add_video_stream(FmtCtx, CODEC_ID_H263);
...
avio_open(&FmtCtx->pb, rtp_url, URL_WRONLY);
This finally enters a loop where I encode the video. The problem is that the stream generated by this program is encoded as H.263-2000 (or H.263++), which the other side cannot understand; even though I use CODEC_ID_H263 or CODEC_ID_H263P in the initialization, the same thing happens.
Is it possible to encode in those old H.263 versions using libav? I haven't managed to do it, not even with ffmpeg commands. The stream is always H.263-2000 (PT=96).
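Not a definitive fix, but one thing worth checking: FFmpeg's RTP muxer defaults to RFC 4629 packetization, which is what gets advertised as H.263-1998/2000 with a dynamic payload type (hence PT=96). Newer builds expose an rtpflags option on the RTP muxer that switches to RFC 2190 packetization with the static payload type 34, which receivers limited to H.263 (1996) usually expect. A sketch, assuming your libav version has this option:
#include <libavutil/opt.h>
...
// Assumption: the installed libav/FFmpeg build supports the "rtpflags"
// private option of the RTP muxer (rtpenc). Set it before writing the
// stream header.
av_opt_set(FmtCtx->priv_data, "rtpflags", "rfc2190", 0);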