How to skip frames while decoding an H264 stream? - c++

I'm using FFmpeg to decode an H264 (or H265) RTSP stream.
My system has two pieces of software: a Server and a Client.
Server: reads frames from the RTSP stream --> forwards frames to the Client
Client: receives frames from the Server --> decodes --> renders
I have implemented this and it works OK, but there is a case that makes my system behave badly: when the network connection between Server and Client is slow, frames cannot be transferred to the Client in real time.
At present I deal with this issue by skipping some frames (not sending them to the Client) when the queue reaches its count limit. The following is a summary of my code:
//At Server Software (includes 2 threads, A and B)
//Thread A: read AVPacket and forward to Client
while (true)
{
    AVPacket packet;
    av_init_packet(&packet);
    packet.size = 0;
    packet.data = NULL;
    int ret = AVERROR(EAGAIN);
    while (AVERROR(EAGAIN) == ret)
        ret = av_read_frame(pFormatCtx, &packet);
    if (packet.size > 0)
    {
        if (mySendQueue.count < 120) //limit 120 packets in queue
            mySendQueue.Enqueue(packet); //Thread B will read from this queue and send packets to the Client via TCP socket
        else
            ; //SkipThisFrame ***: do not send
    }
}
//Thread B: send to Client via TCP socket
while (true)
{
    AVPacket packet;
    if (mySendQueue.Dequeue(packet))
    {
        SendPacketToClient(packet);
    }
}
//At Client Software: receive AVPacket from Server --> Decode --> Render
while (true)
{
    AVPacket packet;
    AVFrame frame;
    ReadPacketFromServer(packet);
    if (av_decode_asyn(pCodecCtx, &frame, &frameFinished, &packet) == RS_OK)
    {
        if (frameFinished)
        {
            RenderFrame(frame);
        }
    }
}
UINT32 __clrcall av_decode_asyn(AVCodecContext *pCodecCtx, AVFrame *frame, int *frameFinished, AVPacket *packet)
{
    int ret = -1;
    *frameFinished = 0;
    if (packet)
    {
        ret = avcodec_send_packet(pCodecCtx, packet);
        // In particular, we don't expect AVERROR(EAGAIN), because we read all
        // decoded frames with avcodec_receive_frame() until done.
        if (ret < 0 && ret != AVERROR_EOF)
            return RS_NOT_OK;
    }
    ret = avcodec_receive_frame(pCodecCtx, frame);
    if (ret < 0 && ret != AVERROR(EAGAIN))
    {
        return RS_NOT_OK;
    }
    if (ret >= 0)
        *frameFinished = 1;
    return RS_OK;
}
My question focuses on the line of code marked SkipThisFrame ***: this algorithm skips frames continuously, so could it make the decoder on the Client produce unexpected errors or crash?
And when frames are skipped like that, will the Client render frames abnormally?
Can someone show me a proper algorithm for skipping frames in my case?
Thank you very much!

I had a brief read of the documentation of AVPacket, and it says:
For video, it should typically contain one compressed frame.
Theoretically you cannot skip frames of a compressed video stream, as most frames do not contain complete information about that frame's image; they only contain changes relative to some previous frames. So if you skip a frame, it is probable that many of the following decoded frames won't be correct (until the next key frame refreshes the whole image).
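If dropping is unavoidable, a safer approach is to drop packets only up to the next keyframe: once the queue is full, discard everything until a packet with AV_PKT_FLAG_KEY arrives, so the Client never has to decode an inter frame whose reference frames were thrown away. Below is a rough sketch of that idea written by me, not the poster; mySendQueue, its count field and a QUEUE_LIMIT constant are assumptions based on the posted code.
//Thread A, sketch: skip whole GOP tails instead of arbitrary packets
bool dropping = false; //are we currently discarding until the next keyframe?

void onPacketFromDemuxer(AVPacket* packet)
{
    const bool isKeyframe = (packet->flags & AV_PKT_FLAG_KEY) != 0;

    if (dropping)
    {
        if (!isKeyframe)
        {
            av_packet_unref(packet); //still inside the dropped GOP tail
            return;
        }
        dropping = false; //keyframe reached: decoding can restart cleanly
    }

    if (mySendQueue.count >= QUEUE_LIMIT) //e.g. 120, as in the question
    {
        dropping = true; //queue is full: drop from here until the next keyframe
        av_packet_unref(packet);
        return;
    }

    mySendQueue.Enqueue(*packet); //Thread B sends it to the Client as before
}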

"My question is focus in line of code SkipThisFrame ***, this algorithm
skip frame continuously, so it maybe make the decoder on Client occur
unexpectedly error or Crash?"
One thing I notice is wrong...
Your While(true) statements also need a break; to stop, otherwise they will run forever, blocking other functions and causing the system to crash. Think about it, you say "While the loop is true do X-Y-Z instructions" but you never say when to stop (eg: break out of this While loop to do next instructions). Computer is stuck doing first While loop only and also repeating that to infinity...
Try setting up like this:
//At Server Software (includes 2 threads, A and B)
//Thread A: read AVPacket and forward to Client
while (true)
{
    AVPacket packet;
    av_init_packet(&packet);
    packet.size = 0;
    packet.data = NULL;
    int ret = AVERROR(EAGAIN);
    while (AVERROR(EAGAIN) == ret) { ret = av_read_frame(pFormatCtx, &packet); }
    if (packet.size > 0)
    {
        if (mySendQueue.count < 120) //limit 120 packets in queue
        {
            mySendQueue.Enqueue(packet); //Thread B will read from this queue and send packets to the Client via TCP socket
        }
        //else { } //no need for ELSE if doing nothing... //SkipThisFrame ***: no send
    }
    break; //stop this part and move on to "Thread B"
}

//Thread B: send to Client via TCP socket
while (true)
{
    AVPacket packet;
    if (mySendQueue.Dequeue(packet))
    { SendPacketToClient(packet); break; }
}

//At Client Software: receive AVPacket from Server --> Decode --> Render
while (true)
{
    AVPacket packet; AVFrame frame;
    ReadPacketFromServer(packet);
    if (av_decode_asyn(pCodecCtx, &frame, &frameFinished, &packet) == RS_OK)
    {
        if (frameFinished) { RenderFrame(frame); break; }
    }
}
UINT32 __clrcall av_decode_asyn(AVCodecContext *pCodecCtx, AVFrame *frame, int *frameFinished, AVPacket *packet)
{
    int ret = -1;
    *frameFinished = 0;
    if (packet)
    {
        ret = avcodec_send_packet(pCodecCtx, packet);
        // In particular, we don't expect AVERROR(EAGAIN), because we read all
        // decoded frames with avcodec_receive_frame() until done.
        if (ret < 0 && ret != AVERROR_EOF)
            return RS_NOT_OK;
    }
    ret = avcodec_receive_frame(pCodecCtx, frame);
    if (ret < 0 && ret != AVERROR(EAGAIN))
    {
        return RS_NOT_OK;
    }
    if (ret >= 0)
        *frameFinished = 1;
    return RS_OK;
}
Hope it helps. Let me know of results / errors.

Related

FFmpeg AVERROR(EAGAIN) error when calling avcodec receive for h264

I'm working with ffmpeg 4.1 and I'm showing live streams from multiple cameras, h264 and h265.
My program collects the packets of the same frame and then calls the decodeVideo function. In effect, it sends all packets of a frame at once.
The program works well if there are no missing packets. When I remove a packet from a random I-frame, both h264 and h265 streams behave as expected (they jump a few seconds but keep streaming).
When I remove a packet from a random P-frame in an h265 stream, avcodec_send_packet returns AVERROR_INVALIDDATA and the stream continues.
However, when I remove a packet from a random P-frame in an h264 stream, avcodec_send_packet returns 0. Then avcodec_receive_frame returns AVERROR(EAGAIN) continuously and the stream freezes.
void decodeVideo(array<uint8_t>^ data, int length, AvFrame^ finishedFrame)
{
    AVPacket* videoPacket = new AVPacket();
    av_init_packet(videoPacket);
    pin_ptr<unsigned char> dataPtr = &data[0];
    videoPacket->data = dataPtr;
    videoPacket->size = length;
    int retVal = avcodec_send_packet((AVCodecContext*)context, videoPacket);
    if (retVal < 0)
    {
        if (retVal == AVERROR_EOF)
            Utility::Log->ErrorFormat("avcodec_send_packet() return value is AVERROR_EOF.");
        else if (retVal == AVERROR_INVALIDDATA)
            Utility::Log->ErrorFormat("avcodec_send_packet() INVALID DATA!");
        else
            Utility::Log->ErrorFormat("avcodec_send_packet() return value is negative:{0}", retVal);
    }
    else
    {
        int receive_frame = avcodec_receive_frame((AVCodecContext*)context, (AVFrame*)finishedFrame);
        if (receive_frame == AVERROR(EAGAIN))
            Utility::Log->ErrorFormat("avcodec_receive_frame() returns AVERROR(EAGAIN)");
        else if (receive_frame == AVERROR_EOF)
            Utility::Log->ErrorFormat("avcodec_receive_frame() returns AVERROR(AVERROR_EOF)");
        else
            Utility::Log->ErrorFormat("avcodec_receive_frame() return value is negative:{0}", receive_frame);
    }
    av_packet_unref(videoPacket);
    delete videoPacket;
}
EDIT
When I add avcodec_flush_buffers as shown below, my problem is temporarily solved. However, it freezes again after a while.
if (receive_frame == AVERROR(EAGAIN))
{
    Utility::Log->ErrorFormat("avcodec_receive_frame() returns AVERROR(EAGAIN)");
    avcodec_flush_buffers((AVCodecContext*)context);
}
Tested with ffmpeg version 4.1.1: same results.
With an ffmpeg version like 2.5 the decode function is different, but there is no problem there when I remove packets. However, I'm working with h265 streams too.
EDIT2
AVCodecID id = AVCodecID::AV_CODEC_ID_H264;
AVCodec* dec = avcodec_find_decoder(id);
AVCodecContext* decContext = avcodec_alloc_context3(dec);
After these lines, my code included the following lines. When I delete them, the problem is gone.
if (dec->capabilities & AV_CODEC_CAP_TRUNCATED)
    decContext->flags |= AV_CODEC_FLAG_TRUNCATED;
decContext->flags2 |= AV_CODEC_FLAG2_CHUNKS;
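For reference, a minimal sketch of the decoder setup without those two flags (my own illustration; the rest of the poster's context creation is assumed unchanged). Without AV_CODEC_FLAG_TRUNCATED / AV_CODEC_FLAG2_CHUNKS, every AVPacket handed to avcodec_send_packet() is expected to already contain one complete frame's worth of NAL units.
AVCodecContext* createH264Decoder()
{
    const AVCodec* dec = avcodec_find_decoder(AV_CODEC_ID_H264);
    if (!dec)
        return nullptr;
    AVCodecContext* decContext = avcodec_alloc_context3(dec);
    if (!decContext)
        return nullptr;
    //No AV_CODEC_FLAG_TRUNCATED / AV_CODEC_FLAG2_CHUNKS set here:
    //the caller is responsible for feeding complete access units.
    if (avcodec_open2(decContext, dec, nullptr) < 0)
    {
        avcodec_free_context(&decContext);
        return nullptr;
    }
    return decContext;
}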

How to decode MJPEG with FFmpeg

I am trying to decode an MJPEG stream with libav. The stream comes from a V4L2 driver. When working with high resolutions the following code works fine. However, when using low resolutions (for instance 320x190) it produces strange artefacts.
void V4L2::compressedCapturingThread(){
    const AVCodec *codec=avcodec_find_decoder(m_currVidMode.codec.toAVCodecID());
    if(!codec)
        return; //Something went horribly wrong
    //Allocate the context
    AVCodecContext* codecCtx=avcodec_alloc_context3(codec);
    if(!codecCtx){
        return; //Error
    }
    codecCtx->width=m_currVidMode.res.width;
    codecCtx->height=m_currVidMode.res.height;
    //Open the context
    if (avcodec_open2(codecCtx, codec, nullptr) < 0) {
        avcodec_free_context(&codecCtx);
        return;
    }
    //Allocate space for the packet
    AVPacket *pkt=av_packet_alloc();
    if(!pkt){
        avcodec_free_context(&codecCtx);
        return;
    }
    AVFrame* decodedFrame=av_frame_alloc();
    if(!decodedFrame){
        avcodec_free_context(&codecCtx);
        avcodec_free_context(&codecCtx);
        return;
    }
    Graphics::Uploader uplo;
    v4l2_buffer buf;
    int ret;
    Utils::ImageBuffer decodedImgBuf(
        Utils::ImageAttributes(
            m_currVidMode.res,
            m_currVidMode.pixFmt
        ), (u_int8_t*)nullptr
    );
    //Main loop
    while(!m_threadExit){
        reqBuffer(&buf);
        if(!buf.bytesused){
            continue;
        }
        //Create the packet with the given data
        u_int8_t* bufData=(u_int8_t*)av_malloc(buf.bytesused);
        memcpy(bufData, m_buffers[buf.index].buffer, buf.bytesused); //copy the data
        av_packet_from_data(pkt, bufData, buf.bytesused);
        freeBuffer(&buf); //V4L2 buffer no longer needed
        //Try to decode the packet
        ret=avcodec_send_packet(codecCtx, pkt);
        av_packet_unref(pkt);
        if(ret<0){
            continue;
        }
        ret = avcodec_receive_frame(codecCtx, decodedFrame);
        if(ret<0){
            continue;
        }
        memcpy(decodedImgBuf.data, decodedFrame->data, sizeof(decodedImgBuf.data)); //Copy plane pointers
        if(decodedImgBuf.att.pixFmt == Utils::PixelFormats::NONE){
            decodedImgBuf.att.pixFmt=Utils::PixelFormat(codecCtx->pix_fmt);
            //Change deprecated formats
            if(decodedImgBuf.att.pixFmt == Utils::PixelFormats::YUVJ420P)
                decodedImgBuf.att.pixFmt = Utils::PixelFormats::YUV420P;
            else if(decodedImgBuf.att.pixFmt == Utils::PixelFormats::YUVJ422P)
                decodedImgBuf.att.pixFmt = Utils::PixelFormats::YUV422P;
            else if(decodedImgBuf.att.pixFmt == Utils::PixelFormats::YUVJ440P)
                decodedImgBuf.att.pixFmt = Utils::PixelFormats::YUV440P;
            else if(decodedImgBuf.att.pixFmt == Utils::PixelFormats::YUVJ444P)
                decodedImgBuf.att.pixFmt = Utils::PixelFormats::YUV444P;
        }
        std::unique_ptr<const Graphics::Frame> frame;
        {
            Graphics::UniqueContext ctx(Graphics::Context::getAvalibleCtx());
            frame=uplo.getFrame(decodedImgBuf);
        }
        Stream::AsyncSource<Graphics::Frame>::push(std::move(frame));
    }
    //Free everything
    avcodec_free_context(&codecCtx);
    av_packet_free(&pkt);
    av_frame_free(&decodedFrame);
}
If I try to write the AVFrame's contents to disk just after avcodec_receive_frame() I get the mentioned "strange results" (I can see this by uploading the dump to rawpixels.net), so the problem is not after this line. If I save the pkt's data to disk as JPEG just before freeBuffer() the image can be seen properly. I'll attach some pictures:
V4L2 configured at 1280x720
V4L2 configured at 320x190
The complete code can be found at:
https://github.com/oierlauzi/zuazo
https://github.com/oierlauzi/zuazo/blob/master/src/Zuazo/Sources/V4L2.cpp
Edit 1:
I forgot to mention: codecCtx->pix_fmt has the value AV_PIX_FMT_YUVJ422P (given by libav).
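For anyone reproducing the diagnostic step described above (dumping the decoded frame to disk right after avcodec_receive_frame()), here is a small sketch of my own that writes an 8-bit planar YUV frame while honouring linesize. Note that decodedFrame->linesize[] is normally padded beyond the visible width, which is a common source of skewed raw dumps at small resolutions such as 320x190 if rows are assumed to be exactly width bytes.
#include <cstdio>
extern "C" {
#include <libavutil/frame.h>
#include <libavutil/pixdesc.h>
}

//Writes the visible part of an 8-bit planar YUV AVFrame to a raw file,
//row by row, so linesize padding never ends up in the dump.
static void dumpPlanarYuv(const AVFrame* frame, const char* path)
{
    const AVPixFmtDescriptor* desc = av_pix_fmt_desc_get((AVPixelFormat)frame->format);
    if (!desc)
        return;
    FILE* f = std::fopen(path, "wb");
    if (!f)
        return;
    for (int plane = 0; plane < 3; ++plane)
    {
        int w = frame->width;
        int h = frame->height;
        if (plane > 0) //chroma planes may be subsampled
        {
            w = AV_CEIL_RSHIFT(w, desc->log2_chroma_w);
            h = AV_CEIL_RSHIFT(h, desc->log2_chroma_h);
        }
        for (int y = 0; y < h; ++y)
            std::fwrite(frame->data[plane] + y * frame->linesize[plane], 1, w, f);
    }
    std::fclose(f);
}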

C++ ffmpeg video missing frames and won't play in Quicktime

I wrote some C++ code that uses ffmpeg to encode a video. I'm having two strange issues:
The final video is always missing one frame. That is, if I have it encode 10 frames, the final video only has 9 (at least that's what ffprobe -show_frames -pretty $VIDEO | grep -F '[FRAME]' | wc -l tells me).
The final video plays fine in some players (mpv and vlc) but not in Quicktime. Quicktime just shows a completely black screen.
My code is roughly this (modified a bit to remove types that are unique to our code base):
First, I open the video file, write the headers and initialize things:
template <class PtrT>
using UniquePtrWithDeleteFunction = std::unique_ptr<PtrT, std::function<void (PtrT*)>>;

std::unique_ptr<FfmpegEncodingFrameSink> FfmpegEncodingFrameSink::Create(
    const std::string& dest_url) {
    AVFormatContext* tmp_format_ctxt;
    auto alloc_format_res = avformat_alloc_output_context2(&tmp_format_ctxt, nullptr, "mp4", dest_url.c_str());
    if (alloc_format_res < 0) {
        throw FfmpegException("Error opening output file.");
    }
    auto format_ctxt = UniquePtrWithDeleteFunction<AVFormatContext>(
        tmp_format_ctxt, CloseAvFormatContext);

    AVStream* out_stream_video = avformat_new_stream(format_ctxt.get(), nullptr);
    if (out_stream_video == nullptr) {
        throw FfmpegException("Could not create outputstream");
    }

    auto codec_context = GetCodecContext(options);
    out_stream_video->time_base = codec_context->time_base;
    auto ret = avcodec_parameters_from_context(out_stream_video->codecpar, codec_context.get());
    if (ret < 0) {
        throw FfmpegException("Failed to copy encoder parameters to outputstream");
    }

    if (!(format_ctxt->oformat->flags & AVFMT_NOFILE)) {
        ret = avio_open(&format_ctxt->pb, dest_url.c_str(), AVIO_FLAG_WRITE);
        if (ret < 0) {
            throw VideoDecodeException("Could not open output file: " + dest_url);
        }
    }

    ret = avformat_init_output(format_ctxt.get(), nullptr);
    if (ret < 0) {
        throw FfmpegException("Unable to initialize the codec.");
    }

    ret = avformat_write_header(format_ctxt.get(), nullptr);
    if (ret < 0) {
        throw FfmpegException("Error occurred writing format header");
    }

    return std::unique_ptr<FfmpegEncodingFrameSink>(
        new FfmpegEncodingFrameSink(std::move(format_ctxt), std::move(codec_context)));
}
Then, every time I get a new frame to encode I pass it to this function (the frames are being decoded via ffmpeg from another mp4 file which Quicktime plays just fine):
// If frame == nullptr then we're done and we're just flushing the encoder
// otherwise encode an actual frame
void FfmpegEncodingFrameSink::EncodeAndWriteFrame(
    const AVFrame* frame) {
    auto ret = avcodec_send_frame(codec_ctxt_.get(), frame);
    if (ret < 0) {
        throw FfmpegException("Error encoding the frame.");
    }

    AVPacket enc_packet;
    enc_packet.data = nullptr;
    enc_packet.size = 0;
    av_init_packet(&enc_packet);

    do {
        ret = avcodec_receive_packet(codec_ctxt_.get(), &enc_packet);
        if (ret == AVERROR(EAGAIN)) {
            CHECK(frame != nullptr);
            break;
        } else if (ret == AVERROR_EOF) {
            CHECK(frame == nullptr);
            break;
        } else if (ret < 0) {
            throw FfmpegException("Error putting the encoded frame into the packet.");
        }

        assert(ret == 0);
        enc_packet.stream_index = 0;
        LOG(INFO) << "Writing packet to stream.";
        av_interleaved_write_frame(format_ctxt_.get(), &enc_packet);
        av_packet_unref(&enc_packet);
    } while (ret == 0);
}
Finally, in my destructor I close everything up like so:
FfmpegEncodingFrameSink::~FfmpegEncodingFrameSink() {
    // Pass a nullptr to EncodeAndWriteFrame so it flushes the encoder
    EncodeAndWriteFrame(nullptr);
    // write mp4 trailer
    av_write_trailer(format_ctxt_.get());
}
If I run this, passing n frames to EncodeAndWriteFrame, the line LOG(INFO) << "Writing packet to stream."; gets run n times, indicating that n packets were written to the stream. But ffprobe always shows only n - 1 frames in the video. And the final video doesn't play in Quicktime.
What am I doing wrong??
Sorry for the delay, but as I just had the same problem and noticed that this question deserves an answer, here is how I solved it.
Up front, the problem only occurred for me when using mov, mp4 or 3gp as the format. It worked frame-accurately when using, e.g., the avi format. When I wrote uncompressed video frames to the container, I saw that the avi and the mov had the same count of frames stored, but the mov obviously had some problem in its header.
Counting the number of frames in the mov using the header metadata showed that one frame is missing:
ffprobe -v error -count_frames -select_streams v:0 -show_entries stream=nb_read_frames -of default=nokey=1:noprint_wrappers=1 c:\temp\myinput.mov
While ignoring the index showed the correct number of frames:
-ignore_editlist 1
The solution for me was to set the timebase on the AVStream->CodecContext of the video stream.
The code above attempts to do this in this line:
out_stream_video->time_base = codec_context->time_base;
But the problem is that the posted code does not show the function GetCodecContext, so we do not know whether the time_base is correctly set on codec_context. It is my belief that the author's problem was that his GetCodecContext did not set the time_base correctly.
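For illustration only (the question never shows GetCodecContext, so everything below, including the Options struct and its fields, is an assumption of mine): the fix amounts to giving the encoder context an explicit time_base before out_stream_video->time_base = codec_context->time_base copies it onto the output stream.
#include <memory>
extern "C" {
#include <libavcodec/avcodec.h>
}

struct Options { int width; int height; int fps; }; //placeholder for the question's options type

std::shared_ptr<AVCodecContext> GetCodecContext(const Options& options)
{
    const AVCodec* codec = avcodec_find_encoder(AV_CODEC_ID_H264);
    AVCodecContext* ctx = avcodec_alloc_context3(codec);
    if (!ctx)
        return nullptr;
    ctx->width = options.width;
    ctx->height = options.height;
    ctx->pix_fmt = AV_PIX_FMT_YUV420P;
    //The important part for the missing-frame / QuickTime issue discussed above:
    //an explicit, sensible time_base (e.g. 1/30 for 30 fps input).
    ctx->time_base = AVRational{1, options.fps};
    ctx->framerate = AVRational{options.fps, 1};
    if (avcodec_open2(ctx, codec, nullptr) < 0) {
        avcodec_free_context(&ctx);
        return nullptr;
    }
    return std::shared_ptr<AVCodecContext>(ctx,
        [](AVCodecContext* c) { avcodec_free_context(&c); });
}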

FFMPEG buffer underflow

I'm recording a video with FFmpeg and getting some weird messages in the process:
[mpeg # 01011c80] packet too large, ignoring buffer limits to mux it
[mpeg # 01011c80] buffer underflow st=0 bufi=236198 size=412405
[mpeg # 01011c80] buffer underflow st=0 bufi=238239 size=412405
and I have no idea how to deal with them. Here's my code for adding frames:
void ofxFFMPEGVideoWriter::addFrame(const uint8_t* pixels)
{
    memcpy(picture_rgb24->data[0], pixels, size);
    sws_scale(swsContext, picture_rgb24->data, picture_rgb24->linesize, 0, codecContext->height, picture->data, picture->linesize);

    AVPacket packet = { 0 };
    int got_packet;
    av_init_packet(&packet);

    int ret = avcodec_encode_video2(codecContext, &packet, picture, &got_packet);
    if (ret < 0) qDebug() << "Error encoding video frame: " << ret;

    if (!ret && got_packet && packet.size)
    {
        packet.stream_index = videoStream->index;
        ret = av_interleaved_write_frame(formatContext, &packet);
    }

    picture->pts += av_rescale_q(1, videoStream->codec->time_base, videoStream->time_base);
}
The file itself seems to be fine and readable, but those messages are really bugging me. Does anybody know how to fix this?

IOCP and overwritten buffer

Well, I made an IOCP for handling client connections, with the following details:
- Threads = (CPU cores * 2)
- Assigning a completion port to each socket
- Accessing the socket context by Client Index or by overlapped struct (either way is the same)
So I am trying to debug the incoming packets, and it works like a charm, except for a little but nasty detail... I set a breakpoint in the WorkerThread function (where I recv the packet) and I am watching the buffer with the packet I received, when suddenly the buffer gets overwritten with a new packet that I got from the client.
Why is that? According to what I read, IOCP should wait until I process the packet and send a response to the client before receiving any other packet. So I set a flag on my socket context called "Processing", and I still got the buffer overwritten by an incoming packet. So it doesn't let me debug at all and it's driving me crazy.
Is it OllyDbg's (the debugger's) fault for letting the other threads run while I sit at a breakpoint? Or is it some error in my IOCP implementation?
Here is how my WorkerThread is coded:
DWORD WINAPI WorkerThread(void* argument)
{
    int BytesTransfer;
    int BytesRecv;
    int ClientID;
    int result;
    OVERLAPPED* overlapped = 0;
    ClientInfo* clientinfo = 0;
    WSABUF wsabuf;
    int flags;

    //Exit only when shutdown signal is recv
    while (WaitForSingleObject(IOCPBase::internaldata->sockcontext.ShutDownSignal, NULL) != WAIT_OBJECT_0)
    {
        flags = 0; BytesTransfer = 0; BytesRecv = 0; ClientID = 0;

        //Get from queued list
        if (GetQueuedCompletionStatus(IOCPBase::internaldata->sockcontext.CompletionPort, (LPDWORD)&BytesTransfer, (PULONG_PTR)&ClientID, &overlapped, INFINITE) == TRUE)
        {
            if (overlapped == 0)
            {
                //Fatal error
                break;
            }
            clientinfo = (ClientInfo*)overlapped;
            if (BytesTransfer != 0)
            {
                //Assign the buffer pointer and buffer len to WSABUF local
                clientinfo->RecvContext.RecvBytes = BytesTransfer;
                wsabuf.buf = (char*)clientinfo->RecvContext.Buffer;
                wsabuf.len = clientinfo->RecvContext.Len;

                //Switch for OperationCode
                //switch (IOCPBase::internaldata->ClientContext[ClientID].OperationCode)
                switch (clientinfo->OperationCode)
                {
                case FD_READ:
                    // Check if we have send all data to the client from a previous send
                    if (clientinfo->SendContext.SendBytes < clientinfo->SendContext.TotalBytes)
                    {
                        clientinfo->OperationCode = FD_READ; //We set FD_READ caused on the next send, there could still be bytes left to send
                        wsabuf.buf += clientinfo->SendContext.SendBytes; //The buffer position is + sended bytes
                        wsabuf.len = clientinfo->SendContext.TotalBytes - clientinfo->SendContext.SendBytes; //the buffer len is total - sended bytes

                        //Send the remain bytes
                        result = WSASend(clientinfo->sock, &wsabuf, 1, (LPDWORD)&BytesRecv, flags, &clientinfo->overlapped, NULL);
                        if (result == SOCKET_ERROR && (WSAGetLastError() != WSA_IO_PENDING))
                        {
                            CloseClient(ClientID);
                        }
                        clientinfo->SendContext.SendBytes += BytesRecv;
                    }
                    else
                    {
                        if (clientinfo->Processing == 0)
                        {
                            clientinfo->OperationCode = FD_WRITE; //If no more bytes left to send now we can set the operation code to write (in fact is read)
                            memset(clientinfo->RecvContext.Buffer, NULL, MAX_DATA_BUFFER_SIZE); //Clean the buffer for recv new data

                            //Recv data from our client
                            clientinfo->RecvContext.RecvBytes = WSARecv(clientinfo->sock, &wsabuf, 1, (LPDWORD)&BytesRecv, (LPDWORD)&flags, &clientinfo->overlapped, NULL);
                            if (clientinfo->RecvContext.RecvBytes == SOCKET_ERROR && WSAGetLastError() != WSA_IO_PENDING)
                            {
                                CloseClient(ClientID);
                                break;
                            }
                        }
                    }
                    break;
                case FD_WRITE:
                    //Send data to the RecvProtocol
                    clientinfo->Processing = 1;
                    IOCPBase::internaldata->callback.RecvProtocol(clientinfo->RecvContext.Buffer, clientinfo->RecvContext.Len, ClientID);
                    clientinfo->Processing = 0;
                default:
                    break;
                }
            }
        }
    }
    return false;
}
The problem appears when looking at clientinfo->RecvContext.Buffer. I am watching the packet, then a few seconds pass and boom, the buffer is overwritten with a new packet.
Thanks!
Never mind, I fixed the debugging problem by copying the packet to the stack frame of the function I use to analyze it; this way I have no overwrite problem.
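For completeness, a sketch of what that copy can look like (the helper and its name are mine; ClientInfo, RecvContext, MAX_DATA_BUFFER_SIZE and RecvProtocol are taken from the question):
void handleReceivedPacket(ClientInfo* clientinfo, int clientId)
{
    //Take a private copy of the received bytes before handing them to the protocol
    //handler, so a later overlapped WSARecv() into RecvContext.Buffer cannot
    //overwrite the data while it is still being inspected in the debugger.
    char localCopy[MAX_DATA_BUFFER_SIZE];
    int localLen = clientinfo->RecvContext.RecvBytes;
    if (localLen > MAX_DATA_BUFFER_SIZE)
        localLen = MAX_DATA_BUFFER_SIZE;
    memcpy(localCopy, clientinfo->RecvContext.Buffer, localLen);

    clientinfo->Processing = 1;
    IOCPBase::internaldata->callback.RecvProtocol(localCopy, localLen, clientId);
    clientinfo->Processing = 0;
}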