How to take a snapshot with libVLC without showing the media player - C++

I am trying to take a snapshot from VLC using libVLC, but whenever I run the following code a window opens showing the video stream, and I don't want the media player window to appear while doing this. The video input comes from an IP camera over an RTSP link. Is there a way to achieve this while keeping the media player window hidden?
Here is the code I have so far.
m = libvlc_media_new_location(inst, "IP/camera/rtsp/link");
mp = libvlc_media_player_new_from_media(m);
libvlc_media_player_play(mp);
while (1) {
    Sleep(500);
    const char* image_path = "E:\\frames\\image.jpg";
    int result = libvlc_video_take_snapshot(mp, 0, image_path, 0, 0);
}
libvlc_media_player_stop(mp);
libvlc_media_player_release(mp);
libvlc_release(inst);

Thanks for your question. Add
const char* const vlc_args[] = {
    "--intf", "dummy",
    "--vout", "dummy",
};
when creating the libVLC instance and pass these arguments to libvlc_new().
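For reference, a minimal sketch of how the instance might be created with those arguments; the exact option spellings can vary between VLC versions, so treat this as an assumption to verify against your libVLC build:
const char* const vlc_args[] = {
    "--intf", "dummy",   // no interface window
    "--vout", "dummy",   // no video output window
};
libvlc_instance_t* inst = libvlc_new(sizeof(vlc_args) / sizeof(vlc_args[0]), vlc_args);
if (inst == NULL) {
    // libVLC failed to initialize; handle the error here
}
m = libvlc_media_new_location(inst, "IP/camera/rtsp/link");
mp = libvlc_media_player_new_from_media(m);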

Related

Capture a frame (image) from a video playing in a QT GUI

I have written a simple video player GUI in Qt. The GUI lets the user browse local files and select a video to play, and it has 'play', 'pause' and 'stop' actions that apply to the selected video.
I want to add another button, 'Capture', that captures the current frame of the video being played and displays this captured image next to the video (the video should be paused at that point).
I looked into the Qt documentation, specifically this and this, but I still cannot work out how to implement this in my case. Any guidance would be appreciated.
My code so far is as follows:
#include "qtwidgetsapplication4.h"
#include <iostream>
QtWidgetsApplication4::QtWidgetsApplication4(QWidget *parent)
: QMainWindow(parent)
{
ui.setupUi(this);
player = new QMediaPlayer(this);
vw = new QVideoWidget(this);
player->setVideoOutput(vw);
this->setCentralWidget(vw);
}
void QtWidgetsApplication4::on_actionOpen_triggered() {
QString filename = QFileDialog::getOpenFileName(this, "Open a File", "", "Video File (*.*)");
on_actionStop_triggered();
player->setSource(QUrl::fromLocalFile(filename));
on_actionPlay_triggered();
qDebug("Error Message in actionOpen");
qDebug()<<player->mediaStatus();
}
void QtWidgetsApplication4::on_actionPlay_triggered() {
player->play();
ui.statusBar->showMessage("Playing");
qDebug("Error Message in actionPlay");
qDebug() << player->mediaStatus();
}
void QtWidgetsApplication4::on_actionPause_triggered() {
player->pause();
ui.statusBar->showMessage("Paused...");
qDebug("Error Message in actionPause");
qDebug() << player->mediaStatus();
}
void QtWidgetsApplication4::on_actionStop_triggered() {
player->stop();
ui.statusBar->showMessage("Stopped");
qDebug("Error Message in actionStop");
qDebug() << player->mediaStatus();
}
You can use QVideoProbe to capture a QVideoFrame. Instantiate it and connect the videoFrameProbed signal to your slot before pausing the video, then convert the QVideoFrame to a QImage in that slot using the frame data:
QImage::Format imageFormat = QVideoFrame::imageFormatFromPixelFormat(frame.pixelFormat());
QImage image(frame.bits(), frame.width(), frame.height(), imageFormat);
Take a look at the player example for reference; it uses QVideoProbe to calculate a histogram of the current frame.
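A minimal sketch of such a slot, assuming the Qt 5 QVideoProbe API (the slot name, the capturedImage member and the captureLabel widget are hypothetical); note the frame has to be mapped before its pixel data can be read:
// Hypothetical slot connected to QVideoProbe::videoFrameProbed (Qt 5 API).
void QtWidgetsApplication4::processFrame(const QVideoFrame &frame)
{
    QVideoFrame copy(frame);                         // shallow copy we can map
    if (!copy.map(QAbstractVideoBuffer::ReadOnly))
        return;

    QImage::Format fmt = QVideoFrame::imageFormatFromPixelFormat(copy.pixelFormat());
    QImage image(copy.bits(), copy.width(), copy.height(), copy.bytesPerLine(), fmt);
    capturedImage = image.copy();                    // deep copy before unmapping
    copy.unmap();

    // e.g. show the capture next to the video and pause playback
    ui.captureLabel->setPixmap(QPixmap::fromImage(capturedImage));
    player->pause();
}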

QtGstreamer camerabin2 usage

I'm working on an Olimex A13 board with just eglfs, i.e. no windowing system. Because of this, Qt Multimedia video and camera support doesn't work, as Qt uses GStreamer which in turn needs X. So I'm using the QtGStreamer library, which is here.
I've followed the examples and created a media player that works as expected. Now I want to implement a camera using camerabin2, which comes from the bad plugins set.
This is my code:
//init QGst
QGst::init(&argc, &argv);

//create video surface
QGst::Quick::VideoSurface* videoSurface = new QGst::Quick::VideoSurface(&engine);
CameraPlayer player;
player.setVideoSink(videoSurface->videoSink());

//cameraplayer.cpp
void CameraPlayer::open()
{
    if (!m_pipeline) {
        m_pipeline = QGst::ElementFactory::make("camerabin").dynamicCast<QGst::Pipeline>();
        if (m_pipeline) {
            m_pipeline->setProperty("video-sink", m_videoSink);
            //watch the bus for messages
            QGst::BusPtr bus = m_pipeline->bus();
            bus->addSignalWatch();
            QGlib::connect(bus, "message", this, &CameraPlayer::onBusMessage);
            //QGlib::connect(bus, "image-done", this, &CameraPlayer::onImageDone);
        } else {
            qCritical() << "Failed to create the pipeline";
        }
    }
}

//-----------------------------------
void CameraPlayer::setVideoSink(const QGst::ElementPtr & sink)
{
    m_videoSink = sink;
}

//-----------------------------------
void CameraPlayer::start()
{
    m_pipeline->setState(QGst::StateReady);
    m_pipeline->setState(QGst::StatePlaying);
}
I then call cameraPlayer.start(), which isn't working, i.e. no video. Am I missing something here? Has anyone used QtGStreamer to stream from a webcam? Thanks in advance.
I realised some plugins (multifilesink) were missing. I started my Qt application with the --gst-debug-level=4 argument and GStreamer then reported the missing plugins.
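As a quick programmatic check, one could also verify that the required element factories are present before building the pipeline. This is only a sketch, assuming QtGStreamer's QGst::ElementFactory::find(); the element names listed are examples rather than a definitive list:
#include <QGst/ElementFactory>
#include <QtDebug>

// Returns false (and logs a warning) if any required GStreamer element is missing.
static bool requiredElementsAvailable()
{
    const char* needed[] = { "camerabin", "multifilesink" };
    bool ok = true;
    for (const char* name : needed) {
        if (QGst::ElementFactory::find(name).isNull()) {
            qWarning() << "Missing GStreamer element:" << name;
            ok = false;
        }
    }
    return ok;
}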

opencv mp3 header missing

Hi, I'm new to the OpenCV library. I just installed it today and was trying to do some basic things like showing a picture in a window and showing a video in a window.
I got both of these to work, but when I try to show the video it plays without sound and I get the following
[mp3 @ 0x107808800] Header missing
in the console. How do I add the mp3 header so it plays with sound?
Here is my code:
#include <opencv2/opencv.hpp>
#include <cstdio>

using namespace cv;

static const char* windowName = "video";

int main(int argc, const char * argv[])
{
    if (argc < 2) {
        printf("not enough arguments");
        return -1;
    }
    //create a window
    namedWindow(windowName, WINDOW_AUTOSIZE);
    Mat frame;
    //capture video from file
    VideoCapture capture;
    capture.open(argv[1]);
    int run = 1;
    while (1) {
        //make play and pause feature
        if (run != 0) {
            capture >> frame;
            if (frame.empty())   // end of video
                break;
            imshow(windowName, frame);
        }
        char c = waitKey(33);
        if (c == 'p') {
            run = 1;
        }
        if (c == 's') {
            run = 0;
        }
        if (c == 27 || c == 'q') {
            break;
        }
    }
    return 0;
}
You can't.
OpenCV discards the audio stream (that's what the message is telling you), and there's no way to retrieve it through OpenCV.

cannot convert 'cv::VideoCapture' to 'CvCapture*'

I have a simple program which takes a video and plays it (and does some image processing on the video). The video can come from a dialog box result, or directly from a file path. When I use cv::VideoCapture capture1, I can access properties like capture1.isOpened(), capture1.get(CV_CAP_PROP_FRAME_COUNT), etc., but when I use CvCapture* capture2 I get weird errors.
I want to use cv::VideoCapture capture1 because my functions are written for capture1. Or is there a way to use both types, with some kind of conversion between them such as type casting?
Actually I had two programs: the functions of program1 were written for cv::VideoCapture and the functions of program2 were written for CvCapture*; the two programs read the video file in different ways.
I then merged the two programs to use some functions from program1 and some from program2, but I can't convert cv::VideoCapture to CvCapture*.
I am using OpenCV with Qt Creator.
My code is too long to post here, so I have simplified it to make it smaller and easier to understand. It may not compile as shown because of those simplifications.
Any help would be appreciated. Thanks in advance :)
void MainWindow::on_pushButton_clicked()
{
    std::string fileName = QFileDialog::getOpenFileName(this, tr("Open Video"), ".", tr("Video Files (*.mp4 *.avi)")).toStdString();
    cv::VideoCapture capture1(fileName); // when I use the cv::VideoCapture capture it gives an error
    //error: cannot convert 'cv::VideoCapture' to 'CvCapture*' for argument '1' to 'IplImage* cvQueryFrame(CvCapture*)'
    //CvCapture* capture2 = cvCaptureFromCAM(-1);
    // but when I use the CvCapture* capture2, it does not recognize capture2.isOpened() and capture2.get(CV_CAP_PROP_FRAME_COUNT) etc. don't work.
    // Is there any way to convert VideoCapture to CvCapture*?
    if (!capture.isOpened())
    {
        QMessageBox msgBox;
        msgBox.exec(); // some messagebox message, not important actually
    }
    cvNamedWindow( name );
    IplImage* Ximage = cvQueryFrame(capture);
    if (!Ximage)
    {
        QMessageBox msgBox;
        msgBox.exec();
    }
    double rate = capture.get(CV_CAP_PROP_FPS);
    int frames = (int)capture.get(CV_CAP_PROP_FRAME_COUNT);
    int frameno = (int)capture.get(CV_CAP_PROP_POS_FRAMES);
    bool stop(false);
    capture.read(imgsize);
    cv::Mat out(imgsize.rows, imgsize.cols, CV_8SC1);
    cv::Mat out2(imgsize.rows, imgsize.cols, CV_8SC1);
    //I print the frame numbers and the total frames on a label.
    ui->label_3->setText(QString::number(frameno/1000) + " / " + QString::number(frames/1000));
    ui->label->setScaledContents(true);
    ui->label->setPixmap(QPixmap::fromImage(img1)); // here I show the frames on a label.
}
cv::VideoCapture is from the C++ interface of OpenCV, and can be used to capture from a camera device and from a file on disk:
cv::VideoCapture capture1(fileName);
if (!capture1.isOpened())
{
// failed, print error message
}
and cvCaptureFromCAM() is a function from the C interface of OpenCV that is used only to capture from a camera device:
CvCapture* capture2 = cvCaptureFromCAM(-1);
if (!capture2)
{
// failed, print error message
}
Don't mix/merge these interfaces; pick one and stick with it.
If you want to use the C interface to capture from a video file, use cvCaptureFromFile() instead:
CvCapture* capture = cvCaptureFromFile(fileName);
if (!capture)
{
// print error, quit application
}
Check these examples:
Camera capture using the C interface
Camera capture using the C++ interface
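For illustration, here is a minimal, self-contained sketch of reading a video file with the C++ interface only; the file name is a placeholder and the CV_CAP_PROP_* constants assume an OpenCV 2.x-era build:
#include <opencv2/opencv.hpp>

int main()
{
    cv::VideoCapture capture("video.mp4");    // placeholder file name
    if (!capture.isOpened())
        return -1;                            // failed to open the file

    double fps    = capture.get(CV_CAP_PROP_FPS);
    int    frames = (int)capture.get(CV_CAP_PROP_FRAME_COUNT);
    int    delay  = fps > 0 ? (int)(1000.0 / fps) : 33;   // fall back to ~30 fps

    cv::Mat frame;
    while (capture.read(frame))               // read() returns false at end of file
    {
        cv::imshow("video", frame);
        if (cv::waitKey(delay) == 27)         // Esc to quit
            break;
    }
    return 0;
}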

AVI created with AVIStreamWrite has incorrect length and playback speed

I'm trying to write to an AVI file using AVIStreamWrite, but the resulting AVI file is a bit messed up. The images in the AVI contain the proper picture and colors, but the duration and speed of the video are off. I recorded a video that should have been around 7 seconds, yet the file properties in Windows Explorer show a duration of about 2 seconds. When I play it in Media Player it is too short and seems to play very rapidly (motion in the video looks like fast forward). I also can't seem to seek within the video using Media Player.
Here is what I'm doing...
//initialization
HRESULT AVIWriter::Init()
{
    HRESULT hr = S_OK;
    _hAVIFile = NULL;
    _videoStream = NULL;
    _frameCount = 0;
    AVIFileInit();
    ::DeleteFileW(_filename);
    hr = AVIFileOpen(&_hAVIFile, _filename, OF_WRITE | OF_CREATE, NULL);
    if (hr != AVIERR_OK)
    {
        ::cout << "AVI ERROR";
        return 0;
    }
    /**************************************/
    // Create a raw video stream in the file
    ::ZeroMemory(&_streamInfo, sizeof(_streamInfo));
    _streamInfo.fccType = streamtypeVIDEO; // stream type
    _streamInfo.fccHandler = 0;            // No compressor
    _streamInfo.dwScale = 1;
    _streamInfo.dwRate = _aviFps;          // this is 30
    _streamInfo.dwSuggestedBufferSize = 0;
    _streamInfo.dwSampleSize = 0;
    SetRect(&_streamInfo.rcFrame, 0, 0, _bmi.biWidth, _bmi.biHeight);
    hr = AVIFileCreateStream(_hAVIFile,      // file pointer
                             &_videoStream,  // returned stream pointer
                             &_streamInfo);  // stream header
    hr = AVIStreamSetFormat(_videoStream, 0,
                            &_bmi,
                            sizeof(_bmi));
    return hr;
}

//call this when I receive a frame from my camera
HRESULT AVIWriter::AddFrameToAVI(BYTE* buffer)
{
    HRESULT hr;
    long size = _bmi.biHeight * _bmi.biWidth * 3;
    hr = AVIStreamWrite(_videoStream,   // stream pointer
                        _frameCount++,  // time of this frame
                        1,              // number to write
                        buffer,         // pointer to data
                        size,           // size of this frame
                        AVIIF_KEYFRAME, // flags....
                        NULL,
                        NULL);
    return hr;
}

//call this when I am done
void AVIWriter::CloseAVI()
{
    AVIStreamClose(_videoStream);
    AVIFileClose(_hAVIFile);
    AVIFileExit();
}
Now as a test I used DirectShow's GraphEdit to create a graph consisting of a video capture filter for this same camera and an AVI mux, and created an AVI file. The resulting AVI file was fine; the frame rate was 30 fps, the same as I am using. I queried both AVI files (my 'bad' one and the 'good' one created with GraphEdit) with a call to AVIStreamInfo, and the stream info was pretty much the same for both. I would have expected either the samples per second or the number of frames to be way off for my 'bad' AVI, but it wasn't. Am I doing something wrong that would cause my AVI to have the incorrect length and seem to play back at an increased speed? I'm new to using VFW, so any help is appreciated. Thanks.
Frame time in the file will eventually be _frameCount / _aviFps, so either you are dropping frames and they never reach AVIStreamWrite, or, if you intentionally skip a few frames, you need to increment _frameCount accordingly so that the skipped frames are jumped over in the file's timeline.
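One way to make the file duration track wall-clock time is to derive the sample index from each frame's capture timestamp instead of a simple counter. The sketch below assumes a capture timestamp is available to the caller; the extra captureTimeMs parameter and the _startTimeMs member are hypothetical additions, not part of the original code:
// Hypothetical variant: dropped camera frames leave gaps in the timeline,
// so the AVI duration stays correct even when frames are lost.
HRESULT AVIWriter::AddFrameToAVI(BYTE* buffer, DWORD captureTimeMs)
{
    if (_frameCount == 0)
        _startTimeMs = captureTimeMs;   // remember the first frame's time

    // Index of this frame on a fixed _aviFps (30 fps) timeline.
    LONG sampleIndex = (LONG)(((captureTimeMs - _startTimeMs) * _aviFps) / 1000);

    long size = _bmi.biHeight * _bmi.biWidth * 3;
    HRESULT hr = AVIStreamWrite(_videoStream,   // stream pointer
                                sampleIndex,    // time of this frame
                                1,              // number to write
                                buffer,         // pointer to data
                                size,           // size of this frame
                                AVIIF_KEYFRAME,
                                NULL,
                                NULL);
    ++_frameCount;
    return hr;
}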