I have created a very simple UI using Qt which consists of a simple button and a label. When the button's clicked() signal is emitted, a function which captures a frame from a webcam using OpenCV is called. The code I am currently using to achieve this is:
cv::Mat MainWindow::captureFrame(int width, int height)
{
    // set the width and height of the frame to be captured
    webcam.set(CV_CAP_PROP_FRAME_WIDTH, width);
    webcam.set(CV_CAP_PROP_FRAME_HEIGHT, height);
    // determine whether the webcam video stream was successfully initialized
    if (!webcam.isOpened())
    {
        qDebug() << "Camera initialization failed.";
    }
    // attempt to grab a frame from the webcam
    if (!webcam.grab()) {
        qDebug() << "Failed to capture frame.";
    }
    // attempt to read the grabbed frame and store it in frame
    if (!webcam.read(frame)) {
        qDebug() << "Failed to read data from captured frame.";
    }
    return frame;
}
After a frame has been captured, it must be converted into a QImage in order to be displayed in the label. In order to achieve this, I use the following method:
QImage MainWindow::getQImageFromFrame(cv::Mat frame) {
    // convert the image from BGR (OpenCV's channel order) to RGB for QImage
    cv::cvtColor(frame, frame, CV_BGR2RGB);
    return QImage((uchar*) frame.data, frame.cols, frame.rows, frame.step, QImage::Format_RGB888);
}
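The direction matters here: OpenCV stores pixels as BGR while Format_RGB888 expects RGB (the byte swap itself is symmetric, which is why either conversion code produces the same result). As a stand-alone illustration of what cvtColor does per pixel, here is a plain C++ sketch with no OpenCV dependency (swapRedBlue is a hypothetical name, not an OpenCV function):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Swap channels 0 and 2 of every 3-byte pixel in an interleaved buffer,
// i.e. turn BGR data into RGB (or back -- the operation is its own inverse).
void swapRedBlue(std::vector<unsigned char>& pixels) {
    for (std::size_t i = 0; i + 2 < pixels.size(); i += 3) {
        unsigned char tmp = pixels[i];
        pixels[i] = pixels[i + 2];
        pixels[i + 2] = tmp;
    }
}
```

Applying it twice returns the original buffer, matching the fact that cv::cvtColor with CV_BGR2RGB and CV_RGB2BGR are interchangeable for 3-channel data.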
The constructor for my MainWindow class looks like this:
MainWindow::MainWindow(QWidget *parent) :
    QMainWindow(parent),
    ui(new Ui::MainWindow)
{
    ui->setupUi(this);
    resize(1280, 720);
    move(QPoint(200, 200));
    webcam.open(0);
    fps = 1000/25; // timer interval in milliseconds (25 frames per second)
    qTimer = new QTimer(this);
    qTimer->setInterval(fps);
    connect(qTimer, SIGNAL(timeout()), this, SLOT(displayFrame()));
    qTimer->start();
}
The QTimer is supposed to display a frame by calling displayFrame()
void MainWindow::displayFrame() {
    // capture a frame from the webcam
    frame = captureFrame(640, 360);
    image = getQImageFromFrame(frame);
    // set the label's image to the captured frame and resize the label appropriately
    ui->label->setPixmap(QPixmap::fromImage(image));
    ui->label->resize(ui->label->pixmap()->size());
}
each time its timeout() signal is emitted.
However, while this appears to work to a degree, what actually happens is that the video capture stream from my webcam (a Logitech Quickcam Pro 9000) repeatedly opens and closes. This is evidenced by the blue ring, which indicates that the webcam is on, repeatedly flashing on and off. This makes the refresh rate of the webcam video label very low, which is not desirable. Is there some way to keep the webcam stream open and prevent this "flickering" from occurring?
I seem to have solved the problem of the webcam stream opening and closing by removing the lines:
webcam.set(CV_CAP_PROP_FRAME_WIDTH, width);
webcam.set(CV_CAP_PROP_FRAME_HEIGHT, height);
from the captureFrame() function and setting the width and height of the frame to be captured in the MainWindow constructor.
Related
I use a QLabel to show frames captured by a camera.
// subThread: capture frames from the camera
...
m_videoCapture = new cv::VideoCapture(pipeline, cv::CAP_GSTREAMER);
while (m_videoCapture->isOpened())
{
    m_videoCapture->read(m_frame);
    if (!m_frame.empty())
    {
        // callback method
        m_funcFrame(m_frame, pUser);
    }
}
// MainThread
// callback method -- show image on QLabel
void showImage(cv::Mat& frame)
{
    m_image = QImage(frame.data, frame.cols, frame.rows, frame.step, QImage::Format_RGB888);
    QPixmap pixmap = QPixmap::fromImage(m_image).copy();
    m_pixmap = pixmap.scaled(width, height, Qt::IgnoreAspectRatio, Qt::SmoothTransformation).copy();
    ui->m_label->setPixmap(m_pixmap);
}
It works fine, but after 5 or 6 hours the QLabel's pixmap stops updating no matter how I move the camera. I have to restart my application to get it back to normal.
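One plausible cause of a stall like this is unbounded build-up between the capture thread and the GUI thread: if frames are handed over faster than the GUI consumes them, queued work accumulates for hours until delivery breaks down. A common mitigation is a single-slot "latest frame only" handoff, where the capture side overwrites rather than enqueues. Below is a minimal stand-alone sketch of that idea in plain C++ (no Qt or OpenCV; FrameMailbox and a byte vector standing in for cv::Mat are illustrative names, not part of either library):

```cpp
#include <mutex>
#include <vector>

// A single-slot "mailbox": the capture thread overwrites the latest frame,
// the GUI thread takes whatever is newest. Nothing queues up unboundedly.
struct FrameMailbox {
    std::mutex m;
    std::vector<unsigned char> latest; // stand-in for cv::Mat pixel data
    bool fresh = false;

    // Called from the capture thread for every decoded frame.
    void publish(std::vector<unsigned char> frame) {
        std::lock_guard<std::mutex> lock(m);
        latest = std::move(frame); // overwrite, never accumulate
        fresh = true;
    }

    // Called from the GUI thread (e.g. from a timer slot).
    // Copies out the frame only if a new one arrived since the last take.
    bool takeLatest(std::vector<unsigned char>& out) {
        std::lock_guard<std::mutex> lock(m);
        if (!fresh) return false;
        out = latest;
        fresh = false;
        return true;
    }
};
```

In a Qt application the takeLatest() side would run on the GUI thread, driven by a QTimer, so all QPixmap/QLabel work stays on the thread that owns the widgets.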
I want to show the process of filling an area line by line. The problem is: even if I create a QPixmap from a QImage every time I draw a new line (with a delay) and add it to the QGraphicsScene in a loop, it doesn't update until the whole thread is finished.
QTest::qWait does the trick, together with using a QGraphicsPixmapItem for the actual display of the process instead of the QPixmap itself. I use a QPainter initialized from the QPixmap and then just reset the pixmap on the pixmap item.
QImage img = QImage(WIDTH, HEIGHT, QImage::Format_RGB32);
img.fill(Qt::white);
QPixmap *pixmapFill = new QPixmap(QPixmap::fromImage(img));
QGraphicsPixmapItem pixmapItem(*pixmapFill);
fillScene->addItem(&pixmapItem);
pixmapItem.setPixmap(*pixmapFill);
QPainter painter(pixmapFill); // paint directly onto the pixmap the item displays
// ...
for (auto point : *points) {
    QTest::qWait(delay);
    painter.drawLine(point.x(), point.y(), xBarrier, point.y());
    pixmapItem.setPixmap(*pixmapFill);
}
I am trying to display a video in a QLabel in Qt Creator. I am reading the video using OpenCV. Here is my code:
mainwindow.cpp
#include "includes.h"
#include "vidreadthread.h"

MainWindow::MainWindow(QWidget *parent) :
    QMainWindow(parent),
    ui(new Ui::MainWindow)
{
    ui->setupUi(this);
    VidReadThread *thread1 = new VidReadThread("Video read thread");
    thread1->start();
}

MainWindow::~MainWindow()
{
    delete ui;
}
vidreadthread.cpp
#include "vidreadthread.h"
#include "includes.h"

using namespace cv;

extern MainWindow *mainPtr;

VidReadThread::VidReadThread(QString s) : name(s)
{
}

void VidReadThread::run()
{
    QThread::msleep(100);
    VideoCapture cap;
    cap.open("helicopter_with_stickers.mp4");
    while (1)
    {
        Mat image1;
        // Capture frame-by-frame
        cap >> image1;
        // If the frame is empty, break immediately
        if (image1.empty())
            break;
        QImage image2 = QImage((uchar*) image1.data, image1.cols, image1.rows, image1.step, QImage::Format_RGB888);
        mainPtr->ui->label1->setPixmap(QPixmap::fromImage(image2));
    }
}
I am able to display the video, but I can't set the frame rate: the whole 60 seconds of video is over in 4-5 seconds. With plain OpenCV I can control the frame rate with cvWaitKey(), but here msleep() doesn't seem to work for a similar purpose. Please suggest a way to do this without frame skipping. I made VidReadThread so that the GUI doesn't hang while the video is being read.
If there is any other way to display an OpenCV window inside my Qt UI, then please recommend that as well.
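Independent of the Qt plumbing, pacing playback comes down to sleeping away whatever part of each frame period the decoding did not use. A minimal stand-alone sketch of that logic in plain C++ (frameDurationMs and paceFrame are hypothetical helper names; inside the thread above the sleep would be QThread::msleep, and the frame rate could come from cap.get(CV_CAP_PROP_FPS)):

```cpp
#include <chrono>
#include <thread>

// Milliseconds each frame should stay on screen for a given frame rate,
// rounded to the nearest integer.
int frameDurationMs(double fps) {
    return static_cast<int>(1000.0 / fps + 0.5);
}

// Sleep for the remainder of the frame period, measured from the moment
// the frame was grabbed, so decoding time does not slow playback further.
void paceFrame(std::chrono::steady_clock::time_point frameStart, int periodMs) {
    auto elapsed = std::chrono::duration_cast<std::chrono::milliseconds>(
        std::chrono::steady_clock::now() - frameStart);
    if (elapsed.count() < periodMs)
        std::this_thread::sleep_for(std::chrono::milliseconds(periodMs) - elapsed);
}
```

In the run() loop you would record frameStart before `cap >> image1;` and call paceFrame after handing the frame to the GUI.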
Try moveToThread; it may work better.
.cpp
for (int i = 0; i < Camera::getCameraCount(); ++i)
    ui->comboBox->addItem(QString::number(i)); // name camera

camera = new Camera();
camera->moveToThread(&thread);
connect(this, SIGNAL(cameraOperate(int)), camera, SLOT(Operate(int)));
connect(camera, SIGNAL(updateImage(QImage)), this, SLOT(updateImage(QImage)));

void app0::updateImage(QImage image)
{
    ui->videoviewer->setPixmap(QPixmap::fromImage(image));
}
camera thread:
void Camera::Operate(int _index)
{
    if (open(_index)) { qDebug() << "Camera open success!"; }
    else { qDebug() << "Camera open failed!"; return; }
    if (capture.get(28) == -1) { cout << "get 28 -1" << "\n"; } // property id 28 not supported
    while (1)
    {
        qApp->processEvents();
        Mat matin = read(); // read mat
        matnow = matin;
        QImage image = Mat2QImage(matin);
        emit updateImage(image);
    }
}
link:https://blog.csdn.net/Sun_tian/article/details/104236327
I'm capturing webcam streams from two Raspberry Pis and I'm trying to carry out some image processing on both streams. I have two QLabels I'm trying to use to display the images from the Pis. However, whilst one stream displays in real time, the other has a 4-5 second delay. The same result occurs if I try to display one stream on both QLabel objects. Is this a threading issue? Can you guys please help out?
VideoCapture capWebcam;
VideoCapture EyeintheSky;
Mat matEyeInTheSky;
QImage qimgEyeInTheSky;
Mat matOriginal;
QImage qimgOriginal;

MainWindow::MainWindow(QWidget *parent) :
    QMainWindow(parent),
    ui(new Ui::MainWindow)
{
    ui->setupUi(this);
    capWebcam.open("http://192.168.0.102:8080/?action=stream?dummy=param.mjpg"); // from MJPG-streamer, image processing
    EyeintheSky.open("http://192.168.0.100:8080/?action=stream?dummy=param.mjpg");
    if (capWebcam.isOpened() == false) {
        return;
    }
    if (EyeintheSky.isOpened() == false) {
        return;
    }
}
void MainWindow::processFrameAndUpdateGUI() {
    capWebcam.read(matOriginal);
    EyeintheSky.read(matEyeInTheSky);
    if (matOriginal.empty() == true) {
        qDebug() << "Empty Picture";
        return;
    }
    else {
        // start of visual processing
        // Output Tri Track images to screen
        // map QImage to QLabel
        cvtColor(matOriginal, matOriginal, COLOR_BGR2RGB);
        QImage qimgOriginal((uchar*)matOriginal.data, matOriginal.cols, matOriginal.rows, matOriginal.step, QImage::Format_RGB888);
        ui->lblInputImage->setPixmap(QPixmap::fromImage(qimgOriginal));
        // Output Eye in the Sky to screen
        // map QImage to QLabel
        cvtColor(matEyeInTheSky, matEyeInTheSky, COLOR_BGR2RGB);
        QImage qimgEyeInTheSky((uchar*)matEyeInTheSky.data, matEyeInTheSky.cols, matEyeInTheSky.rows, matEyeInTheSky.step, QImage::Format_RGB888);
        ui->sky_input->setPixmap(QPixmap::fromImage(qimgEyeInTheSky));
        // Process IK code.
    }
}
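One likely contributor to the lag is that the two blocking read() calls run back-to-back on the same thread, so whichever stream stalls delays the other. Reading each stream on its own thread is one way around that. A minimal stand-alone sketch of the idea in plain C++ (no Qt or OpenCV; captureOne is a hypothetical stand-in for a blocking VideoCapture::read that takes some time to return):

```cpp
#include <chrono>
#include <thread>

// Stand-in for a blocking VideoCapture::read() that takes delayMs to return.
int captureOne(int delayMs, int value) {
    std::this_thread::sleep_for(std::chrono::milliseconds(delayMs));
    return value;
}

// Read two streams concurrently instead of back-to-back, so a slow
// stream no longer holds up the fast one.
void captureBoth(int& a, int& b) {
    std::thread t1([&a] { a = captureOne(50, 1); }); // slow stream
    std::thread t2([&b] { b = captureOne(5, 2); });  // fast stream
    t1.join();
    t2.join();
}
```

In the Qt version, each capture thread would emit its finished QImage back to the GUI thread via a queued signal, as in the moveToThread answer above, so the labels are only ever touched from the GUI thread.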
I have a QWidget (named screenshotLabel) whose content changes continuously. I can get that label's content into a QPixmap (named originalPixmap) as below.
originalPixmap = QPixmap();
QPixmap pixmap(screenshotLabel->size());
this->render(&pixmap);
originalPixmap = pixmap;
Now I want to save it as a video file, but I have not been able to do it. How can I save QWidget content as a video file?
I found a way to generate the video using OpenCV's VideoWriter. I leave comments in the code that describe what is happening.
originalPixmap = pixmap;
qImageSingle = originalPixmap.toImage(); // Convert QPixmap to QImage

// Get QImage data into an OpenCV Mat
frame = Mat(qImageSingle.height(), qImageSingle.width(), CV_8UC3, qImageSingle.bits(), qImageSingle.bytesPerLine()).clone();
namedWindow("MyVideo", CV_WINDOW_AUTOSIZE);
imshow("MyVideo", frame);

vector<int> compression_params;
compression_params.push_back(CV_IMWRITE_PNG_COMPRESSION);
compression_params.push_back(9);

try {
    imwrite("alpha2.png", frame, compression_params);
    VideoWriter video("out2.avi", CV_FOURCC('M','J','P','G'), 10, Size(qImageSingle.width(), qImageSingle.height()), true);
    for (int i = 0; i < 100; i++) {
        video.write(frame); // Write frame to VideoWriter
    }
}
catch (runtime_error& ex) {
    fprintf(stderr, "Exception converting image to PNG format: %s\n", ex.what());
}
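As a side note on the CV_FOURCC('M','J','P','G') argument: to my understanding it simply packs the four codec characters into one 32-bit integer, first character in the lowest byte, which is the tag layout AVI containers use. A stand-alone sketch in plain C++ (myFourcc is a hypothetical name mirroring what the macro does, not an OpenCV function):

```cpp
#include <cstdint>

// Pack a four-character codec tag such as 'M','J','P','G' into the
// 32-bit little-endian layout used by AVI FOURCC codes:
// the first character lands in the lowest byte.
std::uint32_t myFourcc(char c1, char c2, char c3, char c4) {
    return static_cast<std::uint32_t>(c1)
         | (static_cast<std::uint32_t>(c2) << 8)
         | (static_cast<std::uint32_t>(c3) << 16)
         | (static_cast<std::uint32_t>(c4) << 24);
}
```

This is only for reference; in real code you would keep using the macro (or cv::VideoWriter::fourcc in newer OpenCV versions).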