Displaying multiple OpenCV VideoCapture objects in Qt - C++

I'm capturing webcam streams from two Raspberry Pis and trying to do some image processing on both streams. I have two QLabels that I use to display the images from the Pis. However, while one stream displays in real time, the other has a 4-5 second delay. The same thing happens if I try to display one stream on both QLabel objects. Is this a threading issue? Can you guys please help out?
VideoCapture capWebcam;
VideoCapture EyeintheSky;
Mat matEyeInTheSky;
QImage qimgEyeInTheSky;
Mat matOriginal;
QImage qimgOriginal;
MainWindow::MainWindow(QWidget *parent) :
    QMainWindow(parent),
    ui(new Ui::MainWindow)
{
    ui->setupUi(this);
    capWebcam.open("http://192.168.0.102:8080/?action=stream?dummy=param.mjpg"); // from MJPG-streamer image processing
    EyeintheSky.open("http://192.168.0.100:8080/?action=stream?dummy=param.mjpg");
    if (capWebcam.isOpened() == false) {
        return;
    }
    if (EyeintheSky.isOpened() == false) {
        return;
    }
}
void MainWindow::processFrameAndUpdateGUI() {
    capWebcam.read(matOriginal);
    EyeintheSky.read(matEyeInTheSky);
    if (matOriginal.empty() == true) {
        qDebug() << "Empty Picture";
        return;
    }
    else {
        // start of visual processing
        // Output Tri Track images to screen
        // map QImage to QLabel
        cvtColor(matOriginal, matOriginal, COLOR_BGR2RGB);
        QImage qimgOriginal((uchar*)matOriginal.data, matOriginal.cols, matOriginal.rows, matOriginal.step, QImage::Format_RGB888);
        ui->lblInputImage->setPixmap(QPixmap::fromImage(qimgOriginal));
        // Output Eye in the Sky to screen
        // map QImage to QLabel
        cvtColor(matEyeInTheSky, matEyeInTheSky, COLOR_BGR2RGB);
        QImage qimgEyeInTheSky((uchar*)matEyeInTheSky.data, matEyeInTheSky.cols, matEyeInTheSky.rows, matEyeInTheSky.step, QImage::Format_RGB888);
        ui->sky_input->setPixmap(QPixmap::fromImage(qimgEyeInTheSky));
        // Process IK code.
    }
}
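A likely culprit is that both captures are read back-to-back in a single GUI-thread slot, so one slow network stream stalls the other and its frames queue up. Below is a minimal sketch of one way to decouple the two streams, reading each VideoCapture in its own worker thread; the StreamWorker class, the frameReady signal and the connection code are illustrative, not from the original program:
// Hypothetical worker: one instance per stream, each moved to its own QThread.
#include <opencv2/opencv.hpp>
#include <QImage>
#include <QObject>
#include <QThread>

class StreamWorker : public QObject {
    Q_OBJECT
public:
    explicit StreamWorker(const QString &url) : url(url) {}
public slots:
    void run() {
        cv::VideoCapture cap(url.toStdString());
        if (!cap.isOpened())
            return;
        cv::Mat frame;
        while (cap.read(frame)) {                       // blocks in this thread, not the GUI
            cv::cvtColor(frame, frame, cv::COLOR_BGR2RGB);
            // copy() detaches the QImage from the Mat buffer before it is reused
            emit frameReady(QImage(frame.data, frame.cols, frame.rows,
                                   static_cast<int>(frame.step),
                                   QImage::Format_RGB888).copy());
        }
    }
signals:
    void frameReady(const QImage &img);
private:
    QString url;
};

// In MainWindow: one worker + thread per stream, labels updated via queued signals.
// auto *worker = new StreamWorker("http://192.168.0.102:8080/?action=stream?dummy=param.mjpg");
// auto *thread = new QThread(this);
// worker->moveToThread(thread);
// connect(thread, &QThread::started, worker, &StreamWorker::run);
// connect(worker, &StreamWorker::frameReady, this, [this](const QImage &img) {
//     ui->lblInputImage->setPixmap(QPixmap::fromImage(img));
// });
// thread->start();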

Save a captured image using QCameraImageCapture::capture() in PNG Format

This is literally my first question on a forum.
I'm a Qt newbie and I'm stuck on this little detail.
I'm creating an application that takes pictures and saves them, but the issue is that it saves them in JPEG format and I need them in PNG, GIF or TIFF. I've tried a lot of stuff but nothing worked, so here's my code:
MainWindow::MainWindow(QWidget *parent)
    : QMainWindow(parent)
    , ui(new Ui::MainWindow)
{
    ui->setupUi(this);
    _camera_view = new QCameraViewfinder();
    _take_image_button = new QPushButton("Take Image");
    _turn_camera_off = new QPushButton("Turn Off");
    _turn_camera_on = new QPushButton("Turn On");
    _central_widget = new QWidget();
    setCentralWidget(_central_widget);
    _setup_ui();
    _setup_camera_devices();
    set_camera(QCameraInfo::defaultCamera());
    connect(_take_image_button, &QPushButton::clicked, [this]{ _image_capture.data()->capture(); });
    connect(_turn_camera_off, &QPushButton::clicked, [this]{ _camera.data()->stop(); });
    connect(_turn_camera_on, &QPushButton::clicked, [this]{ _camera.data()->start(); });
}
At some point in your code you should get a QImage. It has a save() member function. Example from the cited documentation:
QImage image;
QByteArray ba;
QBuffer buffer(&ba);
buffer.open(QIODevice::WriteOnly);
image.save(&buffer, "PNG"); // writes image into ba in PNG format
So in the context of QCameraImageCapture, the usage goes something like this:
QObject::connect(cap, &QCameraImageCapture::imageCaptured, [=] (int id, QImage img) {
    QByteArray ba;
    QBuffer buffer(&ba);
    buffer.open(QIODevice::WriteOnly);
    img.save(&buffer, "PNG");
});
For anyone who might encounter this problem in the future, here's the solution I've found:
_image_capture->setCaptureDestination(QCameraImageCapture::CaptureToBuffer);
QObject::connect(_image_capture.data(), &QCameraImageCapture::imageCaptured, [=] (int id, QImage img) {
    fileName = "image.png";
    path = QStandardPaths::writableLocation(QStandardPaths::PicturesLocation) + "/" + fileName;
    img.save(path, "PNG");
});
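As a small follow-up (not part of the original answer), the id argument of imageCaptured can be used to give every capture its own file name, and QImage::save() returns false on failure, which is worth checking:
QObject::connect(_image_capture.data(), &QCameraImageCapture::imageCaptured, [=] (int id, QImage img) {
    // e.g. ".../Pictures/image_0.png", ".../Pictures/image_1.png", ...
    const QString path = QStandardPaths::writableLocation(QStandardPaths::PicturesLocation)
                         + QString("/image_%1.png").arg(id);
    if (!img.save(path, "PNG"))
        qDebug() << "Failed to save" << path;
});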

Qt / OpenCV - Resizing an Image doesn't work with specific sizes

I made an image editor in Qt / OpenCV where you can load an image from the file explorer and grayscale / adaptive-threshold / resize it afterwards.
Bug 1: When I resize the loaded image to (for example) 600x600 pixels using my ImageProcessor::Resize(int, int) method, it works fine. But when I change it to something like 546x750 pixels, the image comes out with a weird grayscale look.
Bug 2: When I want to resize my grayscaled/thresholded image, it always gets a weird grayscale look similar to Bug 1.
Code:
mainwindow.cpp
#include "mainwindow.h"
#include "ui_mainwindow.h"
#include "resizer.h"
MainWindow::MainWindow(QWidget *parent)
    : QMainWindow(parent)
    , ui(new Ui::MainWindow)
{
    ui->setupUi(this);
}
MainWindow::~MainWindow()
{
    delete ui;
}
void MainWindow::Display(cv::Mat inputImage)
{
    QImage image = QImage(inputImage.data, inputImage.cols, inputImage.rows, QImage::Format_RGB888);
    scene->addPixmap(QPixmap::fromImage(image));
    ui->graphicsView->setScene(scene);
    ui->graphicsView->show();
}
void MainWindow::on_actionOpen_triggered()
{
    QString file = QFileDialog::getOpenFileName(this, "Open", "", "Images (*.jpg *.png)");
    std::string filename = file.toStdString();
    inputImage = cv::imread(filename);
    Display(inputImage);
    imgProc = new ImageProcessor(inputImage);
}
void MainWindow::on_pushButton_clicked() // Grayscale
{
    scene->clear();
    imgProc->mode = 1;
    inputImage = imgProc->Grayscale();
    QImage image = QImage(inputImage.data, inputImage.cols, inputImage.rows, QImage::Format_Grayscale8);
    scene->addPixmap(QPixmap::fromImage(image));
    ui->graphicsView->setScene(scene);
    ui->graphicsView->show();
}
void MainWindow::on_pushButton_2_clicked() // ADT
{
    scene->clear();
    imgProc->mode = 2;
    inputImage = imgProc->AdaptiveThreshold();
    QImage image = QImage(inputImage.data, inputImage.cols, inputImage.rows, QImage::Format_Grayscale8);
    scene->addPixmap(QPixmap::fromImage(image));
    ui->graphicsView->setScene(scene);
    ui->graphicsView->show();
}
void MainWindow::on_pushButton_3_clicked() // Resize
{
    scene->clear();
    Resizer resizer;
    resizer.exec();
    int newWidth = resizer.GetWidth();
    int newHeight = resizer.GetHeight();
    inputImage = imgProc->Resize(newWidth, newHeight);
    if (imgProc->mode == 1 || imgProc->mode == 2)
    {
        QImage image = QImage(inputImage.data, inputImage.cols, inputImage.rows, QImage::Format_Grayscale8);
        scene->addPixmap(QPixmap::fromImage(image));
        ui->graphicsView->setScene(scene);
        ui->graphicsView->show();
    }
    else
    {
        QImage image = QImage(inputImage.data, inputImage.cols, inputImage.rows, QImage::Format_RGB888);
        scene->addPixmap(QPixmap::fromImage(image));
        ui->graphicsView->setScene(scene);
        ui->graphicsView->show();
    }
}
imageprocessor.cpp
#include "imageprocessor.h"
ImageProcessor::ImageProcessor(cv::Mat inputImage)
{
    this->inputImage = inputImage;
}
cv::Mat ImageProcessor::Resize(int width, int height)
{
    cv::Mat resized;
    cv::resize(inputImage, resized, cv::Size(width, height), cv::INTER_LINEAR);
    return resized;
}
cv::Mat ImageProcessor::Grayscale()
{
    cv::Mat grayscaled;
    cv::cvtColor(inputImage, grayscaled, cv::COLOR_RGB2GRAY);
    return grayscaled;
}
cv::Mat ImageProcessor::AdaptiveThreshold()
{
    cv::Mat binarized, grayscaled;
    cv::cvtColor(inputImage, grayscaled, cv::COLOR_RGB2GRAY);
    cv::adaptiveThreshold(grayscaled, binarized, 255, cv::ADAPTIVE_THRESH_GAUSSIAN_C, cv::THRESH_BINARY, 15, 11);
    return binarized;
}
QImage::Format_RGB888 is the format type you specified, and it means that:
The image is stored using a 24-bit RGB format (8-8-8).
If your image has 3 channels then your approach is correct, except you should add this:
QImage image = QImage(inputImage.data, inputImage.cols, inputImage.rows, QImage::Format_RGB888).rgbSwapped();
You need to append rgbSwapped() at the end because Qt expects RGB order while OpenCV gives you BGR.
If you want to send a grayscale image to the GUI, then you need to use the QImage::Format_Grayscale8 format type, which means:
The image is stored using an 8-bit grayscale format.
Here is the documentation for the formats.
Note: How do you resize your image, with an OpenCV function? Share resizer.h and I will update the answer accordingly.
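Putting the two formats together, here is a sketch of what a combined Display() helper could look like. Passing the Mat's step as the bytesPerLine argument is an extra precaution of mine, not something from the question: QImage assumes 32-bit aligned scanlines, and a width like 546 (546 * 3 bytes is not a multiple of 4) can otherwise produce exactly the skewed, washed-out look described in Bug 1.
void MainWindow::Display(cv::Mat inputImage)
{
    QImage image;
    if (inputImage.type() == CV_8UC1) {
        // single-channel result from Grayscale() or AdaptiveThreshold()
        image = QImage(inputImage.data, inputImage.cols, inputImage.rows,
                       static_cast<int>(inputImage.step), QImage::Format_Grayscale8);
    } else {
        // 3-channel BGR from imread(): swap to RGB for Qt
        image = QImage(inputImage.data, inputImage.cols, inputImage.rows,
                       static_cast<int>(inputImage.step), QImage::Format_RGB888).rgbSwapped();
    }
    scene->clear();
    scene->addPixmap(QPixmap::fromImage(image.copy())); // copy() detaches from the Mat buffer
    ui->graphicsView->setScene(scene);
    ui->graphicsView->show();
}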

Displaying video in QLabel

I am trying to display a video in a QLabel in Qt Creator. I am reading the video using OpenCV. Here is my code:
mainwindow.cpp
#include "includes.h"
#include "vidreadthread.h"
MainWindow::MainWindow(QWidget *parent) :
    QMainWindow(parent),
    ui(new Ui::MainWindow)
{
    ui->setupUi(this);
    VidReadThread *thread1 = new VidReadThread("Video read thread");
    thread1->start();
}
MainWindow::~MainWindow()
{
    delete ui;
}
vidreadthread.cpp
#include "vidreadthread.h"
#include "includes.h"
using namespace cv;
extern MainWindow *mainPtr;
VidReadThread::VidReadThread(QString s) : name(s)
{
}
void VidReadThread::run()
{
    QThread::msleep(100);
    VideoCapture cap;
    cap.open("helicopter_with_stickers.mp4");
    while (1)
    {
        Mat image1;
        // Capture frame-by-frame
        cap >> image1;
        // If the frame is empty, break immediately
        if (image1.empty())
            break;
        QImage image2 = QImage((uchar*) image1.data, image1.cols, image1.rows, image1.step, QImage::Format_RGB888);
        mainPtr->ui->label1->setPixmap(QPixmap::fromImage(image2));
    }
}
I am able to display the video but I can't set the frame rate. The whole 60 seconds of video are over in 4-5 frames. With plain OpenCV I can control the frame rate with cvWaitKey(), but here msleep() doesn't seem to work for a similar purpose. Please suggest a way to do this without frame skipping. I made VidReadThread so that the GUI doesn't hang while the video is being read.
If there is any other way to display an OpenCV window inside my Qt UI, then please recommend that as well.
Try moveToThread, it may work better.
.cpp
for (int i = 0; i < Camera::getCameraCount();)
    ui->comboBox->addItem(QString::number(i++)); // name camera
camera = new Camera();
camera->moveToThread(&thread);
connect(this, SIGNAL(cameraOperate(int)), camera, SLOT(Operate(int)));
connect(camera, SIGNAL(updateImage(QImage)), this, SLOT(updateImage(QImage)));
void app0::updateImage(QImage image)
{
    ui->videoviewer->setPixmap(QPixmap::fromImage(image));
}
camera thread:
void Camera::Operate(int _index)
{
    if (open(_index)) { qDebug() << "Camera open success!"; }
    else { qDebug() << "Camera open failed!"; return; }
    if (capture.get(28) == -1) { cout << "get 28 -1" << "\n"; } // property 28 could not be read
    while (1)
    {
        qApp->processEvents();
        Mat matin = read(); // read a Mat from the camera
        matnow = matin;
        QImage image = Mat2QImage(matin);
        emit updateImage(image);
    }
}
Link: https://blog.csdn.net/Sun_tian/article/details/104236327
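For the original file-playback case the same worker-plus-signal pattern applies, and the loop can be paced to the file's frame rate. A rough sketch; the frameReady signal, the delay calculation and the deep copy() are my additions, not from the question or the linked post:
void VidReadThread::run()
{
    cv::VideoCapture cap("helicopter_with_stickers.mp4");
    if (!cap.isOpened())
        return;
    double fps = cap.get(cv::CAP_PROP_FPS);
    int delayMs = (fps > 0) ? static_cast<int>(1000.0 / fps) : 33; // fall back to ~30 fps
    cv::Mat frame;
    while (cap.read(frame)) {
        cv::cvtColor(frame, frame, cv::COLOR_BGR2RGB);
        // copy() so the emitted QImage owns its pixels; update the QLabel in a slot
        // connected to this (hypothetical) signal, never directly from this thread
        emit frameReady(QImage(frame.data, frame.cols, frame.rows,
                               static_cast<int>(frame.step), QImage::Format_RGB888).copy());
        QThread::msleep(delayMs);
    }
}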

Create video file using sequence of QPixmap in QT Creator- C++

I have a QWidget (named screenshotLabel) whose content is continuously changing. I can get that label's content into a QPixmap (named originalPixmap) as below.
originalPixmap = QPixmap();
QPixmap pixmap(screenshotLabel->size());
this->render(&pixmap);
originalPixmap = pixmap;
Now I want to save it as a video file, but I was not able to do it. How can I save QWidget content as a video file?
I found a way to generate a video using OpenCV's VideoWriter. I've left comments in the code that describe what is happening.
originalPixmap = pixmap;
qImageSingle = originalPixmap.toImage(); // Convert QPixmap to QImage
// Get QImage data into an OpenCV Mat
frame = Mat(qImageSingle.height(), qImageSingle.width(), CV_8UC3, qImageSingle.bits(), qImageSingle.bytesPerLine()).clone();
namedWindow("MyVideo", CV_WINDOW_AUTOSIZE);
imshow("MyVideo", frame);
vector<int> compression_params;
compression_params.push_back(CV_IMWRITE_PNG_COMPRESSION);
compression_params.push_back(9);
try {
    imwrite("alpha2.png", frame, compression_params);
    VideoWriter video("out2.avi", CV_FOURCC('M','J','P','G'), 10, Size(qImageSingle.width(), qImageSingle.height()), true);
    for (int i = 0; i < 100; i++) {
        video.write(frame); // Write frame to VideoWriter
    }
}
catch (runtime_error& ex) {
    fprintf(stderr, "Exception converting image to PNG format: %s\n", ex.what());
}
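One caveat with the snippet above: QPixmap::toImage() often returns a 32-bit format such as Format_RGB32, while the Mat is constructed as CV_8UC3, and OpenCV expects BGR channel order. Converting explicitly first avoids depending on the pixmap's native format; a small sketch reusing the names from the answer (the format conversion and channel swap are my assumptions, not part of the original solution):
QImage rgb = originalPixmap.toImage().convertToFormat(QImage::Format_RGB888);
Mat rgbFrame(rgb.height(), rgb.width(), CV_8UC3,
             const_cast<uchar*>(rgb.constBits()), rgb.bytesPerLine());
Mat bgrFrame;
cvtColor(rgbFrame, bgrFrame, COLOR_RGB2BGR); // VideoWriter expects BGR; this also makes a deep copy
video.write(bgrFrame);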

Qt OpenCV Webcam Stream Opening and Closing

I have created a very simple UI using Qt which consists of a simple button and a label. When the button's clicked() signal is emitted, a function which captures a frame from a webcam using OpenCV is called. The code I am currently using to achieve this is:
cv::Mat MainWindow::captureFrame(int width, int height)
{
    // sets the width and height of the frame to be captured
    webcam.set(CV_CAP_PROP_FRAME_WIDTH, width);
    webcam.set(CV_CAP_PROP_FRAME_HEIGHT, height);
    // determine whether or not the webcam video stream was successfully initialized
    if (!webcam.isOpened())
    {
        qDebug() << "Camera initialization failed.";
    }
    // attempts to grab a frame from the webcam
    if (!webcam.grab()) {
        qDebug() << "Failed to capture frame.";
    }
    // attempts to read the grabbed frame and stores it in frame
    if (!webcam.read(frame)) {
        qDebug() << "Failed to read data from captured frame.";
    }
    return frame;
}
After a frame has been captured, it must be converted into a QImage in order to be displayed in the label. In order to achieve this, I use the following method:
QImage MainWindow::getQImageFromFrame(cv::Mat frame) {
    // converts the channel order from BGR (used by OpenCV) to RGB (expected by Qt)
    cv::cvtColor(frame, frame, CV_RGB2BGR);
    return QImage((uchar*) (frame.data), frame.cols, frame.rows, frame.step, QImage::Format_RGB888);
}
The constructor for my MainWindow class looks like this:
MainWindow::MainWindow(QWidget *parent) :
    QMainWindow(parent),
    ui(new Ui::MainWindow)
{
    ui->setupUi(this);
    resize(1280, 720);
    move(QPoint(200, 200));
    webcam.open(0);
    fps = 1000/25;
    qTimer = new QTimer(this);
    qTimer->setInterval(fps);
    connect(qTimer, SIGNAL(timeout()), this, SLOT(displayFrame()));
}
The QTimer is supposed to display a frame by calling displayFrame()
void MainWindow::displayFrame() {
    // capture a frame from the webcam
    frame = captureFrame(640, 360);
    image = getQImageFromFrame(frame);
    // set the image of the label to be the captured frame and resize the label appropriately
    ui->label->setPixmap(QPixmap::fromImage(image));
    ui->label->resize(ui->label->pixmap()->size());
}
each time its timeout() signal is emitted.
However, while this appears to work to a degree, what actually happens is that the video capture stream from my webcam (a Logitech Quickcam Pro 9000) repeatedly opens and closes. This is evidenced by the fact that the blue ring, which indicates that the webcam is on, repeatedly flashes on and off. As a result, the refresh rate of the webcam video label is very low, which is not desirable. Is there some way to keep the webcam stream open and prevent this "flickering" from occurring?
I seem to have solved the problem of the webcam stream opening and closing by removing the lines:
webcam.set(CV_CAP_PROP_FRAME_WIDTH, width);
webcam.set(CV_CAP_PROP_FRAME_HEIGHT, height);
from the captureFrame() function and setting the width and height of the frame to be captured in the MainWindow constructor.
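In other words, the capture is configured once, right after it is opened, so that displayFrame() only has to grab and read. A sketch of the adjusted constructor; the 640x360 values come from the captureFrame(640, 360) call above, and calling set() on every frame appears to reopen the stream, which matches the on/off flicker described in the question:
MainWindow::MainWindow(QWidget *parent) :
    QMainWindow(parent),
    ui(new Ui::MainWindow)
{
    ui->setupUi(this);
    resize(1280, 720);
    move(QPoint(200, 200));
    webcam.open(0);
    // configure the stream once here instead of inside captureFrame()
    webcam.set(CV_CAP_PROP_FRAME_WIDTH, 640);
    webcam.set(CV_CAP_PROP_FRAME_HEIGHT, 360);
    fps = 1000 / 25;
    qTimer = new QTimer(this);
    qTimer->setInterval(fps);
    connect(qTimer, SIGNAL(timeout()), this, SLOT(displayFrame()));
    qTimer->start(); // assumption: the timer must be started somewhere; the original constructor does not show it
}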