I have a program that records video from a web camera. It shows the camera view in the form. When the Start button is clicked it should begin recording video, and recording should stop after the Stop button is pressed. The program compiles fine, but no video is recorded. Can anyone tell me what is wrong with it?
Here is my code.
{
    camera = new QCamera(this);
    viewFinder = new QCameraViewfinder(this);
    camera->setViewfinder(viewFinder);
    recorder = new QMediaRecorder(camera, this);

    // Show the viewfinder inside the form
    QBoxLayout *layout = new QVBoxLayout;
    layout->addWidget(viewFinder);
    ui->widget->setLayout(layout);

    // Configure the video encoder
    QVideoEncoderSettings settings = recorder->videoSettings();
    settings.setResolution(640, 480);
    settings.setQuality(QMultimedia::VeryHighQuality);
    settings.setFrameRate(30.0);
    //settings.setCodec("video/mp4");
    recorder->setVideoSettings(settings);
    recorder->setContainerFormat("mp4");

    camera->setCaptureMode(QCamera::CaptureVideo);
    camera->start();
}
void usbrecorder::on_btn_Record_clicked()
{
    usbrecorder::startRecording();
}

void usbrecorder::on_btn_Stop_clicked()
{
    usbrecorder::stopRecording();
}

void usbrecorder::startRecording()
{
    recorder->setOutputLocation(QUrl::fromLocalFile("C:\\Users\\Stranger\\Downloads\\Video\\vidoe_001.mp4"));
    recorder->record();
}

void usbrecorder::stopRecording()
{
    recorder->stop();
}
This is due to a limitation of the Qt Multimedia DirectShow backend on Windows.
As mentioned in Qt documentation here: https://doc.qt.io/qt-5/qtmultimedia-windows.html#limitations
Video recording is currently not supported. Additionally, the DirectShow plugin does not support any low-level video functionality such as monitoring video frames being played or recorded using QVideoProbe or related classes.
You need to specify an output location:
QMediaRecorder::setOutputLocation(const QUrl& location)
e.g.
setOutputLocation(QUrl("file:///home/user/vid.mp4"));
Try printing the recorder's state, status and error:
qDebug() << recorder->state();
qDebug() << recorder->status();
qDebug() << recorder->error();
and see what it prints. Those messages should give you a clearer picture of the problem; maybe QMediaRecorder cannot access your camera.
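If you want the errors reported as soon as they occur, here is a minimal sketch, assuming Qt 5 and the recorder member from your code, that connects the relevant signals to qDebug():

#include <QDebug>

// Connect once, e.g. right after creating the recorder.
// QMediaRecorder::error is overloaded in Qt 5, so the signal is selected explicitly.
connect(recorder, QOverload<QMediaRecorder::Error>::of(&QMediaRecorder::error),
        this, [this](QMediaRecorder::Error) {
            qDebug() << "Recorder error:" << recorder->errorString();
        });
connect(recorder, &QMediaRecorder::statusChanged,
        this, [](QMediaRecorder::Status status) {
            qDebug() << "Recorder status:" << status;
        });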
I am using Visual Studio 2019 with Qt 5.12.3.
I use ffmpeg.exe to stream my PC screen and show the stream inside my app.
It plays the video stream, but the trouble starts when the stream goes down or stops. Say I start ffmpeg.exe and can see my screen in my app; then I kill ffmpeg.exe and try to delete the player or load a new URL, and the app crashes or gets stuck.
Neither
delete vPlayer;
nor
vPlayer->deleteLater();
works, and not even
vPlayer->setMedia(...)
with a new URL; once the stream has stopped, the app crashes or stops responding.
This is the code I use to play the stream:
void Stream::LoadStream() {
    QUrl StreamUrl = QUrl("https://example.com:2083/live/stream.flv");

    vWidget = new QVideoWidget(this);
    vWidget->setObjectName("vWidget");

    vPlayer = new QMediaPlayer(this, QMediaPlayer::StreamPlayback);
    vPlayer->setObjectName("vPlayer");
    vPlayer->setVideoOutput(vWidget);
    vPlayer->setMedia(StreamUrl);
    vPlayer->play();

    vWidget->show();
}
void Stream::StopStream() {
    vWidget->deleteLater();
    //vPlayer->setPosition(0); // tried this, but it did not help
    vPlayer->stop();
    vPlayer->deleteLater(); // this crashes the app if StreamUrl has stopped streaming
    delete vPlayer;         // this crashes the app if StreamUrl has stopped streaming
}
void Stream::ChangeUrl(QString Url) {
    vPlayer->setMedia(QUrl(Url));
}
How can I delete or remove the player?
In a GUI I'm working on I intend to use VLC (the axVLCPlugin ActiveX control) to display a camera feed. I want to test it with an offline video stored on my disk; however, each time I click the button that should start the video, I get a runtime error (raised on the line VLC->playlist->add(path, NULL, NULL);).
I assume that the given MRL address (the first parameter of the method) is not in a correct format: file://C:/Users/User/Desktop/AMT.mp4
(see: https://wiki.videolan.org/Media_resource_locator/).
I've seen many tutorials on how to get a video feed in WinForms using the VLC plugin, but none of them included code in C++ (only C#, where a simple absolute path to the video did the job).
void Grimr::MainForm::chbox_live_CheckedChanged(System::Object^ sender, System::EventArgs^ e) {
    pbox_drawing->Visible = false;
    if (chbox_area->Checked)
    {
        VLC->Visible = true;
        VLC->AutoPlay = false;
        String^ path = "C:\\Users\\User\\Desktop\\AMT.mp4";
        VLC->playlist->add(path, NULL, NULL);
        VLC->playlist->play();
    }
}
I've got a video conference website using aws-chime-sdk-js, and I've got a button that stops the current meeting. The problem is that I can't get the webcam to stop showing itself as recording (LED light and red icon), even after following the function calls outlined below:
https://aws.github.io/amazon-chime-sdk-js/modules/faqs.html#after-leaving-a-meeting-the-camera-led-is-still-on-indicating-that-the-camera-has-not-been-released-what-could-be-wrong
const stop = async (meetingId) => {
  try {
    const response = await API.post("chime", "/chime/end", {
      body: { meetingId },
    });
    console.log(response);

    // Select no video device (releases any previously selected device)
    meetingSession.audioVideo.chooseVideoInputDevice(null);

    // Stop local video tile (stops sharing the video tile in the meeting)
    meetingSession.audioVideo.stopLocalVideoTile();

    meetingSession.audioVideo.stop();
  } catch (e) {
    console.log(e);
  }
};
I've even tried releasing the tracks individually with getUserMedia to no avail. Any ideas how to turn off the webcam?
You could reload the window when your component is unmounting, or right after you end the meeting. Reloading the page turns off the camera that was already turned on. For example, using JavaScript:
window.location.reload();
I am working on a Universal Windows Platform (UWP) application in which I am using C++ as the main language. I want to read from two cameras at the same time: one is the Kinect RGB camera and the other is the Kinect depth camera. So far I've managed to read from just one using this piece of code:
void SDKTemplate::Scenario4_ReproVideo::Grabar_Click(Platform::Object^ sender, Windows::UI::Xaml::RoutedEventArgs^ e)
{
    CameraCaptureUI^ dialog = ref new CameraCaptureUI();
    dialog->VideoSettings->Format = CameraCaptureUIVideoFormat::Mp4;
    Windows::Foundation::Collections::IPropertySet^ appSettings = ApplicationData::Current->LocalSettings->Values;

    concurrency::task<StorageFile^>(dialog->CaptureFileAsync(CameraCaptureUIMode::Video)).then([this](StorageFile^ file) {
        if (file != nullptr) {
            concurrency::task<Streams::IRandomAccessStream^>(file->OpenAsync(FileAccessMode::Read)).then([this](Streams::IRandomAccessStream^ stream) {
                CapturedVideo->SetSource(stream, "video/mp4");
                logger->Text = "recording";
            });

            Windows::Foundation::Collections::IPropertySet^ appSettings = ApplicationData::Current->LocalSettings->Values;
            appSettings->Insert("CapturedVideo", PropertyValue::CreateString(file->Path));
        }
        else {
            logger->Text = "Something went wrong or was cancelled";
        }
    });
}
By doing this I can reliably record from one of the cameras. My problem is that I need to record from both cameras at the same time, since I need the depth and RGB streams to process the video.
I am new to concurrency; is there a way (the simpler the better) to achieve two recordings simultaneously?
In a UWP app, we can capture photos and video using the MediaCapture class, which provides functionality for capturing photos, audio, and video from a capture device. See the topic Basic photo, video, and audio capture with MediaCapture.
We can initialize multiple MediaCapture instances and then read frames by using the MediaFrameReader class. See the topics Discover and select camera capabilities with camera profiles and Process media frames with MediaFrameReader, and also look at the official CameraFrames sample.
Besides, there is a similar thread about multiple camera capture in UWP that you can also refer to:
Handle multiple camera capture UWP
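As a rough, untested C++/CX sketch of that approach: it assumes the source groups are enumerated with MediaFrameSourceGroup::FindAllAsync() and that a helper like the one below is called for each camera you want to read; the helper name and the choice of the first frame source are only for illustration.

#include <ppltasks.h>

using namespace concurrency;
using namespace Windows::Media::Capture;
using namespace Windows::Media::Capture::Frames;

// Hypothetical helper: initialize one MediaCapture for a single source group and
// start a MediaFrameReader on its first video source. In a real app, keep the
// MediaCapture and MediaFrameReader alive as class members.
void StartReaderForGroup(MediaFrameSourceGroup^ group)
{
    auto capture = ref new MediaCapture();

    auto settings = ref new MediaCaptureInitializationSettings();
    settings->SourceGroup = group;                              // bind this capture to one camera
    settings->MemoryPreference = MediaCaptureMemoryPreference::Cpu;
    settings->StreamingCaptureMode = StreamingCaptureMode::Video;

    create_task(capture->InitializeAsync(settings)).then([capture]()
    {
        // Take the first frame source of this group (RGB or depth, depending on the group).
        auto source = capture->FrameSources->First()->Current->Value;
        return create_task(capture->CreateFrameReaderAsync(source));
    }).then([](MediaFrameReader^ reader)
    {
        reader->FrameArrived += ref new Windows::Foundation::TypedEventHandler<
            MediaFrameReader^, MediaFrameArrivedEventArgs^>(
                [](MediaFrameReader^ r, MediaFrameArrivedEventArgs^)
                {
                    auto frame = r->TryAcquireLatestFrame();    // process the RGB/depth frame here
                });
        return create_task(reader->StartAsync());
    });
}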
I am currently writing an application in Qt which is basically a warehouse. The application reads a CSV file, lets the user process it, and can show a picture of each good. I tried displaying the picture using QLabel and QPixmap, but nothing happens, even though the file is in the same folder and the name provided is exactly as it should be. Is it a resources issue, or does my code fail somehow? Is there any way to display the image without adding it to the resources, so that I can avoid adding many photos manually?
void ImageViewer::viewImage(QString imgName)
{
    QString pathWithName = imgName;
    pathWithName.append(".jpg");
    ui->label->setPixmap( QPixmap(pathWithName) );
    ui->label->show();
    update();
}
Sorry for any mistakes in how the post or the code is presented here; it's my first post.
Edit:
I am adding the code from my main window (called CsvReader in my project) that shows how I invoke the viewImage method:
void CsvReader::on_imgView_clicked()
{
    ImageViewer* img = new ImageViewer(this);
    img->setModal(true);
    img->exec();
    QModelIndexList selInd = ui->tableView->selectionModel()->selectedIndexes();
    QString id = model->item(selInd.first().row(), 0)->text();
    img->viewImage(id);
}
Edit 2:
Solved. I had to change the path using QDir:
QDir* directory = new QDir("/home/kokos/Magazyn/photos");
QFileInfo checkFile(*directory, pathWithName);
Thanks in advance,
Kokos
Confirm your file's location and existence first. Add this:
QFileInfo checkFile(pathWithName);
if (checkFile.exists() && checkFile.isFile()) {
    // your code
}
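If that check fails, a likely cause is that a bare name like imgName + ".jpg" is resolved against the current working directory rather than the folder the images actually live in. A minimal sketch along the lines of the asker's Edit 2 (the directory path is only an example):

#include <QDir>
#include <QFileInfo>
#include <QPixmap>

// Resolve the image against a known base directory instead of relying on the
// current working directory. The path below is just an example.
QDir imageDir("/home/kokos/Magazyn/photos");
QString pathWithName = imageDir.filePath(imgName + ".jpg");

QFileInfo checkFile(pathWithName);
if (checkFile.exists() && checkFile.isFile()) {
    ui->label->setPixmap(QPixmap(pathWithName));
    ui->label->show();
}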