I am trying to read an .avi or .mpeg video file using the VideoCapture class of OpenCV 2.4.8, in C++ with Qt Creator and CMake 2.8.12.1.
Before building OpenCV, I downloaded the static FFMPEG build and put it into the Program Files directory, added its path to the PATH environment variable, then downloaded and installed the K-Lite Codec Pack Full, and only then built OpenCV with CMake and the MinGW provided by Qt. After installation I added the path of the built OpenCV to PATH.
The stream from the webcam works fine, but the stream from a video file doesn't. I tried on Windows 7 32-bit and Windows 8 64-bit.
Here is the code:
#include <opencv2/imgproc/imgproc.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/core/core.hpp>
#include "opencv2/opencv.hpp"
#include <iostream>

int main()
{
    cv::Mat img;
    cv::VideoCapture cap("Prova.avi");
    std::cerr << cap.isOpened() << std::endl;
    while (cap.read(img)) {
        cv::imshow("Opencv", img);
        cv::waitKey(33);
    }
    return 0;
}
The same code works on Ubuntu 12.04 with the same version of OpenCV and with an FFMPEG build I compiled myself.
What is wrong?
Download the file opencv_ffmpeg.dll. I got it from this
Now rename opencv_ffmpeg.dll according to the OpenCV version you are using. For OpenCV 2.4.8, rename it to opencv_ffmpeg248.dll and add the location of this file to the PATH environment variable. Then try to run your program.
For some unknown reason, on a 32-bit system the build created a DLL called opencv_ffmpeg248_64.dll and OpenCV was installed into an "x64" folder, so I think that DLL was for 64-bit systems. I downloaded the pre-built version of OpenCV and used the DLL included in that package.
Related
I'm trying to output a video from a file using OpenCV 3.4.0. Xcode gives an error message: OpenCV: Couldn't read video stream from file. When I capture video from the camera it runs successfully. I tried installing ffmpeg 3.4.1 and don't know what headers I need to include. This is all on macOS 10.12 Sierra.
Found the solution: I was pointing at the wrong path. "~/Desktop/test.mp4" just needed to be changed to "/Users/jameson/Desktop/test.mp4"; apparently OpenCV does not expand the abbreviated (~) form of the path.
I'm trying to get VideoCapture working with OpenCV. The video I'm trying to load is in XVID format (checked with VideoCapture::get(CV_CAP_PROP_FOURCC)). It plays fine, but whenever I try to get the video frame rate with VideoCapture::get(CV_CAP_PROP_FPS) I get -nan.
I've used the same video and the same code on another computer (at uni, where they have a custom Debian installation) and I can confirm that the frame-rate info is there (it works fine there). I read somewhere that Ubuntu recently removed ffmpeg from its repositories (I use Linux Mint 17.2), so I installed the ffmpeg package from the ppa:kirillshkrogalev/ffmpeg-next repository. After that I recompiled and reinstalled OpenCV, without any change.
I'm using OpenCV 2.4.11 with C++ under Linux Mint 17.2.
XVID is probably missing on your Linux machine, and it is required by ffmpeg. Try installing XVID as below; it may help.
cd /opt
wget http://downloads.xvid.org/downloads/xvidcore-1.3.2.tar.gz
tar xzvf xvidcore-1.3.2.tar.gz
cd xvidcore/build/generic
./configure --prefix="$HOME/ffmpeg_build"
make
make install
How do I cross-compile my OpenCV C++ program for ARM?
I have successfully compiled OpenCV for ARM, but I haven't run
make install
My understanding is that doing this will move the compiled OpenCV into my system directories. I already have OpenCV built and installed for my system. Will installing the OpenCV compiled for ARM mess up my native OpenCV?
When I compile an OpenCV project for my system, I specify the OpenCV library path and header directories with the -L and -I options. I want to compile the same project for ARM; how do I go about doing this?
Doing a simple
make install
installs OpenCV into an 'install' folder inside the build directory.
This works only for OpenCV versions above 2.4.x.
I'm trying to compile a C++ program that uses OpenCV to score the similarity of two images:
Image Histogram Compare
When I compile the file with g++, I get:
'opencv2/imgcodecs.hpp' file not found
#include "opencv2/imgcodecs.hpp"
I updated the OpenCV formula on Mac OS X and re-ran brew install, but I still get this error.
mdfind -name imgcodecs.hpp
returns nothing - the file is nowhere on my system.
Does anyone know why imgcodecs is not included, and how to include it? I'm a real novice at C++ and OpenCV, and enormously grateful for any help.
I notice that you are looking at tutorial code on the master branch (aka 3.0.0). It uses an imgcodecs module that is not present in earlier versions of OpenCV (e.g. 2.4.9).
Check which version of OpenCV you have (it seems it is not 3.0.0) and use a matching version of the tutorial code (e.g. the OpenCV 2.4.9 Histograms_Matching/EqualizeHist_Demo.cpp).
I'm using MATLAB 2013a (Win7 64-bit).
I installed OpenCV in C:\OpenCV-2.4.7.
How do I link the OpenCV libraries into MATLAB?
Quick start -->
http://groups.inf.ed.ac.uk/calvin/calvin_upperbody_detector/downloads/README.html
e.g.
from the bash command line execute:
export LD_LIBRARY_PATH="path_to_opencv_lib:$LD_LIBRARY_PATH"
I'm using mexopencv. It is easy to use and well documented. But if you want the real OpenCV MATLAB wrapper, you can only get it from GitHub, not from the .exe installer:
The OpenCV Matlab module