How to fix the insufficient memory error (OpenCV) - C++

Please help me handle this problem:
OpenCV Error: Insufficient memory (Failed to allocate 921604 bytes) in
unknown function, file
........\ocv\opencv\modules\core\src\alloc.cpp, line 52
One of my methods uses cv::Mat::clone and raw pointer access.
The code is:
There is a timer that fires every 100 ms; in the timer event, I call this method:
void DialogApplication::filterhijau(const Mat &image, Mat &result) {
    cv::Mat resultfilter = image.clone();
    int nlhijau = image.rows;
    int nchijau = image.cols*image.channels();
    for(int j=0; j<nlhijau; j++) {
        uchar *data2 = resultfilter.ptr<uchar>(j); // address of each line in the result
        for(int i=0; i<nchijau; i++) {
            *data2++ = 0;   // element B
            *data2++ = 255; // element G
            *data2++ = 0;   // element R
        }
        // free(data2); // I added this line, but the program hung
    }
    cv::addWeighted(resultfilter, 0.3, image, 0.5, 0, resultfilter);
    result = resultfilter;
}

The clone() method of a cv::Mat performs a deep copy of the data. So the problem is that each call to filterhijau() allocates a new image, and after hundreds of calls your application will have occupied hundreds of MB (if not GB), which triggers the Insufficient Memory error.
You need to redesign your current approach so that it occupies less RAM.

I faced this error before and solved it by reducing the size of the images while reading them, sacrificing some resolution.
It was something like this in Python:
import cv2

# Open the video
cap = cv2.VideoCapture(videoName + '.mp4')
images = []
i = 0
while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    frame = cv2.resize(frame, (900, 900))
    # append the frame to the list
    images.append(frame)
    i += 1
cap.release()
N.B. I know it's not the most optimal solution to the problem, but it was enough for me.

Related

What causes double free or corruption (out) error?

Possible Duplicate:
OpenCV double free or corruption (out): Aborted (core dumped)
I created a function that receives an image from a client, processes it with OpenCV on the server, and returns data.
I have realized that I get this error only when I use the function free(). Below is the code in my function.
// 2. Create Mat image
Mat image = Mat::zeros(height, width, CV_8UC3);
uchar sockData[imageSize];
// Receive image data here
printf("Receiving Image Data\n");
for (int i = 0; i < imageSize; i += bytecount)
{
    if ((bytecount = recv(*csock, sockData + i, imageSize - i, 0)) == -1)
    {
        fprintf(stderr, "Error receiving image %d\n", errno);
    }
}
// deallocate
deallocateMemory(csock);
// Image data received, now reconstructing image
printf("Image Data Received, Now Reconstructing\n");
int ptr = 0;
for (int i = 0; i < image.rows; i++)
{
    for (int j = 0; j < image.cols; j++)
    {
        image.at<cv::Vec3b>(i, j) = cv::Vec3b(sockData[ptr + 0], sockData[ptr + 1], sockData[ptr + 2]);
        ptr = ptr + 3;
    }
}
// Write produced output to stdout - print
printf("Image Processed, now Displaying Results...\n");
displayResultsOnConsole(results);
// free(sockData);
return 0;
The error appears when I uncomment free(sockData);
Am I doing anything wrong?
You can only pass to free precisely the same pointer you got from malloc (or NULL, which does nothing). You break this rule, so bad things happen.
You allocate on the stack but try to deallocate on the heap. Since the compiler accepted this big chunk on the stack, there is nothing to worry about: it is reclaimed automatically when the function returns.
Still, it is bad practice to put image bytes on the stack, because images are usually large. Allocate on the heap, deallocate from the heap.

Manipulating pixels of a cv::Mat just doesn't take effect

The following code is just supposed to load an image, fill it with a constant value, and save it again.
Of course that has no purpose yet, but it still doesn't work:
I can read the pixel values in the loop, but all changes have no effect, and the file is saved as it was loaded.
Think I followed the "efficient way" here accurately: http://docs.opencv.org/2.4/doc/tutorials/core/how_to_scan_images/how_to_scan_images.html
int main()
{
    Mat im = imread("C:\\folder\\input.jpg");
    int channels = im.channels();
    int pixels = im.cols * channels;
    if (!im.isContinuous())
    { return 0; } // Just to show that I've thought of that. It never exits here.
    uchar* f = im.ptr<uchar>(0);
    for (int i = 0; i < pixels; i++)
    {
        f[i] = (uchar)100;
    }
    imwrite("C:\\folder\\output.jpg", im);
    return 0;
}
Normal cv functions like cvtColor() are taking effect as expected.
Are the changes through the array happening on a buffer somehow?
Huge thanks in advance!
The problem is that you are not visiting every pixel in the image. Your code only touches im.cols * im.channels() bytes, which is a small number compared to the full size of the image (im.cols * im.rows * im.channels()). Used as the bound of the pointer loop, it only sets values in the first row of the image (if you look closely at the saved image, you will notice that row is set).
Below is the corrected code:
int main()
{
    Mat im = imread("C:\\folder\\input.jpg");
    int channels = im.channels();
    int pixels = im.cols * im.rows * channels;
    if (!im.isContinuous())
    { return 0; } // Just to show that I've thought of that. It never exits here.
    uchar* f = im.ptr<uchar>(0);
    for (int i = 0; i < pixels; i++)
    {
        f[i] = (uchar)100;
    }
    imwrite("C:\\folder\\output.jpg", im);
    return 0;
}

Qt wrongly thinks QImage is loaded properly

I am having an issue when trying to store a sequence of image data with Qt.
Here is a piece of code that shows the problem:
#include <vector>
#include <iostream>
#include <QImage>
...
const int nFrames = 1000;
std::vector<int> sizes(nFrames);
std::vector<uchar*> images(nFrames);
for (int k = 0; k < nFrames; k++)
{
    QImage *img = new QImage("/.../sample.png");
    uchar *data = img->bits();
    sizes.at(k) = img->width() * img->height();
    images.at(k) = data;
}
std::cout << "Data loaded \"successfully\"." << std::endl;
for (int k = 0; k < nFrames; k++)
{
    std::cout << k << ": " << (int) (images.at(k)[0]) << std::endl;
}
In the first loop, the program loads QImage objects and puts the bitmaps in the images vector of pointers. In the second loop, we just read a pixel of each frame.
The problem is that the program proceeds through the first loop without complaining, even if the heap memory becomes full. As a result, I get a crash in the second loop, as shown by the output of the program:
Data loaded "successfully".
0: 128
1: 128
2: 128
...
192: 128
[crash before hitting 1000]
To reproduce the problem, you can use the grayscale image below, and you may need to change the value of nFrames, depending on how much memory you have.
My question is: How can I load the data in the first loop in a way that would allow me to detect if the memory becomes full? I don't necessarily need to keep the QImage objects in memory, but only the data in the images vector.
First of all, the first loop has a memory leak because the img objects are never deleted.
From Qt documentation:
uchar * QImage::bits()
Returns a pointer to the first pixel data. This is equivalent to scanLine(0).
Note that QImage uses implicit data sharing. This function performs a deep copy of the shared pixel data, thus ensuring that this QImage is the only one using the current return value.
So you can safely delete img at the end of the loop.
....
    images.at(k) = data;
    delete img;
}
To detect whether the image was actually created, check QImage::isNull(). Note that a plain new reports failure by throwing std::bad_alloc rather than by returning a null pointer, so a null check only works with the nothrow form:
QImage *img = new (std::nothrow) QImage("/.../sample.png");
if (!img || img->isNull()) {
    // out of memory, or the image could not be loaded
}
Partial answer:
The first loop can be replaced by the following:
for (int k = 0; k < nFrames; k++)
{
    QImage *img = new QImage("/.../sample.png");
    sizes.at(k) = img->width() * img->height();
    uchar *data = new uchar[sizes.at(k)];
    std::copy(img->bits(), img->bits() + sizes.at(k), data);
    images.at(k) = data;
    delete img;
}
This creates in images.at(k) a copy of the data that img->bits() points to. (Btw, this is also what allows deleting the QImage at the end of the first for loop.) If memory runs out, the new uchar[...] in the loop throws std::bad_alloc, which can be caught.
However, this is not good enough. I suspect possible issues when nFrames is set to a value such that the maximum memory taken by the program is close to the limit (or when another program frees memory while this one is running). My concern is that I still have no guarantee that img->bits() returns a pointer to accurate data.

Array of Mats from video file - opencv

I'm coding in C++ with OpenCV on Linux. I've found this similar question, although I can't quite get it to work.
What I want to do is read in a video file and store a certain number of frames in an array. Over that number, I want to delete the first frame and add the most recent frame to the end of the array.
Here's my code so far.
VideoCapture cap("Video.mp4");
Mat cameraInput, finalOutputImage;
int width = 2;
int height = 2;
Rect roi = Rect(100, 100, width, height);
vector<Mat> matArray;
int numberFrames = 6;
int currentFrameNumber = 0;
for (;;){
    cap >> cameraInput;
    cameraInput(roi).copyTo(finalOutputImage);
    if(currentFrameNumber < numberFrames){
        matArray.push_back(finalOutputImage);
    }else if(currentFrameNumber <= numberFrames){
        for(int i=0;i<matArray.size()-1; i++){
            swap(matArray[i], matArray[i+1]);
        }
        matArray.pop_back();
        matArray.push_back(finalOutputImage);
    }
    currentFrameNumber++;
}
My understanding of mats says this is probably a problem with pointers; I'm just not sure how to fix it. When I look at the array of mats, every element is the same frame. Thank you.
There's no need for all this complication if you make use of C++'s highly useful STL.
if( currentFrameNumber >= numberFrames )
    matArray.erase( matArray.begin() );
matArray.push_back( finalOutputImage.clone() ); // check out #berak's comment
should do it.

OpenCV Error: insufficient memory, in function call

I have a function that looks like this:
void foo(){
    Mat mat(50000, 200, CV_32FC1);
    /* some manipulation using mat */
}
Then after several loops (in each loop I call foo() once), it gives an error:
OpenCV Error: insufficient memory when allocating (about 1 GB of) memory.
In my understanding, the Mat is local, and once foo() returns it is automatically deallocated, so I am wondering why it leaks.
And it leaks on some data, but not all of it.
Here is my actual code:
bool VidBOW::readFeatPoints(int sidx, int eidx, cv::Mat &keys, cv::Mat &descs, cv::Mat &codes, int &barrier) {
    // initialize buffers for keys and descriptors
    int num = 50000; /// a large number
    int nDims = 0;   /// feature dimensions
    if (featName == "STIP")
        nDims = 162;
    Mat descsBuff(num, nDims, CV_32FC1);
    Mat keysBuff(num, 3, CV_32FC1);
    Mat codesBuff(num, 3000, CV_64FC1);
    // move overlapping codes from a previous window to buffer
    int idxPre = -1;
    int numPre = keys.rows;
    int numMov = 0; /// number of overlapping points to move
    for (int i = 0; i < numPre; ++i) {
        if (keys.at<float>(i, 0) >= sidx) {
            idxPre = i;
            break;
        }
    }
    if (idxPre > 0) {
        numMov = numPre - idxPre;
        keys.rowRange(idxPre, numPre).copyTo(keysBuff.rowRange(0, numMov));
        codes.rowRange(idxPre, numPre).copyTo(codesBuff.rowRange(0, numMov));
    }
    // the starting row in the code matrix where new codes from the updated features go
    barrier = numMov;
    // read keys and descriptors from the feature file
    int count = 0; /// number of new points read into the buffers
    if (featName == "STIP")
        count = readSTIPFeatPoints(numMov, eidx, keysBuff, descsBuff);
    // update the keys, descriptors and codes matrices
    descsBuff.rowRange(0, count).copyTo(descs);
    keysBuff.rowRange(0, numMov+count).copyTo(keys);
    codesBuff.rowRange(0, numMov+count).copyTo(codes);
    // see if we reached the end of the feature file
    bool flag = false;
    if (feof(fpfeat))
        flag = true;
    return flag;
}
You don't post the code that calls your function, so I can't tell whether this is a true memory leak. The Mat objects that you allocate inside readFeatPoints() will be deallocated correctly, so there are no memory leaks that I can see.
You declare Mat codesBuff(num, 3000, CV_64FC1);. With num = 50000, this means you're trying to allocate 1.2 gigabytes of memory (50000 × 3000 × 8 bytes) in one big block. You also copy some of this data to codes with the line:
codesBuff.rowRange(0, numMov+count).copyTo(codes);
If the value of numMov + count changes between iterations, this causes reallocation of the data buffer in codes. If the value is large enough, you may also be holding a significant amount of memory that persists across iterations of your loop. Both of these things can lead to heap fragmentation. If at any point there is no contiguous 1.2 GB chunk of free memory available, an insufficient memory error occurs, which is what you have experienced.