I was fiddling around trying to get Game Center leaderboards to work. After becoming frustrated, I just removed the code that I had added. However, I am now stuck with this error even though I deleted the code! It almost feels like I ruined a perfectly finished game.
Specifically, here is when the error occurred:
1) I am using cocos2d.
2) I copied and pasted GKAuthentication into my project, then imported the .h into my first cocos2d scene.
3) I then got the SIGABRT error and decided to remove the import. However, the error persisted.
4) I then removed the GKAuthentication files from the project completely. However, the error still persists.
+ (bool) makeContextCurrent:(ALCcontext*) context deviceReference:(ALCdevice*) deviceReference
{
    @synchronized(self)
    {
        if(!alcMakeContextCurrent(context)) // SIGABRT occurs here
        {
            if(nil != deviceReference)
            {
                CHECK_ALC_CALL(deviceReference);
            }
            else
            {
                OAL_LOG_ERROR(@"Could not make context %p current. Pass in a device reference for better diagnostic info.", context);
            }
            return NO;
        }
    }
    return YES;
}
Strange thing: the simulator triggers this error, but on the phone it works great.
It apparently has something to do with audio (?!), and I have no idea what GKAuthentication has to do with it. The error seemed to come out of nowhere. Here is the error message from the console:
cocos2d: GL supports discard_framebuffer: YES
cocos2d: GL supports shareable VAO: NO
AudioStreamBasicDescription: 2 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved
2014-03-10 15:04:43.058 GaUi[40423:907] <com.apple.main-thread> Start: Mach message timeout. Apparently deadlocked. Aborting now.
(lldb)
I had the same issue with my MacBook Pro and Mac mini; rebooting the computer solves the problem.
I have been developing a program for my Master's thesis with OpenSceneGraph 3.4.0 and a Qt 5.9 GUI (built in Visual Studio 2015 and 2017). At work everything works fine, but now that I have a new computer at home, I tried to get it running there.
However, when I call the frame() method on the viewer, I get a read access violation in QtThread.cpp in setProcessorAffinity(unsigned int cpunum), specifically in the following line:
QtThreadPrivateData* pd = static_cast<QtThreadPrivateData*>(_prvData);
Here is the complete function (QtThread.cpp is part of OpenThreads of OSG):
// Description: set processor affinity for the thread
//
// Use: public
//
int Thread::setProcessorAffinity(unsigned int cpunum)
{
QtThreadPrivateData* pd = static_cast<QtThreadPrivateData*>(_prvData);
pd->cpunum = cpunum;
if (!pd->isRunning) return 0;
// FIXME:
// Qt doesn't have a platform-independent thread affinity method at present.
// Does it automatically configure threads on different processors, or we have to do it ourselves?
return -1;
}
The viewer in OSG is set to osgViewer::Viewer::SingleThreaded, but if I remove that line I get the error "Cannot make QOpenGLContext current in a different thread" in GraphicsWindowQt.cpp (which is part of osgQt), so that's probably a dead end.
Edit for clarification
I call frame() on the osgViewer::Viewer object.
In this function, the viewer calls realize() (a member function of the Viewer class).
In there, setUpThreading() is called (a member function of the ViewerBase class).
This in turn calls OpenThreads::SetProcessorAffinityOfCurrentThread(0).
In there, the following code is executed:
Thread* thread = Thread::CurrentThread();
if (thread)
return thread->setProcessorAffinity(cpunum);
thread (after the first line) has the value 0x00000000fdfdfdfd, which looks like an error to me (0xfd is a fill pattern MSVC's debug heap writes around allocations, so the pointer refers to invalid memory).
In any case, the last call is the one I posted in my original question.
I don't even have an idea of where to start fixing this. I assume it's some processor-related problem. My processor is a Ryzen 7 1700 (at work it's an Intel i7-3770K), so maybe that helps.
Otherwise, at home I'm using Windows 10, whereas at work it's Windows 7.
I'd be thankful for any help at all.
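For illustration only, the crash is the classic pattern of dereferencing a private-data pointer that was never validly initialized. The sketch below uses hypothetical stand-ins (the real QtThreadPrivateData layout differs) to show how a defensive guard turns the crash into a graceful failure:

```cpp
#include <cassert>

// Hypothetical stand-ins mirroring the shape of OpenThreads' QtThread code.
// Names and fields are illustrative only, not the real OSG definitions.
struct QtThreadPrivateData {
    unsigned int cpunum = 0;
    bool isRunning = false;
};

struct Thread {
    void* _prvData = nullptr;  // in the failing build this held invalid memory

    int setProcessorAffinity(unsigned int cpunum) {
        QtThreadPrivateData* pd = static_cast<QtThreadPrivateData*>(_prvData);
        if (!pd)               // guard: refuse to touch missing private data
            return -1;
        pd->cpunum = cpunum;
        if (!pd->isRunning)
            return 0;          // affinity recorded; thread not started yet
        return -1;             // Qt has no portable affinity API
    }
};
```

Note that since the pointer here held 0xfdfdfdfd rather than null, a guard like this would not actually have helped; it only illustrates why that line crashes.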
So in the end, it seems to be a problem with OpenThreads (and thus the OpenSceneGraph side, which I can do nothing about). When building the OpenSceneGraph source with CMake, there is an option "BUILD_OPENTHREADS_WITH_QT" that needs to be disabled.
I found the solution in this thread in the OSG forum, so thanks to this guy.
It was plain luck that my program is so simple, so I eventually found out what causes the mysterious log message. My program log looks like this:
Debugging starts
failed to start
Debugging has finished
Which happens after:
camera = new QCamera(QCameraInfo::defaultCamera());
// see http://omg-it.works/how-to-grab-video-frames-directly-from-qcamera/
camera->setViewfinder(frameGrabber = new CameraFrameGrabber());
camera->start();
The start() method causes this message in the console. The meaning of the message is now obvious, but it's not very helpful. What steps should I take to troubleshoot it?
Reasons for this might differ, but in my case it was simply because I provided an invalid QCameraInfo. The culprit is that QCameraInfo::defaultCamera() may return an invalid value if Qt fails to detect any cameras on your system, which unfortunately happens even when cameras are present.
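A minimal guard, assuming Qt 5's QtMultimedia (where QCameraInfo::isNull() reports an invalid device), might look like this. This is a sketch of the validity check, not a drop-in fix for camera detection itself:

```cpp
#include <QCamera>
#include <QCameraInfo>
#include <QDebug>

// Sketch: verify the default camera before handing it to QCamera,
// so an invalid QCameraInfo never reaches camera->start().
QCamera *makeDefaultCamera()
{
    const QCameraInfo info = QCameraInfo::defaultCamera();
    if (info.isNull()) {            // Qt found no usable camera
        qWarning() << "No default camera detected";
        return nullptr;             // caller must handle the missing camera
    }
    return new QCamera(info);
}
```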
Sorry for bumping an old question, but I found nothing helpful for my case. I'm developing an iOS app using the OpenCV 3.0 framework.
I'm using the CvVideoCamera delegate to record video, but as soon as I set _cvVideoCam.recordVideo = YES;, it always gives me a memory warning.
Otherwise there is no memory warning, but the output URL always shows (null) as the location when recording finishes.
Thanks in advance.
Please check your

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:...

for a block like the following (the appendPixelBuffer: line is completed here from OpenCV's cap_ios_video_camera.mm for context):

if (![self.recordPixelBufferAdaptor appendPixelBuffer:pixelBuffer
                       withPresentationTime:lastSampleTime])
{
    NSLog(@"Video Writing Error");
}
if (pixelBuffer != nullptr)
    CVPixelBufferRelease(pixelBuffer);

because in some earlier versions OpenCV forgot to release the pixel buffer after each append operation to the MPEG-4 output. Maybe that is what you need.
I'm using Qt 4.8 with Qt Creator 2.4.1 on Windows 7 Ultimate x64.
I'm capturing audio input using the QAudioInput class and playing it with QAudioOutput. There is a 2-second timeout, after which I stop capturing input and then set up the output as follows:
class MainWindow
{
    // ...
    QByteArray output_data;
    QBuffer output_data_buffer;
    QAudioOutput *audio_out;
    // ...
};

MainWindow::MainWindow(QWidget *parent)
{
    // ...
    output_data_buffer.setBuffer(&output_data);
    // ...
}

void MainWindow::audioInputStopped(QByteArray data)
{
    output_data = data;
    output_data_buffer.open(QIODevice::ReadOnly);

    audio_out = new QAudioOutput(audio_format, this);
    connect(audio_out, SIGNAL(stateChanged(QAudio::State)),
            SLOT(audioOutputStateChanged(QAudio::State)));
    audio_out->start(&output_data_buffer);
}
The audio format I'm using is supported by both the input and output devices; I checked them using QAudioDeviceInfo::isFormatSupported(). The 2 seconds of audio (data in audioInputStopped()) always play fine.
In the slot audioOutputStateChanged(), I always encounter a QAudio::UnderrunError from audio_out->error() after the buffer has finished playing. After audio_out->start() is called, the state (passed as a parameter to audioOutputStateChanged()) and the error go as follows:
No error. Active state.
No error. Stopped state.
Underrun error. Idle state.
Note that I'm stopping audio_out in the idle state, following this example. Why is the code encountering an underrun error? Is this normal?
This may seem kind of odd, but I've seen Qt's built-in containers behave better when constructed on the heap, or at least when their elements are constructed on the heap (so they are just arrays of pointers). The memory management is a little trickier, but items pushed into them don't go out of scope. The Qt Object Model also promotes putting most things on the heap and parenting them correctly. This might help.
After reading up a little on buffer underruns, it sounds like something is still trying to read from the audio source while something else is writing to it, or vice versa. Check out some of the links below. You could try disconnecting the audio_in side from the buffer before reading the buffer; this is the more likely fix for the error.
I would also construct your QAudioOutput pointer in the constructor of your main window (more of a style thing); following how the examples in Qt are organized seems better. Here is the cpp for the QAudioInput example.
If you had a more complete example, I could try more with it to recreate the error and debug it.
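For what it's worth, hitting QAudio::UnderrunError together with IdleState right after the buffer finishes is usually just the device draining the last samples before stop() is called. A sketch of the idle-state handling (slot and member names are taken from the question's code; this is an illustration, not tested against your project):

```cpp
#include <QAudioOutput>
#include <QDebug>

// Sketch: treat IdleState as "playback finished" and stop cleanly.
// An underrun reported at this point is expected, not a real fault.
void MainWindow::audioOutputStateChanged(QAudio::State state)
{
    switch (state) {
    case QAudio::IdleState:                  // buffer fully drained
        audio_out->stop();                   // underrun flagged here is benign
        output_data_buffer.close();
        break;
    case QAudio::StoppedState:
        if (audio_out->error() != QAudio::NoError)
            qWarning() << "Audio output error:" << audio_out->error();
        break;
    default:
        break;
    }
}
```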
Here is someone else to commiserate with:
http://qt-project.org/forums/viewthread/16729
And a wiki article:
http://en.wikipedia.org/wiki/Buffer_underrun
And the list of Multimedia examples on Qt:
http://doc.qt.nokia.com/4.7-snapshot/examples-multimedia.html
Hope that helps.
I am using OpenCV 1 to do some image processing, and am confused about the cvSetErrMode function (which is part of CxCore).
OpenCV has three error modes.
Leaf: The program is terminated after the error handler is called.
Parent: The program is not terminated, but the error handler is called.
Silent: Similar to Parent mode, but no error handler is called.
At the start of my code, I call cvSetErrMode(CV_ErrModeParent) to switch from the default 'leaf' mode to 'parent' mode so my application is not terminated with an exception/assertion pop up.
Unfortunately 'parent' mode doesn't seem to be working. I still get the message dialog pop up, and my application still terminates.
If I call cvSetErrMode(CV_ErrModeSilent), then it actually goes silent, and no longer quits the application or throws up a dialog... but this also means that I don't know an error has occurred. In this case, I think the mode is being set correctly.
Has anyone else seen this behaviour before and might be able to recommend a solution?
References:
cvSetErrMode function reference
Open CV Error handling mode reference
I am going to answer my own question, because after some fiddling around I have worked out what happens.
When you switch to 'parent' mode instead of 'leaf' mode, the default error handler, cvGuiBoxReport(), still gets called. It seems that even in parent mode, cvGuiBoxReport() terminates your application! Oops.
So, to get around that you can write your own error handler, and redirect the error to be handled and NOT terminate the application.
An example error handler:
int MyErrorHandler(int status, const char* func_name, const char* err_msg, const char* file_name, int line, void*)
{
std::cerr << "Woohoo, my own custom error handler" << std::endl;
return 0;
}
You can set up parent mode and redirect your error with:
cvSetErrMode(CV_ErrModeParent);
cvRedirectError(MyErrorHandler);
After a week of servers crashing from corrupt or empty images uploaded to our image-processing server, here are some thoughts on how I solved the intricacies of OpenCV's error handling. We are using v2.2 in a C++ server.
The problem arises in cv::imread() and cv::imdecode() when the image to be loaded is corrupt (or empty). Normally OpenCV just exits the process with some error messages, which is not a good idea when you're running a server that should stay up all the time.
Reviewing the source code at https://code.ros.org/trac/opencv/browser/trunk/opencv/modules/core/include/opencv2/core/core.hpp I ignored the hint in the source comments for cv::setBreakOnError() and discovered that the following pattern works:
cv::setBreakOnError(true); // Can be set globally
...
...
cv::Mat srcImage = cv::imread(filename, 1);
if (!srcImage.data) throw std::runtime_error("bad image"); // std::exception(const char*) is MSVC-only; runtime_error is portable
cv::imread() will now no longer exit the process, but will pass control to your own exception handling, so you can do with it what you like.
Finding this has saved a lot of heartbreak.