I am working on a small OpenGL project using the GLFW library. Everything was fine until one day I couldn't get it to render anything except the background. So I loaded up an older version which I am sure worked just fine, and the same thing happened. I then went back in time and every version I tried did the same thing.
You can change glClearColor() and that color is used, but that's the only thing you see. So I stripped the project down until I ended up with a very basic program (a hard-coded colored cube), which still doesn't render anything.
I think the possible causes may have something to do with driver updates or with me downloading newer versions of libraries, but I wouldn't expect that to have such a massive effect, given that the code previously worked just fine.
I am running 64-bit Windows 7 on a GIGABYTE B75M-D3H motherboard with an Intel Core i5-3350P CPU and Radeon HD 7750 graphics (currently Catalyst 14.12; there have been some driver updates since the problem first appeared, though). I use mingw-w64 as my compiler of choice, with posix threads and sjlj exceptions. I tried it on a different machine (a Dell Inspiron with Windows 8.1) and got the same results.
I am using GLFW, GLEW and GLM.
The original project: https://github.com/GenaBitu/OpenStrategia
A stripped-down version: https://gist.github.com/GenaBitu/852dc4c4db6d72c945d1 (Quite messy, still not working)
Can a driver update cause this? Am I stupid and forgot something which suddenly broke the whole program?
Turns out I was using the core OpenGL profile, which requires you to use Vertex Array Objects, and I didn't. Up until ~February the driver didn't mind, but after a certain update it refused to render the object (which I believe is the correct behaviour).
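For anyone hitting the same wall, here is a minimal sketch of the fix (illustrative names, not the actual project code): in a core profile a VAO must be created and bound before the vertex attribute state is specified, otherwise draw calls silently produce nothing.

// Minimal sketch, assuming a valid core-profile context and an existing
// vertices array with vertexCount vertices (both hypothetical here).
GLuint vao;
glGenVertexArrays(1, &vao);
glBindVertexArray(vao); // must happen before glVertexAttribPointer

GLuint vbo;
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, nullptr);

// In the render loop:
glBindVertexArray(vao);
glDrawArrays(GL_TRIANGLES, 0, vertexCount);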
In my settings, I clearly have my screen scaling set to 200%.
When I worked on my Qt application earlier, it would properly follow that setup.
Somehow I changed something, and now it always appears at 100%. That makes the text difficult to read.
What option(s) would Qt have that would turn that feature off?
I have other Qt applications that still work as expected, so I'm really thinking it's something I did. Maybe a widget I added? Or a call I make? I just have no idea what it could be, so looking at my changes doesn't help at the moment.
One thing I added recently is a QSvgWidget, but even if I remove it, it still doesn't work. Another thing I've noticed is that the OS (Ubuntu 18.04) updated the desktop themes, but I don't think that happened at the same time.
UPDATE:
This seems to be a general problem. I just upgraded my OS. The VirtualBox snap was updated and now its window also appears small (i.e. it ignores the High DPI setup). So it must be an OS thing (i.e. a library that was updated and now breaks the Qt High DPI feature).
Okay! I got the answer to that one in my situation.
Whenever nVidia ships a new version of their driver, my X11 session continues to work, but OpenGL is not accessible anymore. There must be something in Qt that decides to use OpenGL, and since it (silently) fails to open a connection, it falls back to "no DPI scaling capability".
After a reboot, everything is back to normal and zooms in and out as expected, both in my app and in the VirtualBox window.
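For reference, Qt 5.6 and later expose explicit switches for this feature, so it is worth checking that nothing in the code or environment flips them; a minimal sketch, assuming a plain QApplication:

#include <QApplication>

int main(int argc, char *argv[])
{
    // Must be set before the QApplication is constructed.
    QApplication::setAttribute(Qt::AA_EnableHighDpiScaling);
    // The opposite switch also exists; make sure nothing sets it:
    // QApplication::setAttribute(Qt::AA_DisableHighDpiScaling);
    QApplication app(argc, argv);
    // ... set up and show widgets as usual ...
    return app.exec();
}

The environment variables QT_AUTO_SCREEN_SCALE_FACTOR and QT_SCALE_FACTOR feed the same machinery, so a changed environment can also produce a sudden fallback to 100%.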
I can't seem to find the process addresses of these functions on my system. I'm using GLEW 1.9, which has support for everything. I am loading a 4.3 core profile for my context... My nVidia drivers are fully up to date. I downloaded a program called GPU Caps and it shows the extension as available. Any ideas?
Update: I had to enable glewExperimental to get it to work. I thought program separation had been core since 4.1. If there are no further insights, I will mark this as solved.
I had to enable glewExperimental to get it to work.
Of course you did. GLEW is broken when it comes to loading OpenGL core contexts. You have to turn on that switch to make it work. It's a well-known workaround.
Or you could just use an OpenGL loading system that isn't broken for core (which would be pretty much anything besides GLEW that is still in active development). FYI: I wrote a couple of those.
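For completeness, here is the workaround in context; a minimal sketch of the usual initialization order with GLEW and a core context:

#include <GL/glew.h>
#include <cstdio>

// ... create the core-profile context and make it current first ...
glewExperimental = GL_TRUE; // required for core contexts with GLEW
GLenum err = glewInit();
if (err != GLEW_OK)
{
    fprintf(stderr, "glewInit failed: %s\n",
            (const char *)glewGetErrorString(err));
}
// glewInit in a core context can leave a spurious GL_INVALID_ENUM
// behind; read it once so it doesn't get blamed on later calls.
glGetError();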
I started using glslDevil to debug my OpenGL engine's shaders. When I start the program from inside glslDevil, all I get in the GLTrace window is calls to:
wglGetPixelFormatAttribivARB()
which is printed something like 1002 times, after which the debugged app freezes, and that is it. No debug info or anything else. Maybe glslDevil doesn't support newer OpenGL versions?
I am using OpenGL 4.2 compatibility mode (but fully programmable pipeline), running Win7 64 bit. The tested software is 32 bit.
wglGetPixelFormatAttribivARB is a function of the Windows OpenGL subsystem that retrieves the attributes of a particular pixel format configuration. There can be a lot of different configurations available (several hundred to thousands), and those 1002 calls to it are most likely caused by a loop that enumerates all of them to find the best match.
Other than giving you that information I can't help you much, though.
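To illustrate where that many calls come from, here is a sketch of such an enumeration loop (illustrative only; glslDevil's actual code may differ, and countPixelFormats is a hypothetical helper taking a device context whose GL context is current):

#include <windows.h>

// WGL_ARB_pixel_format entry point, fetched at runtime.
typedef BOOL (WINAPI *PFNWGLGETPIXELFORMATATTRIBIVARBPROC)(
    HDC, int, int, UINT, const int *, int *);

const int WGL_NUMBER_PIXEL_FORMATS_ARB = 0x2000;

int countPixelFormats(HDC hdc)
{
    PFNWGLGETPIXELFORMATATTRIBIVARBPROC wglGetPixelFormatAttribivARB =
        (PFNWGLGETPIXELFORMATATTRIBIVARBPROC)
            wglGetProcAddress("wglGetPixelFormatAttribivARB");

    // First call: ask how many pixel formats the driver offers.
    int attrib = WGL_NUMBER_PIXEL_FORMATS_ARB;
    int numFormats = 0;
    wglGetPixelFormatAttribivARB(hdc, 0, 0, 1, &attrib, &numFormats);

    // At least one query per format; with a thousand or so formats on
    // a typical driver this adds up to roughly the 1002 calls seen in
    // the trace.
    for (int i = 1; i <= numFormats; ++i)
    {
        // query the attributes of format i and score it against the
        // requested configuration
    }
    return numFormats;
}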
I have just recently installed Windows 8, and I tried to compile and build a simple C++ game project in VS 2010, but when I did, it ran at 5 fps. On Windows 7 it runs at a solid 60 fps. Nothing has been changed in the code, but there is just horrible slowdown.
I have updated my video drivers, but there is still horrible lag. I thought the problem was a compatibility issue between Windows 8 and OpenGL, but I can't find anything to confirm this. I was wondering if anyone else has had this problem, and if so, how you solved it.
I would recommend you test your graphics card / drivers first. All sorts of driver issues could arise when you upgrade operating systems. One of the best tests would be to download Cinebench and see how it performs. Cinebench will evaluate your OpenGL performance. If you get poor results, then you know it's a hardware / driver issue and not an issue with your application.
If the Cinebench results are good, then you can move on to the recommendations made by @Robert Rouhani (comments).
http://www.maxon.net/products/cinebench/overview.html
What sort of video card do you have in the Win8 machine?
If it's a laptop, you might be battling against nVidia Optimus (or an equivalent technology). Basically, programs have to tell the OS in advance that they want to use the video card, or they get defaulted to the low-power GPU embedded in the CPU (note: over-simplification).
If this is the case, there are options in the nVidia control panel that let you create a profile telling the OS to run your app with the discrete GPU rather than the embedded one.
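As an alternative to a control-panel profile, nVidia also documents a way to request the discrete GPU from the application itself: exporting the NvOptimusEnablement symbol from the executable (AMD has an analogous AmdPowerXpressRequestHighPerformance export).

// Must be exported from the .exe itself, not from a DLL;
// the Optimus driver looks for this symbol at process start.
extern "C"
{
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
}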
DISCLAIMER:
I see that some suggestions for the exact same question come up; however, that (similar) post was migrated to Super User and seems to have been removed. I would still like to post my question here because I consider it software/programming-related enough not to belong on Super User (the line between a software and a hardware issue is sometimes vague).
I am running a very simple OpenGL program in Code::Blocks in VirtualBox, with Ubuntu 11.10 installed on an SSD. Whenever I build & run a program I get these errors:
OpenGL Warning: XGetVisualInfo returned 0 visuals for 0x232dbe0
OpenGL Warning: Retry with 0x802 returned 0 visuals
Segmentation fault
From what I have gathered myself so far this is VirtualBox related. I need to set
LIBGL_ALWAYS_INDIRECT=1
In other words, enabling indirect rendering via X.org rather than communicating directly with the hardware. This issue is probably not related to the fact that I have an ATI card, as I have a laptop with an ATI card that runs the same program flawlessly.
Still, I don't dare say that my GPU being an ATI plays no role at all. Nor am I sure the drivers are correctly installed (under System info -> Graphics it says graphics driver: Chromium).
Any help on HOW to set LIBGL_ALWAYS_INDIRECT=1 would be greatly appreciated. I simply lack the knowledge of where to put this command or where/how to execute it in the terminal.
Sources:
https://forums.virtualbox.org/viewtopic.php?f=3&t=30964
https://www.virtualbox.org/ticket/6848
EDIT: in the terminal, type (note: no spaces around the '='):
export LIBGL_ALWAYS_INDIRECT=1
To verify that direct rendering is off (it should report "direct rendering: No"):
glxinfo | grep direct
However, the problem persists. I still get the aforementioned OpenGL warnings and the segmentation fault.
I ran into this same problem running the Bullet Physics OpenGL demos on Ubuntu 12.04 inside VirtualBox. Rather than using indirect rendering, I was able to solve the problem by modifying the glut window creation code in my source as described here: https://groups.google.com/forum/?fromgroups=#!topic/comp.graphics.api.opengl/Oecgo2Fc9Zc.
This entailed replacing the original
...
glutCreateWindow(title);
...
with
...
if (!glutGet(GLUT_DISPLAY_MODE_POSSIBLE))
{
    exit(1);
}
glutCreateWindow(title);
...
as described in the link. It's not clear to me why this should correct the segfault issue; apparently glutGet has some side effects beyond retrieving state values. It could be a quirk of freeglut's implementation of glut.
If you look at the /etc/environment file, you can see a couple of variables exposed there; this will give you an idea of how to expose that environment variable across the entire system. You could also try putting it in either ~/.profile or ~/.bash_profile, depending on your needs.
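For example (note the differing syntax between the two files):

# in ~/.profile or ~/.bash_profile (shell syntax, no spaces around '='):
export LIBGL_ALWAYS_INDIRECT=1

# in /etc/environment (plain KEY=value lines, no 'export'):
LIBGL_ALWAYS_INDIRECT=1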
The real question in my mind is: Did you install the guest additions for Ubuntu? You shouldn't need to install any ATI drivers in your guest as VirtualBox won't expose the actual physical graphics hardware to your VM. You can configure your guest to support 3D acceleration in the virtual machine settings (make sure you turn off the VM first) under the Display section. You will probably want to boost the allocated virtual memory - 64MB or 128MB should be plenty depending on your needs.