OpenGL: In VMware (Debian x64), glxgears works, but other OpenGL programs don't

After installing Debian in VMware and installing all the libraries required to run OpenGL applications with freeglut, I used glxgears to make sure everything works fine.
# glxgears
3426 frames in 5.0 seconds = 685.171 FPS
3562 frames in 5.0 seconds = 712.339 FPS
...
XI0: fatal IO error 11 (Resource temporarily unavailable) on X server ":0.0"
after 33172 requests (33170 known processed) with 0 events remaining.
glxgears seems to run fine: it displays the gears rotating in a window, and the error above only appears after clicking "Close" on the glxgears window frame.
But when I attempt to execute a simple OpenGL program using freeglut3, I get the following result:
# ./program
X Error of failed request: BadRequest (invalid request code or no such operation)
Major opcode of failed request: 155 (GLX)
Minor opcode of failed request: 34 ()
Serial number of failed request: 36
Current serial number in output stream: 35
The same program, using all the same files and libraries, works on a non-virtual machine, which is not available to me at home.
How can I resolve this issue? Is this a common problem with running OpenGL programs on virtual machines?

This is a cryptic way of your GLX server telling you that it has never heard of glXCreateContextAttribsARB (GLX opcode 34). In other words, your system does not support GLX_ARB_create_context. The best way to find out why would be to run something like glxinfo -v and add the output to your question.
It is possible to create a working context without this extension on your system, as glxgears clearly demonstrates. I would imagine freeglut3 is smart enough not to use the extension if you do not request anything fancy from it (e.g. do not ask for a core profile context or a specific major/minor version). If it is not, then you will have to find a more sophisticated GLX implementation or use a different framework.
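To see whether the server actually advertises the extension, you can grep the output of glxinfo. A minimal sketch; the `sample` variable below is a hypothetical excerpt standing in for real glxinfo output, which on an actual system you would pipe in directly:

```shell
# On a real system you would run:
#   glxinfo | grep GLX_ARB_create_context
# Here, a hypothetical excerpt of glxinfo output stands in for it:
sample='GLX extensions:
    GLX_ARB_get_proc_address, GLX_EXT_visual_info, GLX_SGI_swap_control'

if printf '%s\n' "$sample" | grep -q 'GLX_ARB_create_context'; then
    echo "GLX_ARB_create_context: present"
else
    echo "GLX_ARB_create_context: missing"
fi
```

If the extension shows up as missing, any program that unconditionally calls glXCreateContextAttribsARB will fail with exactly the BadRequest error quoted above.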

Related

freeglut (something): failed to open display ''

I compiled some C++ code under Linux (Ubuntu) and everything is fine as long as a monitor is connected to my PC.
My code shows some graphics and then it saves their screenshots. The runtime graphic is not important to me but the screenshots.
But if I run the code remotely, I run into the following runtime error:
freeglut (something): failed to open display ''
If I forward X (ssh -v -X), everything is fine. But what if I don't do that?!
How to get around it? I don't care if anything is displayed or not.
Is it possible to define a temporary virtual screen on the remote computer or get around this problem in any other way? I just need the screenshot files.
I suggest you try Xvfb as your X server on the remote machine.
Quote from this answer: Does using Xvfb to run OpenGL effects version?
Xvfb is an X server whose whole purpose is to provide X11 services without dedicated graphics hardware.
This allows you to have both a GL context and a window without using a GPU.
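A typical invocation might look like the following, assuming the xvfb package is installed on the remote machine; the display number :99 is arbitrary, and ./program stands in for your binary:

```shell
# Start a virtual framebuffer X server on display :99
Xvfb :99 -screen 0 1024x768x24 &
export DISPLAY=:99
./program        # runs headless; the screenshot files are still written

# Or, if the xvfb-run wrapper script is available, in one step:
xvfb-run ./program
```

Note that Xvfb does software rendering only, so this works for grabbing screenshots but will be slow for anything performance-sensitive.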

X Error of failed request: GLXBadFBConfig (opengl 4.3 - ubuntu)

I'm reading the last version of the OpenGL Programming guide and it is updated for OpenGL 4.3.
The first code they go through is a really simple program that draws 2 triangles, and of course it is the code I use to test OpenGL on my laptop (running Kubuntu).
The code runs, but this is what happens:
X Error of failed request: GLXBadFBConfig
Major opcode of failed request: 153 (GLX)
Minor opcode of failed request: 34 ()
Serial number of failed request: 34
Current serial number in output stream: 33
RUN FINISHED; exit value 1; real time: 200ms; user: 0ms; system: 0ms
I saw that this can happen if you don't have a graphics card that can handle the version of OpenGL you are using.
But my laptop has an NVIDIA 555M, so according to the NVIDIA website I'm good on that side. Since I run Ubuntu, though, and NVIDIA's Linux drivers can be troublesome, I'm not sure that my nvidia-current setup with Bumblebee works for OpenGL 4.3.
How can I check the version supported by my setup ?
Is there anyway for me to make it work or do I need to install Windows :/ ?
glxinfo is your friend. It's a command line tool which will report the version numbers and extensions supported for server side GLX, client side GLX, and OpenGL itself.
Do you have the NVIDIA binary (proprietary) driver installed? You'll need it if you want to take advantage of OpenGL versions 3 or 4. Like every software product there are occasional glitches, but over the years I think most 3D programmers / users would agree that the NVIDIA drivers for Linux have been very solid, much better than the alternatives.
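For a quick check, the following commands report what the active driver supports (glxinfo comes from the mesa-utils package on Ubuntu; optirun is Bumblebee's wrapper and only applies if you use Bumblebee):

```shell
# Version and renderer reported by whichever GPU handles GLX by default:
glxinfo | grep -E "OpenGL (version|renderer)"

# With Bumblebee, the same query routed through the NVIDIA GPU:
optirun glxinfo | grep -E "OpenGL (version|renderer)"
```

If the first command reports a Mesa/Intel renderer and a version below 4.3, the program must be launched through optirun (or an equivalent mechanism) to reach the NVIDIA card.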

OpenGL glslDevil fails debugging

I started using glslDevil to debug my OpenGL engine shaders. When I start the program from inside glslDevil all I get in GLTrace windows is the calls to:
wglGetPixelFormatAttribivARB()
which is printed some 1002 times, after which the debugged app freezes, and that is it. No debug info or anything else. Maybe glslDevil doesn't support newer OpenGL versions?
I am using OpenGL 4.2 compatibility mode (but fully programmable pipeline), running Win7 64 bit. The tested software is 32 bit.
wglGetPixelFormatAttribivARB is a function of the Windows OpenGL subsystem (WGL) that retrieves a particular pixel format configuration. There can be a lot of different configurations available (several hundred to a few thousand), and those 1002 calls are most likely a loop enumerating all of them to find the best match.
Other than giving you that information I can't help you much, though.

OpenGL Tutorial Error

I managed to build the tutorials from here but when executing them an error occurs:
X Error of failed request: BadRequest (invalid request code or no such operation)
Major opcode of failed request: 154 (GLX)
Minor opcode of failed request: 34 ()
Serial number of failed request: 34
Current serial number in output stream: 33
Google told me that this error is related to the graphics driver somehow, so maybe information about my system is useful; I am using Ubuntu 12.04 64 bit on a Samsung 700Z7C notebook.
You have:
OpenGL version string: 2.1 Mesa 8.0.4
Third page, "What You Need":
...but to execute the code, you must have a programming environment that allows OpenGL. Specifically, you will need hardware capable of running OpenGL version 3.3. ...
Samsung 700Z7C notebook:
Graphics: External or Integrated: External (Optimus)
Optimus:
When no software mechanism exists for switching between graphics adapters, the system cannot use the NVIDIA GPU at all, even if an installed graphics driver would support it.
Make sure you're routing your GLX requests to the NVidia chip (with the proprietary drivers) and not the Intel one.
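On an Optimus laptop with Bumblebee and the proprietary driver installed, routing a program to the NVIDIA chip is a matter of prefixing it with optirun; the tutorial binary name below is a placeholder:

```shell
# Check which renderer plain GLX requests reach; if this prints an
# Intel/Mesa renderer, the NVIDIA GPU is not being used:
glxinfo | grep "OpenGL renderer"

# Run the built tutorial on the NVIDIA GPU instead:
optirun ./tutorial
```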
There are three things you should check: does your graphics card support OpenGL 3.3 (I believe that's the version used in the Arcsynthesis book), are your drivers up to date, and is your code actually OK? We can probably rule out the third, but it would still help if you told us which example you were trying to run.

OpenGL GLUT on VirtualBox Ubuntu 11.10 segmentation fault

DISCLAIMER:
I see that some suggestions for the exact same question come up; however, that (similar) post was migrated to Super User and seems to have been removed. I would still like to post my question here because I consider it software/programming-related enough not to belong on Super User (the line between a software and a hardware issue is sometimes vague).
I am running a very simple OpenGL program in Code::Blocks in VirtualBox with Ubuntu 11.10 installed on a SSD. Whenever I build&run a program I get these errors:
OpenGL Warning: XGetVisualInfo returned 0 visuals for 0x232dbe0
OpenGL Warning: Retry with 0x802 returned 0 visuals
Segmentation fault
From what I have gathered myself so far this is VirtualBox related. I need to set
LIBGL_ALWAYS_INDIRECT=1
In other words, enabling indirect rendering via X.org rather than communicating directly with the hardware. This issue is probably not related to the fact that I have an ATI card, as I have a laptop with an ATI card that runs the same program flawlessly.
Still, I don't dare to say that the fact that my GPU is an ATI doesn't play any role at all. Nor am I sure if the drivers are correctly installed (it says under System info -> Graphics -> graphics driver: Chromium.)
Any help on HOW to set LIBGL_ALWAYS_INDIRECT=1 would be greatly appreciated. I simply lack the knowledge of where to put this command or where/how to execute it in the terminal.
Sources:
https://forums.virtualbox.org/viewtopic.php?f=3&t=30964
https://www.virtualbox.org/ticket/6848
EDIT: in the terminal, type:
export LIBGL_ALWAYS_INDIRECT=1
To verify that direct rendering is off:
glxinfo | grep direct
However, the problem persists. I still get mentioned OpenGL warnings and the segmentation fault.
I ran into this same problem running the Bullet Physics OpenGL demos on Ubuntu 12.04 inside VirtualBox. Rather than using indirect rendering, I was able to solve the problem by modifying the glut window creation code in my source as described here: https://groups.google.com/forum/?fromgroups=#!topic/comp.graphics.api.opengl/Oecgo2Fc9Zc.
This entailed replacing the original
...
glutCreateWindow(title);
...
with
...
if (!glutGet(GLUT_DISPLAY_MODE_POSSIBLE))
{
    exit(1);
}
glutCreateWindow(title);
...
as described in the link. It's not clear to me why this should correct the segfault issue; apparently glutGet has some side effects beyond retrieving state values. It could be a quirk of freeglut's implementation of glut.
If you look at the /etc/environment file, you can see a couple of variables exposed there - this will give you an idea of how to expose that environment variable across the entire system. You could also try putting it in either ~/.profile or ~/.bash_profile, depending on your needs.
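Appending the export line to a profile file makes the setting survive new sessions. A sketch of the mechanism; a temporary file stands in for ~/.profile here so the example has no side effects:

```shell
# A temp file plays the role of ~/.profile in this illustration:
profile="$(mktemp)"
echo 'export LIBGL_ALWAYS_INDIRECT=1' >> "$profile"

# Login shells source the profile file like this:
. "$profile"
echo "$LIBGL_ALWAYS_INDIRECT"   # prints 1

rm -f "$profile"
```

In practice you would append the same echo line to ~/.profile (note: no spaces around the `=` in shell assignments) and log out and back in.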
The real question in my mind is: did you install the Guest Additions for Ubuntu? You shouldn't need to install any ATI drivers in your guest, as VirtualBox won't expose the actual physical graphics hardware to your VM. You can configure your guest to support 3D acceleration in the virtual machine settings (make sure you turn off the VM first) under the Display section. You will probably want to boost the allocated video memory - 64MB or 128MB should be plenty, depending on your needs.