I have a shader that does some raytracing. It used to take the scene information as uniforms, but that proved far too limited, so we switched to SSBOs (shader storage buffer objects). The code works perfectly on two computers, but a third renders it very slowly. That computer has a Radeon HD 6950; the cards that render it at full speed are a GTX 570 and a Radeon HD 7970. The scene is shown correctly on all three computers, but the Radeon HD 6950 renders it very slowly (about 1 FPS while rotating around the scene). We thought it was an OpenGL version problem, but that doesn't seem to be the case: we updated the drivers and it still doesn't work. Any idea where the problem might be?
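For reference, the C++ side of our SSBO setup is essentially the standard pattern below (a minimal sketch, not the actual project code; the Sphere struct, the sceneData container and binding point 0 are placeholders):

    // Upload the scene into a shader storage buffer and expose it at
    // binding point 0, matching a layout(std430, binding = 0) block
    // in the raytracing fragment shader.
    GLuint ssbo = 0;
    glGenBuffers(1, &ssbo);
    glBindBuffer(GL_SHADER_STORAGE_BUFFER, ssbo);
    glBufferData(GL_SHADER_STORAGE_BUFFER,
                 sceneData.size() * sizeof(Sphere),  // placeholder container/struct
                 sceneData.data(),
                 GL_STATIC_DRAW);
    glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, ssbo);
    glBindBuffer(GL_SHADER_STORAGE_BUFFER, 0);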
There are a few possibilities:
- You could be falling off the fast path on that particular card. Some aspect of your rendering may not be implemented as efficiently on the lower-end card, for example.
- You may be hitting the VRAM limit on the 6950 but not on the other two cards, and OpenGL is essentially thrashing, swapping things out to main memory and back.
- You may have triggered software rendering on that card. There may be some specific OpenGL feature you're using that's only implemented in software for the 6950, but is hardware-accelerated on the other cards.
You don't say which OS you're working with, so I'm not sure what to tell you about debugging the problem. On macOS you can use OpenGL Profiler to see if it's falling back to software, and OpenGL Driver Monitor to see if it's paging out. On iOS you can use Xcode's OpenGL profiling instrument for both of those. I'm not sure what the equivalents are on Windows or Linux, as I don't have experience with them.
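Regardless of OS, one quick sanity check you can add to the program itself is to print which renderer the context actually ended up on; a software fallback usually shows up in these strings. A minimal sketch (run once a GL context is current, with your usual GL headers/loader and <cstdio> included):

    printf("GL_VENDOR:   %s\n", (const char*)glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
    printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));
    // Strings such as "GDI Generic", "llvmpipe" or "Software Rasterizer"
    // indicate that you are not running on the hardware driver at all.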
I have an application that uses OpenGL to draw some content in a window (based on the example from here). Everything looks good when the application runs on my desktop PC with an NVIDIA GeForce GTS 450 graphics card, but it looks terrible on other computers (I've tried two notebooks with integrated Intel graphics, and a virtual machine running in VirtualBox).
The effect looks like this:
I've tried manipulating the gluPerspective parameters, but without results.

What could be the reason for such an effect?
EDIT: Correct files from the computer with the NVIDIA card attached:
It looks like some kind of z-fighting. That is, you have several pieces of geometry with overlapping or very similar Z values, and precision errors in the Z calculations cause parts of your objects to be hidden.

The fact that it works on some machines and not on others may come down to whether the Z-buffer is 32 bits or 16 bits, which depends on the machine.

Note that even if forcing your Z-buffer to 32 bits solves the issue, you should still consider fixing the Z values of your objects; there is (or at least used to be) hardware out there that doesn't support 32-bit Z-buffers.
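In practice, the quickest way to regain depth precision is to push the near plane out as far as the scene allows, since almost all Z-buffer precision is concentrated near zNear. A minimal sketch using the gluPerspective call mentioned in the question (the numbers are only illustrative):

    // Bad: a near plane almost at the eye wastes nearly all depth precision.
    // gluPerspective(60.0, aspect, 0.001, 10000.0);

    // Better: near as large as possible, far as small as possible.
    gluPerspective(60.0, aspect, 0.1, 1000.0);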
I need some unbiased views from experts. I bought BobCAD a couple of months ago. It ran fine while I was evaluating it and also after installation. Now, after some use, it has started crashing with multiple "null pointer" exceptions when closing the simulation mode.

Tech support is telling me that it is the graphics card that behaves (I quote) "unpredictable". They say an integrated graphics card is only good for Word and internet browsing.

However, BobCAD once ran fine, and I can play games and use CAD and other applications on this computer without crashing it, so I have a hard time believing this. BobCAD does not use a lot of resources, contrary to what they claim; there is no lag and no sign that my computer is running at the limit of what it is capable of.

From what I know, you do not program the graphics card directly anymore, and certainly not in a CAM application, so those kinds of graphics card problems should be gone.

From what I can see, BobCAD is a WPF application, presumably written in C++.

Please tell me: are they right? Is my suspicion that they are not very competent wrong?
Help me out with your experiences.
Best Regards
Leo
An expensive dedicated graphics card is usually better than an integrated one, but that doesn't mean integrated ones can't do any real work.

Graphics cards are still programmed directly, even today (if anything, such usage is increasing). But probably not in a WPF application...

Anyway, none of that is an excuse for null pointer exceptions delivered to the user. That is simply a programming error, no matter what your graphics card is capable of. If the program said "the graphics card is too weak", that would be one thing, but crashing is unacceptable.

(And, sadly, incompetent support people are nothing unusual.)
I've recently started exploring the guts of VirtualBox's Guest Additions on my Ubuntu guest, mostly out of curiosity and partly due to "OpenGL Warning: ... not found in mesa table" warnings. I noticed they use the Chromium OpenGL implementation. I have a two-part question.

1. How do I get rid of those warnings? Are they an indication of a larger problem? I'm noticing repaint issues, which is what led me down this path.

2. Am I missing something, or is this a 12-year-old project last touched 6 years ago?! Is it being actively developed somewhere else? Will it support OpenGL 3?

Online references would be appreciated, as I'm having a hard time finding anything other than the links below.
http://sourceforge.net/p/chromium/discussion/stats
http://chromium.sourceforge.net/doc/index.html
The Chromium project has been basically dead since 2008 or so. There is no support for GL 3.x, and none is planned. In fact, implementing the main purpose of Chromium (application-transparent distributed rendering by manipulating the GL command stream) ranges from incredibly hard to outright impossible with the programmable pipeline and modern GL features.

I'm not really familiar with VirtualBox, but I am aware that they only used parts of the Chromium project to implement hardware-accelerated guest GL, simply by forwarding the GL command stream to the host. Such a task is much easier to adapt to modern GL, as no real stream manipulation needs to be done, but I'm not aware of how far they have come down that path. So consider this only half an answer to your question.
Is it possible to do OpenGL development and run programs on a computer without a graphics card? (e.g. my netbook running Ubuntu)
Update: This was many years ago, the link is not available anymore, and there are probably newer, better builds now.

Yes, you can use Mesa.

For your convenience, I've compiled it in both 32- and 64-bit versions at:
http://dl.dropbox.com/u/9496269/mesa.zip
Simply put them where your executable file is located.
Sure. Many software-only implementations of OpenGL exist; check out the Mesa project at http://www.mesa3d.org/ for one of the most popular. Parts of the shading language are not fully supported, and it tends to lag behind the standard a bit in general, but that is the case with all software API emulators. It's still very full-featured and can be used in production code for many common uses.
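If your Mesa build includes the software rasterizer, you can usually force it per process via Mesa's LIBGL_ALWAYS_SOFTWARE environment variable (a Mesa convention, not part of OpenGL itself). A minimal sketch that sets it from inside the program, before any GL context is created:

    #include <cstdlib>

    int main(int argc, char** argv)
    {
        // Ask Mesa for its software rasterizer (llvmpipe/softpipe)
        // instead of a hardware driver. Must happen before the GL
        // context is created.
        setenv("LIBGL_ALWAYS_SOFTWARE", "1", 1);

        // ... create the window / GL context and run the app as usual ...
        return 0;
    }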
You can also use OpenGL on many integrated GPUs, notably AMD chips like the Ryzen 3 3200G, whose integrated GPU is roughly comparable to a GTX 1050, for around £100.
I'm struggling with a tearing problem in my OpenGL application.
I can't seem to find a driver for the GMA 4500HD (in my case running in a ThinkPad X200s) that supports the OpenGL extension WGL_EXT_swap_control.
Currently I have the 8.15.10.2182 driver installed, which I think is the latest.
I have set the "Vertical sync" parameter in the driver control Window, but it seem to do nothing.
Do I have to live with the tearing problem, or is there anything I can do so that the buffer swap occurs on vsync without the WGL_EXT_swap_control extension ?
Edit: I noticed that a demo application using Direct3d (11) do not suffer from tearing on the same type of hardware.
Is there a setting to enable VSync in the driver control panel?
Often you have to enable features there before OpenGL can see them.
Support for WGL_EXT_swap_control has been there since the dawn of time.

If you have a problem, it can only be because you are doing something wrong or because of a driver bug (which would seem strange, considering that, if anything, people on the net were complaining about the opposite). Check whether the control panel is forcing anything in this regard, and whether you are calling the right function.
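For completeness, the usual way to enable vsync through that extension looks roughly like this (a minimal sketch, error handling omitted; the function pointer is only valid while a GL context is current):

    #include <windows.h>
    #include <GL/gl.h>

    typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

    void enableVSync()
    {
        PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
            (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");

        if (wglSwapIntervalEXT)
            wglSwapIntervalEXT(1);   // swap at most once per vertical blank
        // A null pointer means the driver does not expose
        // WGL_EXT_swap_control at all.
    }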