I'm working on a 3D game project with a bunch of people. The project runs fine on all of their machines but mine. On my computer the skybox texture intermittently disappears and the rendering gets badly corrupted.
We've all been working on Windows XP with Visual Studio 2008, and the only significant difference between my machine and my co-workers' is that my computer has an NVIDIA 9400 GT graphics card, which, I guess, is the thing to blame.
Here's a screenshot, skybox-less. Is there any setting in OpenGL or the Nvidia video manager that I can tweak to avoid this?
Sometimes this can depend on the texture size. Could you try with a texture whose dimensions are powers of two?
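If you can't change the image files themselves, here is a rough sketch of rescaling the texture up to the next power of two at load time with gluScaleImage (assuming an RGBA byte image coming from your own loader; the function and variable names are just placeholders, not your actual code):

    // Sketch: rescale an RGBA image to power-of-two dimensions before upload.
    // `pixels`, `width`, `height` are assumed to come from your image loader.
    #include <windows.h>
    #include <GL/glu.h>
    #include <stdlib.h>

    static int nextPowerOfTwo(int v)
    {
        int p = 1;
        while (p < v)
            p <<= 1;
        return p;
    }

    void uploadSkyboxFace(GLenum target, const unsigned char *pixels, int width, int height)
    {
        int potW = nextPowerOfTwo(width);
        int potH = nextPowerOfTwo(height);

        if (potW != width || potH != height)
        {
            unsigned char *scaled = (unsigned char *)malloc(potW * potH * 4);
            gluScaleImage(GL_RGBA, width, height, GL_UNSIGNED_BYTE, pixels,
                          potW, potH, GL_UNSIGNED_BYTE, scaled);
            glTexImage2D(target, 0, GL_RGBA, potW, potH, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, scaled);
            free(scaled);
        }
        else
        {
            glTexImage2D(target, 0, GL_RGBA, width, height, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        }
    }

You can also check whether GL_ARB_texture_non_power_of_two appears in the GL_EXTENSIONS string; if it's missing, non-power-of-two textures simply won't work reliably on that card.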
I am trying to create a 3D simulation; the code is here and the vertex shader is here. As you can imagine, the pixel shader is just a straightforward function that passes its input straight to the output. Please forgive the lack of comments and sanity; I'm new to D3D.
Well, basically the problem is that it runs fine on my PC, but on my laptop all I get is "Access is Denied", which is a problem since I need to make the sim portable. My laptop has DirectX 12 installed and has DirectDraw and Direct3D acceleration enabled. The chip type is an AMD Radeon Graphics Processor and its name is AMD Radeon Vega 3 Graphics. The laptop is brand new, so I don't expect that I need to update drivers (please correct me if I am wrong).
In addition, in its early stages the program managed to render a cube, but the problems started once I added a very rudimentary object importer.
Yesterday my OpenGL classes started, and my classmates and I downloaded a sample project made by our teacher to learn from. On their PCs it works perfectly, but all I'm seeing is white: the objects are white and the ground is white, while on my classmates' PCs the objects are rendered correctly.
Does anyone know why it won't render the textures on my PC?
I've tried a few things, such as turning my Intel graphics card off and running it on my NVIDIA card, and turning my NVIDIA card off to run it on my Intel, but neither worked.
My NVIDIA card is a GeForce 710M, which supports OpenGL 4.5, and I have OpenGL version 4.5 installed.
I'm not getting any errors in the code.
I am able to see shadows: I've just tried a different project and only saw shadows; the rest was all white, as it was with the first project.
I've re-installed the required libraries, rebooted my PC, and it worked :D Thank you for all your help.
I'm creating a simple computer games framework using SDL, and am still deciding between SDL's software renderer (which is also much easier to use than OpenGL) and the supposedly faster OpenGL, despite the fact that in Visual Studio 2008 I'm having trouble linking the OpenGL libraries. Any suggestions for which graphical interface to choose?
The answer is: it depends.
You should ask yourself:
How many polygons will you render per frame?
What rendering rate (frames per second) do you want to achieve?
Do you expect the number of polygons to render to increase in the future?
About Visual Studio and linking problems, please find a sample project here that uses OpenGL: OpenGL Example.
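If the linker errors are about unresolved gl*/glu* symbols, one quick MSVC-specific way is to pull the libraries in from code; a minimal sketch, assuming the stock opengl32/glu32 import libraries that ship with the Windows SDK:

    // Sketch: link the standard Windows OpenGL libraries from code
    // instead of the project settings (MSVC-specific #pragma).
    #include <windows.h>
    #include <GL/gl.h>
    #include <GL/glu.h>

    #pragma comment(lib, "opengl32.lib")
    #pragma comment(lib, "glu32.lib")

The equivalent through the IDE is adding opengl32.lib and glu32.lib under Project Properties -> Linker -> Input -> Additional Dependencies.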
I am using the open source haptics and 3D graphics library Chai3D running on Windows 7. I have rewritten the library to do stereoscopic 3D with NVIDIA NVision. I am using OpenGL with GLUT, and using glutInitDisplayMode(GLUT_RGB | GLUT_DEPTH | GLUT_DOUBLE | GLUT_STEREO) to initialize the display mode. It works great on Quadro cards, but on GTX 560M and GTX 580 cards it says the pixel format is unsupported.

I know the monitors are capable of displaying the 3D, and I know the cards are capable of rendering it. I have tried adjusting the resolution of the screen and everything else I can think of, but nothing seems to work. I have read in various places that stereoscopic 3D with OpenGL only works in fullscreen mode, so the only possible reason for this error I can think of is that I am starting in windowed mode.

How would I force the application to start in fullscreen mode with 3D enabled? Can anyone provide a code example of quad-buffer stereoscopic 3D using OpenGL that works on the later GTX model cards?
What you experience has no technical reason; it is simply NVIDIA product policy. Quad-buffer stereo is considered a professional feature, so NVIDIA offers it only on their Quadro cards, even though the GeForce GPUs could do it as well. This is not a recent development; it was already like this back in 1999. For example, I had (well, still have) a GeForce2 Ultra back then. Technically it was the very same chip as the Quadro; the only difference was the PCI ID reported back to the system. One could trick the driver into thinking you had a Quadro by tinkering with the PCI IDs (either by patching the driver or by soldering an additional resistor onto the graphics card's PCB).
The stereoscopic 3D hack for Direct3D was already supported by my GeForce2 back then. At the time the driver duplicated the rendering commands, applying a translation to the modelview matrix and a skew to the projection matrix. These days it's implemented with a shader and multi-rendertarget trick.
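Roughly, that driver trick amounts to something like the following per eye, done in application code instead (just a sketch; eyeSep, focalDist and the other parameters are made-up placeholders, and the signs follow the usual off-axis frustum recipe):

    // Sketch: translated modelview + skewed (asymmetric) projection per eye.
    // eye is -1 for the left eye, +1 for the right eye.
    #include <windows.h>
    #include <GL/gl.h>
    #include <math.h>

    void setupEye(int eye, double eyeSep, double focalDist,
                  double fovY, double aspect, double zNear, double zFar)
    {
        double top   = zNear * tan(fovY * 3.14159265358979 / 360.0);
        double right = top * aspect;
        double shift = 0.5 * eyeSep * zNear / focalDist;   /* frustum skew */

        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glFrustum(-right - eye * shift, right - eye * shift,
                  -top, top, zNear, zFar);

        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glTranslated(-eye * 0.5 * eyeSep, 0.0, 0.0);       /* eye offset */
    }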
The NVision3D API does allow you to blit images for specific eyes (this is meant for movie players and image viewers). But it also allows you to emulate quad-buffer stereo: instead of the GL_BACK_LEFT and GL_BACK_RIGHT buffers, create two framebuffer objects, which you bind and use as if they were the quad-buffer stereo buffers. Then, after rendering, you blit the resulting images (as textures) through the NVision3D API.
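A minimal sketch of the FBO half of that emulation, assuming an extension loader like GLEW is already initialized; the final hand-off of the two textures to the NVision3D blit is left as a placeholder, since that part depends on NVIDIA's API:

    // Sketch: emulate GL_BACK_LEFT / GL_BACK_RIGHT with two FBO-backed textures.
    // drawScene() and the NVision3D blit are placeholders.
    #include <GL/glew.h>

    void drawScene(int eye);   /* placeholder: your per-eye scene rendering */

    static GLuint eyeTex[2], eyeFBO[2], eyeDepth[2];

    void createEyeTargets(int width, int height)
    {
        glGenTextures(2, eyeTex);
        glGenFramebuffers(2, eyeFBO);
        glGenRenderbuffers(2, eyeDepth);

        for (int i = 0; i < 2; ++i)
        {
            glBindTexture(GL_TEXTURE_2D, eyeTex[i]);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, NULL);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

            glBindRenderbuffer(GL_RENDERBUFFER, eyeDepth[i]);
            glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);

            glBindFramebuffer(GL_FRAMEBUFFER, eyeFBO[i]);
            glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                   GL_TEXTURE_2D, eyeTex[i], 0);
            glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                                      GL_RENDERBUFFER, eyeDepth[i]);
        }
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
    }

    void renderStereoFrame(void)
    {
        for (int i = 0; i < 2; ++i)            /* 0 = left eye, 1 = right eye */
        {
            glBindFramebuffer(GL_FRAMEBUFFER, eyeFBO[i]);
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
            drawScene(i);
        }
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        /* hand eyeTex[0] / eyeTex[1] to the NVision3D blit here */
    }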
With as little as 50 lines of management code you can build a program that works seamlessly on both NVision3D and quad-buffer stereo. What NVIDIA does is pointless; they should just stop it and properly support quad-buffer stereo pixel formats on consumer GPUs as well.
Simple: you can't. Not the way you're trying to do it.
There is a difference between having a pre-existing program do things with stereoscopic glasses and doing what you're trying to do. What you are attempting to do is use the built-in stereo support of OpenGL: the ability to create a stereoscopic framebuffer, where you can render to the left and right framebuffers arbitrarily.
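For reference, that built-in path looks roughly like the sketch below; glutCreateWindow is where it fails when no stereo pixel format is offered, which is exactly the part NVIDIA reserves for Quadro drivers:

    // Sketch: OpenGL's built-in quad-buffer stereo through GLUT.
    // Works only if the driver exposes a stereo-capable pixel format.
    #include <GL/glut.h>

    void display(void)
    {
        glDrawBuffer(GL_BACK_LEFT);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        /* render the left-eye view here */

        glDrawBuffer(GL_BACK_RIGHT);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        /* render the right-eye view here */

        glutSwapBuffers();
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_RGB | GLUT_DEPTH | GLUT_DOUBLE | GLUT_STEREO);
        glutCreateWindow("quad-buffer stereo");  /* fails here if unsupported */
        glutFullScreen();                        /* fullscreen, which some drivers require */
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }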
NVIDIA does not allow that with their non-Quadro cards. There are hacks in the driver that will force stereo on applications via nVision and the control panel, but NVIDIA's GeForce drivers do not allow you to create stereoscopic framebuffers yourself.
And before you ask, no, I have no idea why NVIDIA doesn't let you control stereo.
Since I was looking into this issue for my own game, I found this link where somebody hacked the USB protocol: http://users.csc.calpoly.edu/~zwood/teaching/csc572/final11/rsomers/
I didn't follow it through, but at the time I was researching this it didn't look too hard to make use of that information. So you might have to implement your own code in order to support it in your app, which should be possible. Unfortunately, a generic solution would be harder, because then you would have to hack the driver or somehow hook into the OpenGL library and intercept the calls.
I just picked up a new Lenovo ThinkPad that comes with Intel HD Graphics 3000. I'm finding that my old freeglut apps, which use GLUT_MULTISAMPLE, run at 2 or 3 fps instead of the expected 60 fps. Even the freeglut 'shapes' example runs this slowly.
If I disable GLUT_MULTISAMPLE from shapes.c (or my app) things run quickly again.
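For reference, the setup in question is essentially the stock freeglut pattern, roughly like this (not my exact code; the sample count of 4 is arbitrary, and newer freeglut builds let you pick it with glutSetOption):

    // Sketch: how the multisampled display mode gets requested in freeglut.
    #include <GL/freeglut.h>

    void display(void) { /* ... draw and glutSwapBuffers() ... */ }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutSetOption(GLUT_MULTISAMPLE, 4);   /* freeglut extension: sample count */
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_MULTISAMPLE);
        glutCreateWindow("shapes");
        glutDisplayFunc(display);
        /* multisampling is on by default when the pixel format has sample buffers */
        glutMainLoop();
        return 0;
    }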
I tried multisampling with GLFW (using GLFW_FSAA, or whatever that hint is called), and I think it's working fine. That was with a different app (glgears). GLFW is triggering Norton Internet Security, which thinks it's malware and keeps removing the .exes... but that's another problem; my interest is with freeglut.
I wonder if the algorithm that freeglut uses to choose a pixel format is tripping up on this card, whereas glfw is choosing the right one.
Has anyone else come across something like this? Any ideas?
That GLFW triggers Norton is a bug in Norton's virus definitions. If it's still the case with the latest definitions, send them your GLFW DLL/app so they can fix it. The same happens with Avira, and they are working on it (they have already confirmed that it's a false positive).
As for the HD 3000, that's quite a weak GPU. What resolution is your app running at, and how many samples are you using? Maybe the amount of framebuffer memory gets too high for the little guy?
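Back-of-the-envelope, with assumed numbers, just to show how quickly it adds up:

    /* Rough estimate: 1920x1080, RGBA8 color plus 24/8 depth-stencil,
       8x multisample -- these numbers are assumptions, not your actual setup. */
    #include <stdio.h>

    int main(void)
    {
        long long w = 1920, h = 1080, samples = 8;
        long long bytesPerSample = 4 /* color */ + 4 /* depth + stencil */;
        long long bytes = w * h * samples * bytesPerSample;
        printf("~%lld MB just for the multisampled buffers\n", bytes / (1024 * 1024));
        return 0;
    }

That comes out to roughly 126 MB, and on an HD 3000 it all lives in shared system memory.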