Lighting issue in OpenGL - C++

I'm having trouble with an OpenGL application.
The weird thing is that a friend and I are developing a 3D scene with OpenGL under Linux, with the code in a shared repository, and if we both check out the same latest revision, that is, the SAME code, this happens: on his computer, after he compiles, he can see the full lighting model, while on mine only the ambient light is active, not the diffuse or specular ones.
Could it be a driver problem (since he uses an ATi card and I use an nVIDIA one)?
Or the static libraries?
I repeat, it is the same code, compiled on different machines; that's the strange thing, it should look the same.
Thanks for any help or tips.

This can very easily be a driver problem, or a case of one card supporting extensions that the other does not.
Try his binaries on your machine. If they still fail, either your drivers are broken or you're using a call that your card doesn't support. On the other hand, if the scene looks right when you run your code compiled on his machine, then your static libraries have a problem.
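One quick way to narrow it down is to have both machines print what their GL implementation reports and diff the output; if the renderer strings or extension lists differ in a relevant way, it's the driver/hardware rather than the code. A minimal sketch (the glGetString calls are standard OpenGL; the GLUT boilerplate is just an assumed way to get a context, and GL_ARB_vertex_program is only an example extension name):

    // Assumed harness: any valid GL context will do; GLUT is used here only for brevity.
    #include <GL/glut.h>
    #include <cstdio>
    #include <cstring>

    int main(int argc, char** argv)
    {
        glutInit(&argc, argv);
        glutCreateWindow("gl-info");  // a context must exist before glGetString works

        // Print what each machine actually reports, then diff the two outputs.
        std::printf("Vendor:   %s\n", reinterpret_cast<const char*>(glGetString(GL_VENDOR)));
        std::printf("Renderer: %s\n", reinterpret_cast<const char*>(glGetString(GL_RENDERER)));
        std::printf("Version:  %s\n", reinterpret_cast<const char*>(glGetString(GL_VERSION)));

        // Check for any extension the lighting path might depend on.
        const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        std::printf("GL_ARB_vertex_program: %s\n",
                    std::strstr(ext, "GL_ARB_vertex_program") ? "yes" : "no");
        return 0;
    }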

Related

What's causing this unpredictable OpenGL bug?

I have an OpenGL test application that is producing incredibly unusual results. When I start up the application it may or may not feature a severe graphical bug.
It might produce an image like this:
http://i.imgur.com/JwPoDrh.jpg
Or like this:
http://i.imgur.com/QEYwhBY.jpg
Or just the correct image, like this:
http://i.imgur.com/zUJbwCM.jpg
The scene consists of one spinning colored cube (made of 12 triangles) with a simple shader on it that colors the pixels based on the absolute value of their model space coordinates. The junk faces appear to spin with the cube as though they were attached to it and often junk triangles or quads flash on the screen briefly as though they were rendered in 2D.
The thing I find most unusual about this is that the behavior is highly inconsistent: starting the exact same application repeatedly, without me changing anything else on the system, will produce different results, sometimes bugged, sometimes not, and the arrangement of the junk faces produced isn't consistent either.
I can't really post source code for the application as it is very lengthy and the actual OpenGL calls are spread out across many wrapper classes and such.
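For context, a shader pair doing what the question describes (coloring each fragment by the absolute value of its model-space position) might look roughly like the following. This is a hypothetical reconstruction, not the asker's code; the names modelPos and mvp are assumptions.

    // Hypothetical GLSL 330 shader pair, stored as C++ raw string literals.
    const char* vertexSrc = R"GLSL(
        #version 330 core
        layout(location = 0) in vec3 position;   // cube vertex in model space
        uniform mat4 mvp;                        // model-view-projection matrix
        out vec3 modelPos;                       // passed through for coloring
        void main()
        {
            modelPos = position;
            gl_Position = mvp * vec4(position, 1.0);
        }
    )GLSL";

    const char* fragmentSrc = R"GLSL(
        #version 330 core
        in vec3 modelPos;
        out vec4 color;
        void main()
        {
            // Color by the absolute value of the model-space coordinates.
            color = vec4(abs(modelPos), 1.0);
        }
    )GLSL";

If the bug came from shader code like this, it would be deterministic; the fact that identical runs differ points toward the hardware, as the answer below argues.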
This is occurring under the following conditions:
Windows 10 64 bit OS (although I have observed very similar behavior under Windows 8.1 64 bit).
AMD FX-9590 CPU (Clocked at 4.7GHz on an ASUS Sabertooth 990FX).
AMD 7970HD GPU (It is a couple years old and occasionally areas of the screen in 3D applications become scrambled, but nothing on the scale of what I'm experiencing here).
Using SDL (https://www.libsdl.org/) for window and context creation.
Using GLEW (http://glew.sourceforge.net/) for OpenGL.
Using OpenGL versions 1.0, 3.3 and 4.3 (I'm assuming SDL is indeed using the versions I instructed it to).
AMD Catalyst driver version 15.7.1 (Driver Packaging Version listed as 15.20.1062.1004-150803a1-187674C, although again I have seen very similar behavior on much older drivers).
Catalyst Control Center lists my OpenGL version as 6.14.10.13399.
This looks like a broken graphics card to me. Most likely it is some problem with the memory (either the memory itself or a soldering problem). Artifacts like the ones you see can happen if, for some reason, the address for a memory operation does not fully settle (or is never set) before the read starts; that can be caused by a bad connection between the GPU and the memory (failed solder joints) or by the memory itself failing.
Solution: buy a new graphics card. You may try resoldering the existing one using a reflow process; there are some tutorials on how to do this DIY, but a proper reflow oven gives better results.

OpenGL doesn't render textures

Yesterday my OpenGL classes started, and my classmates and I downloaded a sample project made by my teacher to learn from. On their PCs it works perfectly, but all I'm seeing is white: the objects are white and the ground is white, while on my classmates' PCs everything is rendered correctly.
Does anyone know why it won't render the textures on my PC?
I've tried a few things, such as turning my Intel graphics off and running it on my Nvidia card, and turning my Nvidia card off to run it on the Intel, but neither worked.
My Nvidia card is a GeForce 710M, which supports OpenGL 4.5, and I have OpenGL version 4.5 installed.
I'm not getting any errors in the code.
I am able to see shadows: I've just tried a different project and only saw shadows; the rest was all white, as it was with the first project.
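Since "no errors in the code" doesn't necessarily mean GL errors are being checked, one generic thing to verify (a sketch, not tied to the teacher's project) is that glGetError is actually polled around the texture setup and that each texture is mipmap-complete; an incomplete texture is a classic cause of all-white objects:

    #include <GL/glew.h>
    #include <cstdio>

    // Call after creating and uploading a texture; reports any pending GL errors.
    void checkTextureSetup(GLuint tex)
    {
        glBindTexture(GL_TEXTURE_2D, tex);

        // Without mipmaps, the default GL_NEAREST_MIPMAP_LINEAR minification
        // filter leaves the texture incomplete, which often shows up as
        // untextured (white) geometry.
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        for (GLenum err; (err = glGetError()) != GL_NO_ERROR; )
            std::printf("GL error: 0x%04X\n", err);
    }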
I've re-installed the required libraries, rebooted my PC, and it worked :D Thank you for all your help.

OpenGL GLSL shader versions

Recently I have had some problems with GLSL shader versions on different computers. I know every GPU can have different support for shaders, but I don't know how to write one shader that will work on all GPUs. If I write shaders on my PC (GPU: AMD HD7770) I don't even have to specify the version, but some older PCs, or PCs with an nVidia GPU, are stricter about the version, so I have to specify a version that the GPU supports.
Now here comes the real problem. If I specify e.g. version 330 on my PC, it works as it should, but on other PCs, which should also support version 330, it does not seem to work, so I have to rewrite it to make it work there. And if I then switch back to my PC, which has the newer GPU, it doesn't work either.
Does anyone know how I have to write the shader so it can run on all GPUs?
Writing portable OpenGL code isn't as straightforward as you might like.
nVidia drivers are permissive. You can get away with a lot of things on nVidia drivers that you can't get away with on other systems.
It's easy to accidentally use extra features. For example, I wrote a program targeting the 3.2 core profile, but used GL_INT_2_10_10_10_REV as a vertex format. The GL_INT_2_10_10_10_REV symbol is defined in 3.2, but it's not allowed as a vertex format until 3.3, and you won't get any error messages for using it by accident.
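For illustration, the kind of call that slips through looks like this (a sketch; the VAO/VBO setup is omitted and the attribute index is an assumption):

    #include <GL/glew.h>
    #include <cstddef>

    // Sketch: configure a packed 2_10_10_10 vertex attribute.  The symbol
    // GL_INT_2_10_10_10_REV exists in 3.2-era headers, but using it as a
    // vertex format is only legal from OpenGL 3.3 on; a permissive driver
    // may accept it in a 3.2 core context without reporting any error.
    void setPackedNormalAttrib(GLuint attribIndex, GLsizei stride, std::size_t offset)
    {
        glVertexAttribPointer(attribIndex,
                              4,                      // packed formats require size 4
                              GL_INT_2_10_10_10_REV,  // requires OpenGL 3.3
                              GL_TRUE,                // normalize to [-1, 1]
                              stride,
                              reinterpret_cast<const void*>(offset));
        glEnableVertexAttribArray(attribIndex);
    }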
Lots of people run old drivers. According to the Steam survey, in 2013, 38% of customers with OpenGL 3.x drivers didn't have 3.3 support, even though hardware which supports 3.0 should support 3.3.
You will always have to test. This is the unfortunate reality.
My recommendations are:
Always target the core profile.
Always specify the shader language version (see the sketch after this list).
Check the driver version and abort if it is too old.
If you can, use OpenGL headers/bindings that only expose symbols in the version you are targeting.
Get a copy of the spec for the target version, and use that as a reference instead of the OpenGL man pages.
Write your code so that it can also run on OpenGL ES, if that's feasible.
Test on different systems. One PC is probably not going to cut it. If you can dig up a second PC with a graphics card from a different vendor (don't forget Intel's integrated graphics), that would be better. You can probably get an OpenGL 3.x desktop for a couple hundred dollars, or, if you want to save money, ask to use a friend's computer for some quick testing. You could also buy a second video card (think under $40 for a low-end card with OpenGL 4.x support); just be careful when swapping them out.
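A minimal sketch of the first three recommendations, assuming SDL2 and GLEW for context creation and function loading (both appear elsewhere in this thread; the window setup here is only illustrative):

    #include <SDL.h>
    #include <GL/glew.h>
    #include <cstdio>

    int main(int, char**)
    {
        SDL_Init(SDL_INIT_VIDEO);

        // Request exactly the version you target, core profile only.
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);

        SDL_Window* win = SDL_CreateWindow("test", SDL_WINDOWPOS_UNDEFINED,
                                           SDL_WINDOWPOS_UNDEFINED, 640, 480,
                                           SDL_WINDOW_OPENGL);
        SDL_GLContext ctx = SDL_GL_CreateContext(win);
        glewExperimental = GL_TRUE;
        glewInit();

        // Check the version you actually got and abort early if it is too old.
        GLint major = 0, minor = 0;
        glGetIntegerv(GL_MAJOR_VERSION, &major);
        glGetIntegerv(GL_MINOR_VERSION, &minor);
        if (major < 3 || (major == 3 && minor < 3)) {
            std::fprintf(stderr, "OpenGL 3.3 required, got %d.%d\n", major, minor);
            return 1;
        }

        // Every shader states its language version explicitly.
        const char* vsrc =
            "#version 330 core\n"
            "layout(location = 0) in vec3 pos;\n"
            "void main() { gl_Position = vec4(pos, 1.0); }\n";
        GLuint vs = glCreateShader(GL_VERTEX_SHADER);
        glShaderSource(vs, 1, &vsrc, nullptr);
        glCompileShader(vs);

        // ... rest of the program ...

        SDL_GL_DeleteContext(ctx);
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }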
The main reason that commercial games run on a variety of systems is that they have a QA budget. If you can afford a QA team, do it! If you don't have a QA team, then you're going to have to do both QA and development -- two jobs is more work, but that's the price you pay for quashing bugs.

Can't figure out why this OpenGL program is not rendering

I compiled Joe Groff's "An intro to modern OpenGL: Hello World: The Slideshow" example.
I have compiled it using Mingw-w64, with freeglut, with GLUT 3.7, and with a version that creates its own context.
However, when I run the program, the image doesn't fade back and forth like it's supposed to, and I can't figure out why (I've spent a whole day on it).
Also, I have examined most of the inputs and outputs, except for the shaders, and can't find anything wrong. Does anyone have any ideas?
Most likely, your OpenGL version doesn't support shaders. Are you by any chance running in a virtual machine or via remote desktop? These tend to only support OpenGL 1.1 even if the graphics card and drivers are much more recent, and OpenGL 1.1 does not support shaders. It's also possible, if you're using an older laptop with an integrated Intel GPU, that shaders are not (properly) supported.
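A quick way to confirm this is to print the reported version and check for OpenGL 2.0 (or the ARB shader extensions) before trying to compile anything. A sketch, assuming GLEW is used for loading and a context already exists:

    #include <GL/glew.h>
    #include <cstdio>

    // Returns true if the current context can run GLSL shaders.
    // Call after the GL context has been created and glewInit() has run.
    bool shadersAvailable()
    {
        std::printf("GL_VERSION : %s\n", reinterpret_cast<const char*>(glGetString(GL_VERSION)));
        std::printf("GL_RENDERER: %s\n", reinterpret_cast<const char*>(glGetString(GL_RENDERER)));

        // Core shader support arrived in OpenGL 2.0; some older contexts
        // expose it through the ARB extensions instead.
        if (GLEW_VERSION_2_0)
            return true;
        return GLEW_ARB_shader_objects &&
               GLEW_ARB_vertex_shader &&
               GLEW_ARB_fragment_shader;
    }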

freeglut GLUT_MULTISAMPLE very slow on Intel HD Graphics 3000

I just picked up a new Lenovo Thinkpad that comes with Intel HD Graphics 3000. I'm finding that my old freeglut apps, which use GLUT_MULTISAMPLE, are running at 2 or 3 fps as opposed to the expected 60fps. Even the freeglut example 'shapes' runs this slow.
If I disable GLUT_MULTISAMPLE from shapes.c (or my app) things run quickly again.
I tried multisampling on glfw (using GLFW_FSAA - or whatever that hint is called), and I think it's working fine. This was with a different app (glgears). glfw is triggering Norton Internet Security, which thinks it's malware and so keeps removing the .exes... but that's another problem... my interest is with freeglut.
I wonder if the algorithm that freeglut uses to choose a pixel format is tripping up on this card, whereas glfw is choosing the right one.
Has anyone else come across something like this? Any ideas?
That glfw triggers Norton is a bug in Norton's virus definitions. If it's still the case with the latest definitions, send them your glfw DLL/app so they can fix it. The same happens with Avira, and they are working on it (they have already confirmed that it's a false positive).
As for the HD 3000, that's quite a weak GPU. What resolution is your app running at, and how many samples are you using? Maybe the amount of framebuffer memory gets too high for the little guy?
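If the sample count is the culprit, freeglut lets you request a specific number of samples rather than taking the default, and you can read back what the pixel format actually gave you. A sketch, assuming freeglut 2.8+ (where GLUT_MULTISAMPLE is accepted by glutSetOption) and GLEW only for the GL 1.3 enums on Windows:

    #include <GL/glew.h>       // only needed for GL_SAMPLE_BUFFERS/GL_SAMPLES on Windows
    #include <GL/freeglut.h>
    #include <cstdio>

    static void display()
    {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glutSwapBuffers();
    }

    int main(int argc, char** argv)
    {
        glutInit(&argc, argv);

        // Ask for a modest sample count instead of whatever the driver defaults to
        // (glutSetOption(GLUT_MULTISAMPLE, n) is a freeglut extension).
        glutSetOption(GLUT_MULTISAMPLE, 2);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_MULTISAMPLE);
        glutCreateWindow("msaa test");

        // Report what the chosen pixel format actually provides.
        GLint buffers = 0, samples = 0;
        glGetIntegerv(GL_SAMPLE_BUFFERS, &buffers);
        glGetIntegerv(GL_SAMPLES, &samples);
        std::printf("sample buffers: %d, samples: %d\n", buffers, samples);

        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }

If 2 samples runs fast but 8 crawls, the answer's suspicion about framebuffer memory is probably right.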