Can't figure out why this OpenGL program is not rendering

I compiled Joe Groff's "An intro to modern OpenGL: Hello World: The Slideshow".
I have compiled it using MinGW-w64 with freeglut, with GLUT 3.7, and as a version where I create my own context.
However, when I run the program, the image doesn't fade back and forth like it's supposed to, and I can't figure out why (I spent a whole day on it).
I have also examined most of the inputs and outputs except for the shaders and can't find anything wrong. Does anyone have any ideas?

Most likely, your OpenGL version doesn't support shaders. Are you by any chance running in a virtual machine or via remote desktop? These tend to only support OpenGL 1.1 even if the graphics card/drivers are much more recent, and OpenGL 1.1 does not support shaders. It's also possible that if you're using an older laptop with an integrated Intel GPU that shaders are not (properly) supported.
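If in doubt, print what the driver actually reports before digging into the shaders. This is only a minimal sketch, assuming a context has already been created (e.g. by freeglut) and that the hypothetical print_gl_info() is called after window creation; on a VM or over remote desktop it typically shows a GDI/software renderer stuck at 1.1:

    #include <stdio.h>
    #include <GL/gl.h>

    /* Print the strings the driver reports for the current context. */
    void print_gl_info(void)
    {
        printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
        printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
        printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
        /* GL_SHADING_LANGUAGE_VERSION only exists on 2.0+ contexts, so it is
           omitted here; query it via a loader such as GLEW once you know you
           actually have 2.0 or later. */
    }

If GL_VERSION comes back as 1.1 (or the renderer is "GDI Generic"), shaders will never work on that context regardless of what the hardware could do natively.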

Related

OpenGL GLSL shader versions

Recently I have had some problems with GLSL shader versions on different computers. I know every GPU can have different support for shaders, but I don't know how to write one shader that will work on all GPUs. If I write shaders on my PC (GPU: AMD HD 7770), I don't even have to specify the version, but some older PCs, or PCs with an NVIDIA GPU, are stricter about the version, so I have to specify a version that the GPU supports.
Now here comes the real problem. If I specify, e.g., version 330 on my PC, it works as it should, but on other PCs which should support version 330 it does not seem to work, so I have to rewrite it to make it work. And if I then switch back to my PC, which has the newer GPU, it doesn't work either.
Does anyone know how I have to write the shader so it can run on all GPUs?
Writing portable OpenGL code isn't as straightforward as you might like.
nVidia drivers are permissive. You can get away with a lot of things on nVidia drivers that you can't get away with on other systems.
It's easy to accidentally use extra features. For example, I wrote a program targeting the 3.2 core profile, but used GL_INT_2_10_10_10_REV as a vertex format. The GL_INT_2_10_10_10_REV symbol is defined in 3.2, but it's not allowed as a vertex format until 3.3, and you won't get any error messages for using it by accident.
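To make that pitfall concrete, here is a hedged sketch (the attribute index and buffer name are made up for illustration, and GLEW is assumed for loading): permissive drivers accept this on a 3.2 core context even though the packed format is only legal from 3.3 / ARB_vertex_type_2_10_10_10_rev onward.

    #include <GL/glew.h>

    /* Hypothetical attribute setup for packed 2_10_10_10 normals. The symbol
       GL_INT_2_10_10_10_REV is defined in the 3.2 headers, but using it as a
       vertex format requires 3.3; a strict driver rejects the call, a
       permissive one silently accepts it. */
    void set_packed_normal_attrib(GLuint vbo)
    {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glVertexAttribPointer(1,                      /* attribute index (illustrative) */
                              4,                      /* packed formats require size 4 */
                              GL_INT_2_10_10_10_REV,
                              GL_TRUE,                /* normalize components */
                              0, (void *)0);
        glEnableVertexAttribArray(1);
    }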
Lots of people run old drivers. According to the Steam survey, in 2013, 38% of customers with OpenGL 3.x drivers didn't have 3.3 support, even though hardware which supports 3.0 should support 3.3.
You will always have to test. This is the unfortunate reality.
My recommendations are:
Always target the core profile.
Always specify the shader language version (see the sketch after this list).
Check the driver version and abort if it is too old.
If you can, use OpenGL headers/bindings that only expose symbols in the version you are targeting.
Get a copy of the spec for the target version, and use that as a reference instead of the OpenGL man pages.
Write your code so that it can also run on OpenGL ES, if that's feasible.
Test on different systems. One PC is probably not going to cut it. If you can dig up a second PC with a graphics card from a different vendor (don't forget Intel's integrated graphics), that would be better. You can probably get an OpenGL 3.x desktop for a couple hundred dollars, or if you want to save the money, ask to use a friend's computer for some quick testing. You could also buy a second video card (think under $40 for a low-end card with OpenGL 4.x support), just be careful when swapping them out.
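A minimal sketch of the first three recommendations, assuming freeglut and GLEW are available (the 3.3 target and the shader string are just placeholders for whatever version you actually target):

    #include <stdio.h>
    #include <stdlib.h>
    #include <GL/glew.h>
    #include <GL/freeglut.h>

    /* The GLSL version is pinned explicitly instead of being left to the driver. */
    static const char *vertex_src =
        "#version 330 core\n"
        "layout(location = 0) in vec4 position;\n"
        "void main() { gl_Position = position; }\n";

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitContextVersion(3, 3);               /* target a specific version... */
        glutInitContextProfile(GLUT_CORE_PROFILE);  /* ...and the core profile */
        glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);
        glutCreateWindow("version check");

        glewExperimental = GL_TRUE;
        if (glewInit() != GLEW_OK)
            return EXIT_FAILURE;

        /* Check the driver version and abort if it is too old. */
        GLint major = 0, minor = 0;
        glGetIntegerv(GL_MAJOR_VERSION, &major);
        glGetIntegerv(GL_MINOR_VERSION, &minor);
        if (major < 3 || (major == 3 && minor < 3)) {
            fprintf(stderr, "Need OpenGL 3.3, got %d.%d\n", major, minor);
            return EXIT_FAILURE;
        }

        printf("GL %s, GLSL %s\n",
               (const char *)glGetString(GL_VERSION),
               (const char *)glGetString(GL_SHADING_LANGUAGE_VERSION));
        (void)vertex_src;  /* compile with glShaderSource/glCompileShader as usual */
        return EXIT_SUCCESS;
    }

Failing fast like this turns "it silently renders wrong on someone else's machine" into an error message you can act on.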
The main reason that commercial games run on a variety of systems is that they have a QA budget. If you can afford a QA team, do it! If you don't have a QA team, then you're going to have to do both QA and development -- two jobs is more work, but that's the price you pay for quashing bugs.

Nvidia Nsight 4.0 cannot profile code in OpenGL 4.3

I am using Visual Studio 2013 with NVIDIA Nsight 4.0. In my application I do a mix of different types of rendering, but for the purpose of testing the profiler I did a simple rendering of a scene. I opened the graphics debugger and, when I open the GUI and press the spacebar to capture the frame, I get this error:
Cannot enter frame debugger. Nsight only supports frame debugging for
D3D9, D3D10, D3D11, and OpenGL 4.2.
Reason: glEnd
I am using a GT 540M, and I checked my OpenGL version: it is 4.3.
If I then try to use the performance analysis tool and trace OpenGL (following the instructions), I always get some percentage of CPU frames and 0 GPU frames.
I have no idea what I am doing wrong. Is there any solution to this, or alternative ways to profile OpenGL?
Are you using immediate mode drawing, i.e. glBegin(...); glVertex*(...); glEnd()?
From the Nsight User Guide's Supported OpenGL Functions page:
NVIDIA® Nsight™ Visual Studio Edition 4.0 frame debugging supports the set of OpenGL operations, which are defined by the OpenGL 4.2 core profile. Note that it is not necessary to create a core profile context to make use of the frame debugger. An application that uses a compatibility profile context, but restricts itself to using the OpenGL 4.2 core subset, will also work. A few OpenGL 4.2 compatibility profile features, such as support for alpha testing and a default vertex array object, are also supported.
So, replace the immediate mode rendering with newer drawing functions like glDrawArrays and glDrawElements that use vertex array objects.
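For example, a rough sketch of the replacement (assuming GLEW, a 3.3-class context, and a shader program with a vec2 attribute at location 0; names are just illustrative):

    #include <GL/glew.h>

    static GLuint vao, vbo;

    /* Upload the vertices once into a buffer object and describe them with a
       vertex array object, instead of re-sending them every frame with
       glBegin/glVertex/glEnd. */
    void setup_triangle(void)
    {
        static const GLfloat verts[] = {
            -0.5f, -0.5f,
             0.5f, -0.5f,
             0.0f,  0.5f,
        };

        glGenVertexArrays(1, &vao);
        glBindVertexArray(vao);

        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);

        glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, (void *)0);
        glEnableVertexAttribArray(0);
    }

    /* The per-frame draw call, which the frame debugger can capture. */
    void draw_triangle(void)
    {
        glBindVertexArray(vao);
        glDrawArrays(GL_TRIANGLES, 0, 3);   /* replaces glBegin(GL_TRIANGLES)...glEnd() */
    }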
Better yet, create a core profile context to ensure you aren't using deprecated functionality.
My advice: stay away from outdated tutorials online and read the latest edition of the Red book (OpenGL Programming Guide), which only covers modern OpenGL.
You can also try the more basic GPUView tool, which can be found in the Windows 8 SDK.
UPDATE:
As for why 0 GPU frames are retrieved: are you sure that your GPU is on the list of supported hardware? I had the same problem where Nsight was mostly working (I was able to profile other aspects) but 0 GPU frames were collected. I later realized that my card was not officially supported.
Nsight 4.5 RC1 is now available; it works with the CUDA SDK 7 RC and, among its features, now supports OpenGL 4.3!

How do I get OpenGL 3.3?

OK, so I'm trying to use the tutorials at http://arcsynthesis.org/gltut/, but I keep getting an error message that pops up for about a second saying "Unable to create OpenGL 3.3 context (flags 1, profile 1)"; there are also a bunch of missing .pdb files. I did download the newest drivers for both graphics cards on my laptop (that is, both the Intel(R) HD Graphics 3000 and the NVIDIA GeForce GT 540M), and I did launch a program called "OpenGL Extensions Viewer", which displays that I should be able to run OpenGL version 3.1.
Now, I guess some would say that perhaps my card can't run 3.3, but:
1) My card is said to support 4.0:
http://www.notebookcheck.net/NVIDIA-GeForce-GT-540M.41715.0.html
2) There are people who say that "Any hardware that supports OpenGL 3.1 is capable of supporting OpenGL 3.3. "
OpenGL 3.+ glsl compatibility mess?
3) And finally... A YEAR AGO, I GOT IT TO RUN! Seriously, I got it to work after two months of trying. I'm even using some old project files from that time, and they sadly won't launch anymore because of the same error... I did reformat since then.
I recall that last time it was a whole series of things that I tried... like disabling one graphics card to be able to update the other... or maybe it was that I used some different diagnostic tool, which someone online recommended, saying that "if that program detects that OpenGL isn't working properly, it'll fix it".
Right now I'm busy with other homework, so if anyone at all has any suggestions about what this could be, please tell!

I need OpenGL 2.0 but my graphics card supports 1.5

I want to start on my WebGL project, and the minimum requirement is that my graphics card supports OpenGL 2.0.
The problem is that I have an Intel laptop with the integrated Intel 965 Graphics Media Accelerator; the driver is up to date, and it supports OpenGL 1.5.
Is there any way to update my graphics card to support 2.0? Is this possible?
Okay, just stay patient, because ANGLE is coming. It seems to me that your hardware is able to run DirectX 9, and ANGLE is a project from Google to provide WebGL support via DirectX. But as the others say, you can't upgrade OpenGL drivers just like that. Or you could try Mesa in the Firefox build.
For more information, see Learningwebgl.com.
Sadly, no. With a little more effort you can still develop against OpenGL 2.0, but you'll need to use another machine (or just buy a better graphics card) to test anything 2.0-specific (pixel shaders, for instance).
OK, that's not entirely true. You could download the Mesa library, compile it for Win32, and get some of the OpenGL 2.0 functionality emulated in a software renderer, but it would be very slow.
It's possible that updating drivers might help some, but probably won't make that jump. Otherwise, you could use something like Mesa3D, which does the rendering in software. It can be slow, but does support up through OpenGL 2.1 (including shaders), if memory serves.
If there's no other way, you could try http://www.mesa3d.org/ . I haven't followed this project for quite some time, but apparently they currently provide OpenGL 2.1 software rendering.
I just updated the drivers on my HP 6710b with the Mobile Intel 965 Express Chipset -- and now WebGL is working in Firefox 4 RC1!
I put instructions on this site.
It is not pretty but it works!
ANGLE (angleproject) is your best bet. Check which exact 965 card you have (search for 'Intel GMA' on Wikipedia); that article also lists the OpenGL version supported by these cards. It might take a couple of months, though, before you can use ANGLE to accelerate your WebGL application.
I have a slightly newer 4500MHD, and I have the same problem. WebGL works in Firefox 3.7a4, but fails in the later versions a5 and a6. I had to use the latest drivers from Intel, which claim to support OpenGL 2.0. The Microsoft drivers don't ship with OpenGL support.
I have reported an issue in the Firefox bug tracker: https://bugzilla.mozilla.org/show_bug.cgi?id=570474. It looks like support for Intel cards might be fixed by the time the release is in beta.

Lighting issue in OpenGL

I'm having trouble developing an OpenGL application.
The weird thing is that a friend and I are developing a 3D scene with OpenGL under Linux, with the code in a shared repository, and if we both check out the same latest version, that is, the SAME code, this happens: on his computer, after he compiles, he can see the full lighting model, whilst on mine only the ambient light is active, but not the diffuse or specular ones.
Could it be a driver problem (since he uses an ATI card and I use an NVIDIA one)?
Or the static libraries?
I repeat: it is the same code, compiled on different machines... that's the strange thing; it should look the same.
Thanks for any help or tips.
This can very easily be a driver problem, or one card supporting extensions that the other does not.
Try his binaries on your machine. If it continues to fail, either your drivers are whack or you're using a command not supported by your card. On the other hand if your screen looks right when using your code compiled on his machine, then your static libraries have a problem.
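A quick way to narrow it down is to dump what each driver reports and diff the two outputs. This is only a sketch, assuming a legacy/compatibility context (which fixed-function lighting implies); the extension name is picked purely as an example of the kind of capability that can differ between cards:

    #include <stdio.h>
    #include <string.h>
    #include <GL/gl.h>

    /* Print vendor/renderer/version and check for one example extension.
       Run this on both machines and compare the output line by line. */
    void dump_gl_caps(void)
    {
        printf("VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
        printf("RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
        printf("VERSION:  %s\n", (const char *)glGetString(GL_VERSION));

        /* In a compatibility context GL_EXTENSIONS is a space-separated string. */
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        if (ext && strstr(ext, "GL_ARB_vertex_buffer_object") == NULL)
            printf("example extension GL_ARB_vertex_buffer_object is missing\n");
    }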