My system's default OpenGL and GLSL version with freeglut is 4.1; using GLEW as well, there is no problem with initialization, shader compilation and linking, or execution.
I get this default version when I don't specify glutInitContextVersion, glutInitContextFlags, or glutInitContextProfile, and then my shaders work correctly.
Even though I have support for this version, I would like to provide a 3.3 alternative. When I create the GLUT context specifying version 3.3, the application starts with no errors and GLEW doesn't complain. My shaders are supposed to use GLSL version 3.3, so I set the line
#version 330
But when my shaders are compiled, OpenGL complains with an "invalid enumerant" message. I tried removing this line or setting it to another version, but I still get the same error. After all initialization has been done properly, I query the OpenGL version and get the expected 3.3, but for the GLSL version I am still getting the default 4.1:
glGetString(GL_SHADING_LANGUAGE_VERSION);
Please correct me if I'm wrong: I would guess this version is overridden by the version line in the shader code. Is there a way to specify the GLSL version when initializing the context, too?
I finally solved it. If I set the core profile together with the compatibility profile,
glutInitContextProfile(GLUT_CORE_PROFILE | GLUT_COMPATIBILITY_PROFILE);
then OpenGL commands from version 2.1 are not recognized. This goes against the information provided in the tutorials. Anyway, by setting only GLUT_COMPATIBILITY_PROFILE it works, using version 3.3 or greater.
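Something along these lines worked, a minimal sketch assuming GLEW for function loading (the display mode and window title are arbitrary):

#include <GL/glew.h>
#include <GL/freeglut.h>
#include <cstdio>

int main(int argc, char** argv)
{
    glutInit(&argc, argv);

    // Request 3.3 with the compatibility profile only; combining
    // GLUT_CORE_PROFILE with GLUT_COMPATIBILITY_PROFILE is what broke
    // the pre-3.0 commands.
    glutInitContextVersion(3, 3);
    glutInitContextProfile(GLUT_COMPATIBILITY_PROFILE);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH);
    glutCreateWindow("GL 3.3 compatibility");

    glewInit();   // needs a current context, so call it after glutCreateWindow

    std::printf("GL:   %s\n", glGetString(GL_VERSION));
    std::printf("GLSL: %s\n", glGetString(GL_SHADING_LANGUAGE_VERSION));
    return 0;
}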
Related
From my understanding, modern OpenGL starts with version 3.3, which is mostly the same as 4.0. Both versions were released at the same time, and 3.3 is just for backward compatibility with older graphics cards. In my application I need to use tessellation control. All of the sources I have found online say tessellation starts with 4.0, but they also say 3.3 is virtually the same.
Does OpenGL 3.3 have tessellation or not? Should I even bother with 3.3 in 2021, or should I just use the latest version, 4.6? Is 4.6 compatible with Mac?
Tessellation shaders have been in the standard since OpenGL 4.0. However, you can test whether your hardware supports the ARB_tessellation_shader extension, which is written against OpenGL 3.2.
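With GLEW, for example, such a check could look like this (a sketch, assuming a context is current and glewInit() has already succeeded):

// Tessellation shaders are core from 4.0 onwards; on a 3.x context
// they may still be available through the ARB extension.
if (GLEW_VERSION_4_0 || GLEW_ARB_tessellation_shader) {
    // safe to compile GL_TESS_CONTROL_SHADER / GL_TESS_EVALUATION_SHADER stages
} else {
    // no tessellation support on this context
}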
I'm trying to get a very basic OpenGL application up and running. I have a GTX 770, and I've installed the nvidia-361 drivers. When I run glxinfo | grep version, I get:
OpenGL core profile version string: 4.5.0 NVIDIA 361.42
OpenGL core profile shading language version string: 4.5.0 NVIDIA
OpenGL version string: 4.5.0 NVIDIA
This would lead one to believe that the drivers support OpenGL 4.5, right?
Now, I'm using GLEW in my basic application. I get the version string and print it:
const GLubyte* version = glGetString(GL_VERSION);
printf("version: %s\n", version);
And when I run the program, I get:
version: 3.2.0 NVIDIA 361.42
?????????????????????????
What's happening here? I checked my version of libglew-dev, and it's 1.13.0. OpenGL 4.5 support was added in 1.11.0, so I don't think GLEW is the problem, but I can't figure out what's going on.
glGetString(GL_VERSION) returns the version the current GL context provides, not necessarily the highest version your GL implementation supports.
GLEW has nothing to do with that; it just loads GL function pointers. What matters is the way you created the context. What you see here is the normal behavior of the NVIDIA driver in recent versions: when you ask it for some GL x.y context, it returns version x.y, and not a higher version it would still support.
If you want a 4.5 context, just request a GL 4.5 context. How to do that depends on the way you create the context. If you use a library like GLFW, GLUT, SDL, Qt, ..., just consult its documentation on how to request a specific context version. If you create the context manually via GLX, use glXCreateContextAttribsARB with proper GLX_CONTEXT_MAJOR_VERSION_ARB and GLX_CONTEXT_MINOR_VERSION_ARB attributes.
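With GLFW, for example, the request could look like this (a sketch; the window size and title are arbitrary):

#include <GLFW/glfw3.h>
#include <cstdio>

int main()
{
    if (!glfwInit())
        return 1;

    // Without explicit hints the driver is free to hand back a lower
    // version; these hints request 4.5 core specifically.
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 5);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

    GLFWwindow* win = glfwCreateWindow(640, 480, "GL 4.5", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    std::printf("version: %s\n", glGetString(GL_VERSION));
    glfwTerminate();
    return 0;
}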
I am trying to get an OpenGL/GLEW program, from a template made by my uni lecturer, to work. He has added this code to his program:
if (!GLEW_VERSION_3_1) {
    std::cerr << "Driver does not support OpenGL 3.1" << std::endl;
    return 1;
}
This prints the error on my Mac. After some experimenting, I have found out that my Mac is actually running OpenGL 2.1 INTEL-8.28.30. I have worked my way around this using a nasty #ifndef __APPLE__ before that part of the code, but I cannot do this in the long term.
Is there any way in which I can upgrade to 3.1?
The issue is related to the concept of core and compatibility contexts in OpenGL, and Apple's implementation. You need to make sure the application requests a core profile OpenGL context.
Apple supports a compatibility OpenGL context (with support for all of the old deprecated features) only up to a maximum of OpenGL 2.1 (plus several extensions).
Apple supports a core OpenGL context (with the old deprecated features removed) up to a maximum of OpenGL 4.1 as of OS X Mavericks (dependent on hardware support; 3.3 on older hardware). If you download a program called OpenGL Extension Viewer, you can see what you've got available.
When your program creates an OpenGL context, it gets a compatibility context by default (to avoid breaking old programs). You need to specifically flag that you want a core profile. The way to do this depends on which windowing library you're using. If you happen to be using SDL2, there are just a few extra attributes to set before creating the context, as sketched below. If you're using Apple's GL directly, you'd need to check their documentation.
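A minimal SDL2 sketch (3.2 is the lowest version a core profile context can have, so it's a safe request):

#include <SDL.h>

int main(int, char**)
{
    SDL_Init(SDL_INIT_VIDEO);

    // Request a core profile before creating the context; without these
    // attributes SDL hands you the default compatibility context (2.1 on OS X).
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);

    SDL_Window* win = SDL_CreateWindow("core profile",
                                       SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                       640, 480, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    // ... render ...

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}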
See: http://www.opengl.org/wiki/Core_And_Compatibility_in_Contexts
Note: I notice you're using GLEW. I encountered issues with GLEW and core contexts in the past, because it requested the extension strings using a method that is deprecated (and therefore fails in a core context). This made it look like no extensions were supported. If this happens, refer to the GLEW website; there was an experimental option you could set before its init to make it work.
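For the record, that option is the glewExperimental global, set before calling glewInit() (a short sketch):

#include <GL/glew.h>
#include <cstdio>

// In a core profile context GLEW's default extension enumeration can fail,
// making every GLEW_* check report false; the experimental flag makes GLEW
// load all function pointers regardless of the reported extension list.
glewExperimental = GL_TRUE;
GLenum err = glewInit();
if (err != GLEW_OK)
    std::fprintf(stderr, "glewInit failed: %s\n",
                 (const char*)glewGetErrorString(err));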
I queried my OpenGL version using glGetString(GL_VERSION) and it returned 3.3.0. As far as I know, 3.3 does have the GL_TEXTURE_RECTANGLE enum. However, I'm unable to find it in my gl.h header, nor does my compiler (VC++) recognize it. All my other OpenGL calls seem to run fine. Any ideas?
Unfortunately, Visual Studio does not ship with up-to-date OpenGL support; its gl.h only covers OpenGL 1.1. Check out GLEW for a solution to this problem.
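A sketch of how that looks in practice (the key points are including glew.h instead of, or before, gl.h, and calling glewInit() once a context exists):

#include <GL/glew.h>   // supplies GL_TEXTURE_RECTANGLE and other post-1.1 enums

// ... create a window and GL context with your toolkit of choice, then:
glewInit();

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_RECTANGLE, tex);   // compiles now: glew.h defines the enum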
I have shaders written in GLSL version 150, so I believe I need to get an OpenGL 3.2 context.
If I do:
glutInitDisplayMode(GLUT_3_2_CORE_PROFILE | ...)
then the OpenGL version is 4.1; if I don't, the version is 2.1.
If the version is 4.1, then GLEW_ARB_vertex_shader is false, which the code checks for before it starts.
ARB_vertex_shader is a compatibility-profile extension that allows you to use a vertex shader with old versions of OpenGL. You don't need this extension with the core profile, where vertex shaders are core functionality. I think your code can run safely without this check; see the sketch at the end of this answer.
OpenGL 4.1 should allow you to do anything you would do with OpenGL 3.2.
Chances are, the code you use was not written in an OS X environment. Other operating systems allow activating both the core and compatibility profiles at the same time; this is not the case with OS X. This suggests that your code depends on this feature, so it might be tricky to get it to work on OS X.
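If you'd rather keep a guard than delete the check, a version-based test could replace it (a sketch; GLEW_VERSION_3_2 is true whenever the context provides at least GL 3.2, which is what GLSL 150 shaders need):

#include <GL/glew.h>
#include <iostream>

// Accept either a 3.2+ context (shaders are core functionality there)
// or the old ARB extension on a legacy context.
if (!(GLEW_VERSION_3_2 || GLEW_ARB_vertex_shader)) {
    std::cerr << "Context does not support the required shader functionality" << std::endl;
    return 1;
}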