From my understanding, modern OpenGL starts with version 3.3, which is mostly the same as 4.0. Both versions were released at the same time, and 3.3 exists mainly for backward compatibility with older graphics cards. In my application I need to use tessellation control. All of the sources I have found online say tessellation starts with 4.0, but they also say 3.3 is virtually the same.
Does OpenGL 3.3 have tessellation or not? Should I even bother with 3.3 in 2021, or should I just use the latest version, 4.6? Is 4.6 compatible with Mac?
Tessellation shaders have been in the standard since OpenGL 4.0. However, you can test whether your hardware supports the ARB_tessellation_shader extension, which is written against the OpenGL 3.2 specification. As for Mac: macOS caps out at OpenGL 4.1 (core profile), so 4.6 is not available there.
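For example, a minimal sketch of such a check in C, assuming a 3.2+ context is current and a loader such as GLEW has already been initialized:

#include <string.h>
#include <GL/glew.h>

/* Returns 1 if the driver advertises tessellation shader support. */
int has_tessellation(void)
{
    GLint n = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &n);
    for (GLint i = 0; i < n; ++i) {
        const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i);
        if (ext && strcmp(ext, "GL_ARB_tessellation_shader") == 0)
            return 1;
    }
    return 0;
}

With GLEW specifically, the shorter check if (GLEW_ARB_tessellation_shader) does the same thing after glewInit().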
I'm trying to get a very basic OpenGL application up and running. I have a GTX 770, and I've installed the nvidia-361 drivers. When I run glxinfo | grep version, I get:
OpenGL core profile version string: 4.5.0 NVIDIA 361.42
OpenGL core profile shading language version string: 4.5.0 NVIDIA
OpenGL version string: 4.5.0 NVIDIA
This would lead one to believe that my drivers support OpenGL 4.5, right?
Now, I'm using GLEW in my basic application. I get the version string and print it:
const GLubyte* version = glGetString(GL_VERSION);
printf("version: %s\n", version);
And when I run the program, I get:
version: 3.2.0 NVIDIA 361.42
What's happening here? I checked my version of libglew-dev, and it's 1.13.0; OpenGL 4.5 support was added in GLEW 1.11.0, so I don't think GLEW is the problem, but I can't figure out what's going on.
glGetString(GL_VERSION) returns the version the current GL context provides, not necessarily the highest version your GL implementation supports.
GLEW has nothing to do with that; it just loads GL function pointers. What matters is the way you created the context. What you see here is the normal behavior of the NVIDIA driver in recent versions: when you ask it for some GL x.y context, it returns version x.y, not a higher version it would still support.
If you want a 4.5 context, just request a GL 4.5 context. How to do that depends on the way you create the context. If you use a library like GLFW, GLUT, SDL, Qt, ..., consult its documentation on how to request a specific context version. If you create the context manually via GLX, use glXCreateContextAttribsARB with the proper GLX_CONTEXT_MAJOR_VERSION_ARB and GLX_CONTEXT_MINOR_VERSION_ARB attributes.
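For instance, with GLFW (one of the libraries mentioned above), a minimal sketch of requesting a 4.5 core context looks like this:

#include <stdio.h>
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    /* Explicitly request a 4.5 core-profile context. */
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 5);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

    GLFWwindow *win = glfwCreateWindow(640, 480, "GL 4.5", NULL, NULL);
    if (!win) {
        /* Creation fails if the driver cannot provide a 4.5 context. */
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);
    printf("version: %s\n", glGetString(GL_VERSION));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}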
I am trying to integrate freetype_gl, and the shaders it uses are for an older version. I can convert them to work with 3.3+; I am just wondering if that could be causing nothing to appear on the screen.
Short answer
On Windows, it will work. On OS X and Linux/Mesa it will not work.
Long answer
If you are using the compatibility profile, yes, it will work. If you are using the core profile, then GLSL versions before 1.40 will not be supported.
OpenGL implementations on Windows tend to have strong support for legacy applications, including the compatibility profile. Other implementations, such as Mesa and the OS X implementations, do not support newer versions of the compatibility profile. For non-core profiles, OS X only supports version 2.1 and Mesa supports 3.0.
This means that if you want to run your program on Linux/Mesa or OS X, you will have to port your shader to at least 1.40 (at which point you might as well use 3.30), but you don't have to change anything if you only care about Windows.
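To illustrate what such a port usually involves (shader and variable names here are illustrative, not taken from freetype_gl): attribute and varying become in and out, and the removed built-in matrices become user-supplied uniforms.

// Old style (GLSL 1.20, compatibility profile only on Mesa / OS X):
#version 120
attribute vec2 position;
attribute vec2 uv;
varying vec2 texcoord;
void main() {
    texcoord = uv;
    gl_Position = gl_ModelViewProjectionMatrix * vec4(position, 0.0, 1.0);
}

// Ported (GLSL 3.30 core):
#version 330 core
in vec2 position;
in vec2 uv;
out vec2 texcoord;
uniform mat4 mvp; // replaces the removed gl_ModelViewProjectionMatrix
void main() {
    texcoord = uv;
    gl_Position = mvp * vec4(position, 0.0, 1.0);
}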
OpenGL and GLSL versions
Every OpenGL version since 2.0 has been released with a corresponding GLSL version. However, the GLSL version numbers were not always in sync with the GL version. Here is a table:
OpenGL Version    GLSL Version
2.0               1.10
2.1               1.20
3.0               1.30
3.1               1.40
3.2               1.50
For all versions of OpenGL 3.3 and above, the corresponding GLSL version matches the OpenGL version. So GL 4.1 uses GLSL 4.10.
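In shader source, this is the version the #version directive declares; for example, a shader targeting the GL 3.2 core profile begins with:

#version 150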
In my case, I am using GLSL 1.30 with OpenGL 3.3, but I think you might not be able to use GLSL 1.20 there.
I have had a lot of problems and confusion in setting up my laptop for OpenGL programming and for running OpenGL programs.
My laptop has one of these very clever (too clever for me) designs where the Intel CPU has a graphics processor on chip, and there is also a dedicated graphics card. Specifically, the CPU is a 3630QM, with "HD Graphics 4000" (a very exciting name, I am sure), and the "proper" Graphics Processor is a Nvidia GTX 670MX.
Theoretically, according to Wikipedia, the HD Graphics Chip (Intel), under Linux, supports OpenGL 3.1, if the correct drivers are installed. (They probably aren't.)
According to NVIDIA, the 670MX can support OpenGL 4.1, so ideally I would like to develop and execute on this GPU.
Do I have drivers installed that enable me to execute OpenGL 4.1 code on the NVIDIA GPU? Answer: probably not; currently I use this "optirun" tool to execute OpenGL programs on the dedicated GPU. See this link for the process I followed to set up my computer.
My question is: I know how to run a compiled program on the 670MX; that would be 'optirun ./programname'. But how can I find out what OpenGL version the installed graphics drivers on my system support? Running 'glxinfo | grep -i opengl' in a terminal tells me that the Intel chip supports OpenGL version 3.0. See the information below:
ed@kubuntu1304-P151EMx:~$ glxinfo | grep -i opengl
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) Ivybridge Mobile
OpenGL version string: 3.0 Mesa 9.1.3
OpenGL shading language version string: 1.30
OpenGL extensions:
How do I do the same or similar thing to find out what support is available under 'optirun', and what version of OpenGL is supported?
Update
Someone suggested I use glGetString() to find this information: I am now completely confused!
Without optirun, the supported OpenGL version is '3.0 MESA 9.1.3', so version 3, which is what I expected. However, under optirun, the supported OpenGL version is '4.3.0 NVIDIA 313.30', so version 4.3?! How can it be Version 4.3 if the hardware specification from NVIDIA states only Version 4.1 is supported?
You can just run glxinfo under optirun:
optirun glxinfo | grep -i opengl
Both cards have different capabilities, so it's normal to get different OpenGL versions. Also, hardware specification pages are often out of date: the version actually available is determined by the installed driver, and NVIDIA drivers regularly add support for OpenGL versions newer than the one listed when the card launched.
This is the OpenGL version I have:
Video Card Vendor: Intel
Renderer: Intel(R) HD Graphics
OpenGL Version: 2.1.0 - Build 8.15.10.2622
GLU Version: 1.2.2.0 Microsoft Corporation
I'd like to learn the latest OpenGL API, but my card supports only 2.1 (and I can't update it). Is it possible to program against the latest API even without card support?
AshleysBrain's answer is not quite correct. You can use a software implementation of OpenGL such as Mesa3D, which executes newer code on your CPU instead of your GPU. It will be slower, but it will allow you to compile and run newer OpenGL code against it.
Edit: I just checked, and it seems Mesa only supports up to OpenGL 3.1. Still, that's pretty good for free.
It's more to do with what version the driver supports. Try updating your driver and see if it supports a newer version. Otherwise, I'm afraid you're stuck with that version; there's nothing else you can do!
Send money/time Mesa's way in the hope they'll support GL4 in their software rasterizer sometime soon.
Or pester Khronos for a reference implementation.
As of Mesa 20.2, llvmpipe supports OpenGL 4.5.
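If you go that route, Mesa's software rasterizer can usually be forced with an environment variable (assuming your libGL comes from Mesa), for example:

LIBGL_ALWAYS_SOFTWARE=1 glxinfo | grep -i "opengl version"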
Intel(R) HD Graphics is usually onboard, so no, you can't use GPU-accelerated OpenGL 4.2 with this. You need a dedicated card like those from Nvidia. The highest OpenGL version I have seen on such onboard chips is 3.1, which still delivers the fixed-pipeline API with shader support. But OpenGL 4.2 has completely removed all of this and uses a programmable way of doing things, which requires programmable hardware too.
I have shaders written in GLSL version 150, so I believe I need to get an OpenGL 3.2 context.
If I do:
glutInitDisplayMode(GLUT_3_2_CORE_PROFILE | ...)
then the OpenGL version is 4.1; if I don't, the version is 2.1.
If the version is 4.1, then GLEW_ARB_vertex_shader is false, which the code checks for before it starts.
ARB_vertex_shader is a legacy extension that allows you to use vertex shaders with old versions of OpenGL. You don't need this extension with the core profile, where vertex shaders are part of core and the extension is not advertised. I think your code can run safely without this check.
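A more portable check, sketched here under the assumption that you are using GLEW, is to test the core version flag and keep the extension only as a fallback:

#include <GL/glew.h>

/* Call after glewInit() with a current context. Core profiles
 * (notably on OS X) do not advertise ARB_vertex_shader, because
 * vertex shaders are part of core. */
int vertex_shaders_available(void)
{
    return GLEW_VERSION_2_0          /* vertex shaders are core since GL 2.0 */
        || GLEW_ARB_vertex_shader;   /* legacy extension fallback */
}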
OpenGL 4.1 should allow you to do anything you would do with OpenGL 3.2.
Chances are, the code you use was not written in an OS X environment. Other OSes offer a compatibility profile that mixes modern and legacy features; this is not the case with OS X, which only exposes core profiles beyond 2.1. If your code depends on that, it might be tricky to get it to work on OS X.