Which GLSL versions to use for maximum compatibility - opengl

Update 2:
I think my question might really be, "How do I write OpenGL code that will work on any machine that supports at least OpenGL 2.0?" Bearing in mind that a compatibility profile is not guaranteed to be available.
Update 1:
Rabbid76 is suggesting to either use a specific core profile or use a compatibility profile.
The compatibility profile isn't guaranteed to be available so let's say I choose to use OpenGL 3.3 core profile as the minimum requirement. Wouldn't it still be the case that I would have to write three different versions of the GLSL shaders?
3.30 For OpenGL 3.3
4.00 For OpenGL 4.0
4.10 For OpenGL 4.1 and onwards
From reading around people seem to assume that GLSL 3.30 works on OpenGL 3.3 and all later versions, but that doesn't seem to be guaranteed by any spec before OpenGL 4.2.
Is this right?
Original question:
I've read various versions of the OpenGL spec, and I can't see any useful guarantees about backwards compatibility of GLSL support.
Here's what I've deduced from the specs:
OpenGL 2.0 supports GLSL 1.10
OpenGL 2.1 supports GLSL 1.20
OpenGL 3.0 supports GLSL 1.10, 1.20 & 1.30
OpenGL 3.1 supports GLSL 1.30 & 1.40
OpenGL 3.2 supports GLSL 1.40 & 1.50
OpenGL 3.3 supports GLSL 3.30
OpenGL 4.0 supports GLSL 4.00
OpenGL 4.1 supports GLSL 4.10
OpenGL 4.2 supports GLSL 4.20 and all versions back to 1.40
From OpenGL 4.2 onwards it seems that support back to GLSL version 1.40 is guaranteed.
Note that guarantees of backwards compatibility are quite rare until OpenGL 4.2!
So if I want to ship an application which supports all versions of OpenGL back to 2.0 I would need to write every shader in the following versions:
1.10 For OpenGL 2.0
1.20 For OpenGL 2.1 & 3.0
1.40 For OpenGL 3.1, 3.2 & 4.2 onwards
3.30 For OpenGL 3.3
4.00 For OpenGL 4.0
4.10 For OpenGL 4.1
That's six versions of each shader! Is there an easier way to make GLSL portable?
Why are the specs so unhelpful in this regard?

Wouldn't it still be the case that I would have to write three different versions of the GLSL shaders?
No.
In order to use the core 3.3 profile, you must ask the WGL/GLX/EGL system that creates the OpenGL context to give you what you asked for. That is, it must give you core 3.3 or any later implementation that is compatible with core 3.3.
In order for an implementation to be compatible with 3.3, it must accept GLSL of version 3.30. So if you request a core 3.3 context, and you get a valid context of a higher version, that implementation will support 3.30 shaders.
That is, just because a particular standard version doesn't mandate compatibility with older shader versions does not mean the implementation won't provide it.
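To act on that guarantee at runtime, you need to know which version the driver actually gave you. A minimal sketch (the helper name `gl_version_at_least` is hypothetical) that parses the string format returned by `glGetString(GL_VERSION)` on desktop GL, which begins with "major.minor":

```c
#include <stdio.h>

/* Parse a desktop-GL version string ("major.minor ...") and report whether
 * the context satisfies a requested minimum core version. A 4.4 context
 * obtained by requesting 3.3 core still accepts "#version 330" shaders. */
static int gl_version_at_least(const char *version, int want_major, int want_minor)
{
    int major = 0, minor = 0;
    if (sscanf(version, "%d.%d", &major, &minor) != 2)
        return 0;  /* unparseable string: assume unsupported */
    return major > want_major ||
           (major == want_major && minor >= want_minor);
}
```

Note that OpenGL ES version strings carry an "OpenGL ES" prefix, so this sketch applies to desktop GL only.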
How do I write OpenGL code that will work on any machine that supports at least OpenGL 2.0?
You can't. At least, not without distinct codepaths.
Some implementations do not support the compatibility profile at all. As such, if you do not code against at least GL 3.2 (or whatever their lowest core implementation version is), your code won't be able to work on their implementations. It just isn't possible.
Some implementations have a GL 2.x implementation as a fallback, but have a higher core profile implementation without the compatibility profile. You can access the fallback with your 2.0 code, but no extensions or core features for the newer stuff.
It should also be noted that most implementations that are limited to OpenGL 2.0 are extremely old and likely quite bug-prone. You would have to test your code extremely broadly in order to know whether or not it would work on a bunch of them. So unless you have the resources to buy and test on dozens of outdated Intel integrated chipsets, it's best to pick a version that's more recent.
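The "distinct codepaths" approach from the answer can be as simple as a table that maps the context version the driver actually handed you to the newest `#version` directive from the question's list that the context is guaranteed to accept. This is an illustrative sketch (the function name `pick_glsl_version` is hypothetical, and the table is not exhaustive):

```c
#include <stddef.h>
#include <string.h>

/* Map a desktop-GL context version to a #version directive it must accept.
 * Versions 3.3 and later are handled by the 3.3-core guarantee discussed
 * above; older versions fall back through the question's list. */
static const char *pick_glsl_version(int major, int minor)
{
    if (major > 3 || (major == 3 && minor >= 3))
        return "#version 330";  /* 3.3 core and everything after it */
    if (major == 3 && minor >= 1)
        return "#version 140";  /* 3.1 and 3.2 */
    if (major == 3 || (major == 2 && minor >= 1))
        return "#version 120";  /* 2.1 and 3.0 */
    if (major == 2)
        return "#version 110";  /* 2.0 */
    return NULL;                /* pre-2.0: no GLSL at all */
}
```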

Related

ARB_texture_storage vs OpenGL hardware version

ARB_texture_storage is a core feature since OpenGL 4.2. The extension had been released before OpenGL 4.2. I would like to determine the minimum OpenGL version that has to be supported by hardware to use that extension, for example glTexStorage2DARB. The documentation says:
This extension is written against the OpenGL 3.2 Core Profile
specification.
Does it mean that the GPU should support at least OpenGL 3.2?
ARB_texture_storage is not something that is "supported by hardware". By and large, it is an API improvement; it doesn't expose something which some GPUs can do and others cannot.
As such, in the dependencies section, the extension specification states:
OpenGL ES 1.0, OpenGL ES 2.0 or OpenGL 1.2 is required.
This represents the oldest OpenGL version with which this extension is meaningfully compatible. Of course, you are highly unlikely to find 1.2 implementations in the wild, let alone implementations that are still being supported without implementing higher GL versions.
Basically, most hardware whose drivers were still being maintained at the time of this extension's release will have an implementation of it. And outside of open source drivers, most of that hardware will be GL 4.x of some form.
Also, this extension does not have ARB versions of its functions. This is a compatibility extension; it allows you to use GL 4.2 API functionality on hardware that cannot support GL 4.2 (assuming the driver is updated) without forcing you to rename functions or whatever.
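One practical detail when using glTexStorage2D: unlike the old glTexImage2D loop, it takes the number of mipmap levels up front, and for a full chain that count is floor(log2(max(width, height))) + 1. A small sketch (the helper name `full_mip_levels` is hypothetical):

```c
/* Count the mipmap levels of a full chain down to 1x1, i.e.
 * floor(log2(max(width, height))) + 1, computed without floating point. */
static int full_mip_levels(int width, int height)
{
    int levels = 1;
    while (width > 1 || height > 1) {
        width  = width  > 1 ? width  / 2 : 1;
        height = height > 1 ? height / 2 : 1;
        ++levels;
    }
    return levels;
}
```

With a GL context current, this would be used as `glTexStorage2D(GL_TEXTURE_2D, full_mip_levels(w, h), GL_RGBA8, w, h);`.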

To be backwards compatible, are you supposed to use ARB extensions instead of core calls?

For example, I quote the wiki:
Note that glDrawTransformFeedback​ is perfectly capable of rendering from a transform feedback object without having to query the number of vertices. Though this is only core in GL 4.0, it is widely available on 3.x-class hardware
I assume this means there is an extension for it. When using an OpenGL library, would I want to do the normal core 4.0 call, or would I want to do an ARB extension call?
I would assume that the extension could target older hardware + newer hardware, and the 4.0 call would only target the newer hardware. Or am I safe to use 4.0 calls and then somehow the older hardware is forward compatible enough to simulate that call using the extension or something?
Extensions that are promoted to core share, among other things, the same enumerants as their equivalent core functionality.
If, for example, you look at the constants that GL_EXT_transform_feedback introduced, they are the very same as the constants without the _EXT suffix in OpenGL 3.0 (this extension was promoted to core in 3.0).
GL_RASTERIZER_DISCARD_EXT = 0x8C89 (GL_EXT_transform_feedback)
GL_RASTERIZER_DISCARD = 0x8C89 (Core OpenGL 3.0)
ARB extensions are not the only source of extensions that are promoted into core. There are EXT, NV, APPLE, ATI (AMD) and SGI extensions that are also now a part of the core OpenGL API.
Basically, if you have a version where an extension has been promoted to core, you should ask the driver for the proc. address of the function by its core name and not the extension it originated in.
The reason is pretty easy to demonstrate:
I have an OpenGL 4.4 implementation from NV that does not implement GL_APPLE_vertex_array_object even though that extension was promoted to core in OpenGL 3.0. Instead, this NV driver implements the derivative GL_ARB_vertex_array_object extension.
If your software was written to expect GL_APPLE_vertex_array_object because that is an extension that was officially promoted to core, you might get the completely wrong idea about my GPU/driver.
However, if you took a look at the context version and saw 4.4, you would know that glGenVertexArrays (...) is guaranteed to be available and you do not have to load the APPLE function that this driver knows nothing about: glGenVertexArraysAPPLE (...).
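The rule of thumb above (prefer the core name when the version guarantees it, fall back to the extension name otherwise) can be sketched as a small selector. This is illustrative only: the helper name `vao_gen_proc_name` is hypothetical, and the returned string would be fed to your proc-address loader (wglGetProcAddress, glXGetProcAddress, etc.):

```c
#include <stddef.h>
#include <string.h>

/* Decide which entry-point name to request from the proc-address loader
 * for generating vertex array objects. VAOs are core since OpenGL 3.0,
 * so any 3.0+ context guarantees the suffix-less core name. */
static const char *vao_gen_proc_name(int major, int has_apple_ext)
{
    if (major >= 3)
        return "glGenVertexArrays";       /* core name, guaranteed */
    if (has_apple_ext)
        return "glGenVertexArraysAPPLE";  /* pre-3.0 extension fallback */
    return NULL;                          /* no VAO support at all */
}
```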
Last, regarding the statement you quoted:
Note that glDrawTransformFeedback​ is perfectly capable of rendering from a transform feedback object without having to query the number of vertices. Though this is only core in GL 4.0, it is widely available on 3.x-class hardware.
That pertains to GL_ARB_transform_feedback2. That extension does not require GL4 class hardware, but was not included as core in 3.3 when the ARB did the whole 3.3/4.0 split. If you have core OpenGL 4.0, or your driver lists this extension (as 3.3 implementations may, but are not required to), then that behavior is guaranteed to apply.
OpenGL 4.0 Core Profile Specification - J.1 New Features - p. 424
Additional transform feedback functionality including:
transform feedback objects which encapsulate transform feedback-related state;
the ability to pause and resume transform feedback operations; and
the ability to draw primitives captured in transform feedback mode without querying captured primitive count
(GL_ARB_transform_feedback2).

Newest GLSL Spec with Least Changes?

What's the newest GLSL specification that changes the language as little as possible, so that learning it won't be redundant when I move to a newer version that's also available now? I want to be able to make my shaders work on as much hardware as possible without learning a language that is already deprecated.
It depends on how you define "redundant".
If you're purely talking about the core/compatibility feature removal, that only ever happened once, in the transition from OpenGL 3.0 to 3.1 (in GLSL version terms, 1.30 to 1.40).
Every shader version from 1.40 onward will be supported by any OpenGL implementation. Every shading language version from 1.10 onward will be supported by any compatibility profile implementation.
If by "redundant", you mean that you don't want to have to learn new grammar to access language changes that don't affect new hardware (separate programs, explicit attribute and uniform specifications, etc, all of which have zero hardware dependencies), tough. Pick your version based on whatever minimum hardware you want to support and stick with it.
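If you do end up supporting more than one version, one common trick is to keep a single shader body and prepend the `#version` line (which must be the first line of the source) at load time, before handing the string to glShaderSource. A sketch (the helper name `with_version` is hypothetical; the caller frees the result):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Concatenate a #version directive and a shared shader body into one
 * heap-allocated source string. Returns NULL on allocation failure. */
static char *with_version(const char *directive, const char *body)
{
    size_t n = strlen(directive) + 1 + strlen(body) + 1;
    char *src = malloc(n);
    if (src)
        snprintf(src, n, "%s\n%s", directive, body);
    return src;
}
```

This keeps per-version differences confined to the directive (plus any `#define`s you prepend alongside it) instead of duplicating whole shaders.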

ARB_shader_image_load_store - what version of OpenGL and what hardware is required?

I'm a bit confused about new OpenGL extensions, what hardware they require and what OpenGL version they require.
In particular it's about the ARB_shader_image_load_store now.
http://www.opengl.org/registry/specs/ARB/shader_image_load_store.txt
As I understand, this is a feature of OpenGL 4.2 but in the OpenGL dependencies it's written:
This extension is written against the OpenGL 3.2 specification
(Compatibility Profile).
This extension is written against version 1.50 (revision 09) of the OpenGL
Shading Language Specification.
OpenGL 3.0 and GLSL 1.30 are required.
and further down stuff like
This extension interacts trivially with OpenGL 4.0 and ARB_sample_shading.
What do these things mean? What Hardware and what OpenGL version is necessary to use such Extensions?
What do these things mean?
Well, let's take them one by one.
Before we begin, some basic information. OpenGL specifications, whether core or extension, do not care about what hardware something runs on. They're not interested in that. They don't define hardware. You can't look at an extension spec and know a priori what hardware it will function on. If you want to find that information out, you're looking in the wrong place.
Furthermore, you have to understand something about extension specifications. An OpenGL extension is like a diff; you can't read it in isolation. An OpenGL extension is a document that modifies the OpenGL specification.
This extension is written against the OpenGL 3.2 specification
(Compatibility Profile).
This extension is written against version 1.50 (revision 09) of the OpenGL
Shading Language Specification.
A diff file is utterly useless unless you know exactly what file to patch it into, yes? That's the same thing with OpenGL. The extension specification will make references to section and paragraph numbers in the OpenGL specification. But... there are many versions of the OpenGL specification. Which one is it talking about?
Therefore, every extension must state which physical document it is referring to. So when this extension says, "Add a new subsection after Section 2.14.5, Samplers, p. 106", it means page 106, section 2.14.5 of the OpenGL 3.2 specification, compatibility profile.
Same goes for the GLSL language specification.
OpenGL 3.0 and GLSL 1.30 are required.
Now, just because an extension is written against a particular version does not mean that this is the minimum version where support for the extension is possible. An implementation could theoretically support it in an earlier version.
This statement says what the minimum version that can possibly support it is.
This is not a matter of hardware; it is a matter of language. The reason 3.0 is the minimum is because this extension refers to concepts that are simply not available in 2.1. Such as integer image formats and so forth. We'll talk a bit more about this in the next part.
This extension interacts with X.
The "interacts with" statement speaks to optional parts of the specification. What it means is that if "X" and this extension are both supported, then certain paragraphs in this specification also exist.
For example, ARB_shader_image_load_store states, "This extension interacts with ARB_separate_shader_objects". If you look towards the bottom, you will find a section titled "Dependencies on ARB_separate_shader_objects". That lists the specific language that changes when ARB_separate_shader_objects is available.
The "interacts trivially with X" statement simply means that the interaction is generally a "remove references to X" statement. For example, the section on ARB_tessellation_shader/4.0 dependencies states, "If OpenGL 4.0 and ARB_tessellation_shader are not supported, references to tessellation control and evaluation shaders should be removed."
The "trivially" language is just the extension's way of saying, "if X isn't supported, then obviously any references to the stuff X implements should be ignored."
The interaction with ARB_separate_shader_objects isn't trivial because it involves redefining how the early depth test works.
The "interacts with" is an alternative to the "are required" wording. The ARB could have simply written it against 4.1 and firmly stated that 4.1 is required. Then there wouldn't have been nearly as many "interacts with" clauses, since none of those things are optional.
However, the ARB wanted to allow for the possibility of hardware that could support GL 3.0 concepts but not others. For example, in the mobile space, shader_image_load_store support could come before tessellation_shaders. That's why this extension has a lot of "interacts with" clauses and a fairly low "required" GL version. Despite the fact that on desktops, you will not find any implementation of ARB_shader_image_load_store paired with a version number less than 4.0.
What Hardware and what OpenGL version is necessary to use such Extensions?
None of these documents will tell you that. ARB_shader_image_load_store could be available on any implementation version 3.0 or greater.
The easiest and simplest way to find out what hardware supports what extensions is to use the OpenGL Extensions Viewer. It has a pretty up-to-date database of this information.
Alternatively, you can use some common sense. ARB_separate_shader_objects allows you to mix and match programs on the fly. This is something D3D has been doing since Direct3D 8. Obviously hardware could do it since shaders came into being; OpenGL simply didn't let you. Until now.
Obviously ARB_separate_shader_objects is not hardware-based.
Similarly ARB_shading_language_420pack contains many features that D3D has had since forever. Again, there's clearly nothing there that requires specialized hardware support.
ARB_tessellation_shader is obviously something that does require specialized hardware support. It introduces two new shaders stages. ARB_shader_image_load_store is the same way: it introduces a fundamental new hardware ability. Now, it is certainly possible that earlier hardware could have done it. But that seems unlikely.
This isn't always the case for every extension. But it is mostly true.
The other thing you should know about is OpenGL version numbers. Since 3.0, the ARB has been good about keeping to a strict version numbering scheme.
Major versions represent fundamental hardware changes. 3.x to 4.x is directly equivalent to D3D10 to D3D11. Minor versions are either making the API nicer (see ARB_texture_storage, something we were long overdue for) or exposing previously unexposed hardware features for the same hardware level (ARB_shader_image_load_store could have been implemented on any 4.0 implementation, but the ARB just took until 4.2 to write the extension).
So if you have hardware that can run 3.0, it can also run 3.3; if it doesn't have drivers for it, then your driver maker isn't doing their job. Same goes for 4.0 to 4.2.
These statements tell you how to read an extension specification correctly. To read one, you must know how the extension interacts with other OpenGL features. When it says the "extension interacts trivially with OpenGL 4.0", it means you must think of the core OpenGL 4.0 feature ARB_sample_shading and how this extension changes its behaviour.
Because it is in the OpenGL 4.2 core specification, ARB_shader_image_load_store should be supported on all Shader Model 5 / Direct3D 11 hardware (Radeon HD 5xxx and up, GeForce 400 and up).
So you can check whether your OpenGL version is >= 4.2, or check for the presence of the ARB_shader_image_load_store extension.
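On pre-3.0 contexts the extension check means scanning the space-separated list returned by `glGetString(GL_EXTENSIONS)`, and a naive strstr() can falsely match a prefix of a longer name (on 3.0+ core you would iterate `glGetStringi(GL_EXTENSIONS, i)` instead). A sketch of a whole-token match (the helper name `has_extension` is hypothetical):

```c
#include <string.h>

/* Return 1 if `name` appears as a whole space-delimited token in the
 * extension list, 0 otherwise. Avoids strstr()'s prefix false positives,
 * e.g. matching GL_ARB_texture_storage inside ..._storage_multisample. */
static int has_extension(const char *ext_list, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_list;
    while ((p = strstr(p, name)) != NULL) {
        int starts = (p == ext_list) || (p[-1] == ' ');
        int ends   = (p[len] == ' ') || (p[len] == '\0');
        if (starts && ends)
            return 1;
        p += len;  /* partial match: keep scanning past it */
    }
    return 0;
}
```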

Should I use EXT methods/constants in OpenGL <2.0

I'm coding a 3D game for PC with pretty low minimum system requirements in mind (anything better than Pentium II 400MHz and GeForce3 should be ok). I read in docs that this or that function started as EXT and ended up being included into OpenGL core in version 1.3 or 1.4.
In dglOpenGL headers there are both glBindFramebuffer and glBindFramebufferEXT methods with GL_FRAMEBUFFER and GL_FRAMEBUFFER_EXT constants. My question is - which version should I be using EXT or noEXT?
Is it possible that some Intel built-in GPU whose drivers meet only version 1.3 will accept glMethodEXT but crash on the same glMethod (without the EXT suffix)?
You should use what is available on that implementation. A core feature will be denoted by a version number. If you're expecting core FBO support, you would need to get a version 3.0 or greater.
Extension support is denoted by the extension string. You should check for available extensions at startup and you should not use any extension that isn't there.
Now, there are some ARB extensions which are "core extensions". This means that the #defines and functions do not have the ARB suffix. So ARB_framebuffer_object is an extension, but it defines glBindFramebuffer, without a suffix. This means that you can check for version 3.0 or the extension, and in either case, you use the same functions and #defines.
Core extensions almost always mean the exact same thing as the core equivalent. Non-core extensions can have changes. For example, ARB_geometry_shader4 is not a core extension, and the core geometry shader functionality from 3.2 is vastly different in terms of specification and API.
You generally should have some minimum OpenGL version that you accept, and then run different codepaths based on the presence of higher GL versions and/or extensions.