I downloaded the drivers for Intel(R) HD Graphics, version 8.15.10.2993, which are known to support Shader Model 3.0. But after installation I checked with the Geeks3D GPU Caps Viewer and by calling gl.glGetString(GL2.GL_VERSION), and both report only 2.1.0. Thanks!
GL 2.1.0 / GLSL 1.20 is exactly what you should expect for Shader Model 3 hardware. GL 3.x / GLSL >= 1.30 requires Shader Model 4 hardware.
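You can confirm what the created context actually reports by querying the version strings yourself. A minimal sketch in C++, assuming a context has already been created and made current and a loader such as GLEW has been initialized (it provides the GL 2.0 enums):

#include <GL/glew.h>
#include <iostream>

// Print the GL, renderer and GLSL version strings of the current context.
void print_gl_versions()
{
    std::cout << "GL_VERSION:  " << glGetString(GL_VERSION) << '\n'
              << "GL_RENDERER: " << glGetString(GL_RENDERER) << '\n'
              << "GLSL:        " << glGetString(GL_SHADING_LANGUAGE_VERSION) << '\n';
}

On Shader Model 3 hardware you should expect output along the lines of "2.1.0" and "1.20" here.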
From my understanding, modern OpenGL starts with version 3.3, which is mostly the same as 4.0. Both versions were released at the same time, and 3.3 exists mainly for backward compatibility with older graphics cards. In my application I need to use tessellation control. All of the sources I have found online say tessellation starts with 4.0, but they also say 3.3 is virtually the same.
Does OpenGL 3.3 have tessellation or not? Should I even bother with 3.3 in 2021, or should I just use the latest version, 4.6? Is 4.6 compatible with Mac?
Tessellation shaders have been part of the core standard since OpenGL 4.0. However, you can test whether your hardware supports the ARB_tessellation_shader extension, which is written against the OpenGL 3.2 specification.
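A minimal sketch of such a check in C++, assuming a GL 3.x core context where extensions are enumerated with glGetStringi and a loader such as GLEW has been initialized (the helper name has_extension is illustrative):

#include <GL/glew.h>
#include <cstring>

// Return true if the current context advertises the named extension.
bool has_extension(const char* name)
{
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (GLint i = 0; i < count; ++i) {
        const char* ext = reinterpret_cast<const char*>(glGetStringi(GL_EXTENSIONS, i));
        if (ext && std::strcmp(ext, name) == 0)
            return true;
    }
    return false;
}

// Usage: if (has_extension("GL_ARB_tessellation_shader")) { /* use tessellation */ }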
I have an Intel HD4400 graphics card in my laptop. When I run glxinfo | grep "OpenGL", I get
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) Haswell Mobile
OpenGL core profile version string: 4.5 (Core Profile) Mesa 18.0.0-rc5
OpenGL core profile shading language version string: 4.50
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.0 Mesa 18.0.0-rc5
OpenGL shading language version string: 1.30
OpenGL context flags: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.1 Mesa 18.0.0-rc5
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.10
OpenGL ES profile extensions:
This should mean that my graphics card can support GLSL version 4.30, right? However, my shader compilation fails with the following message
error: GLSL 4.30 is not supported. Supported versions are: 1.10, 1.20, 1.30, 1.00 ES, 3.00 ES, and 3.10 ES
In my code, I do set the context profile to core, with the following statement
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE)
after which I set the context major and minor versions to 4 and 3, respectively.
Any ideas? I am on Ubuntu 18.04. I thought it might be the graphics drivers that are not up to date, but if they are not up to date, then would glxinfo still tell me that version 4.50 is supported? It also seems like Ubuntu has the latest Intel graphics drivers installed already, and I do not want to risk installing graphics drivers that might break my display.
Additional Info:
I ran this exact code on another machine, and it worked perfectly (and I am quite sure that machine does not support GL 4.3). I therefore do not believe the problem is the way I created my context. Here is the code where I set the context profile and version:
void set_gl_attribs()
{
    if (SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE))
    {
        cout << "Could not set core profile" << endl;
    }
    if (SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 4))
    {
        cout << "Could not set GL major version" << endl;
    }
    if (SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3))
    {
        cout << "Could not set GL minor version" << endl;
    }
}
And this is where I call the code (before I call any other SDL or OpenGL functions):
SDL_Init(SDL_INIT_EVERYTHING);
window = SDL_CreateWindow("Sphere", 100, 100, width, height, SDL_WINDOW_OPENGL);
gl_context = SDL_GL_CreateContext(window);
//SDL_HideWindow(window);
set_gl_attribs(); // where I set the context, as shown above
SDL_GL_SetSwapInterval(1);
glewExperimental = GL_TRUE;
GLuint glew_return = glewInit();
The setup code is incorrect. You must choose a core profile before creating the context; changing the profile after the context exists is too late and will not work. Also, according to the SDL2 documentation, all calls to SDL_GL_SetAttribute must be made before the window is created via SDL_CreateWindow.
Move all calls to SDL_GL_SetAttribute so they come before SDL_CreateWindow.
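A minimal sketch of the corrected ordering, based on the code from the question (error handling omitted for brevity):

SDL_Init(SDL_INIT_EVERYTHING);

// Set all context attributes BEFORE the window and context are created.
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 4);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);

window = SDL_CreateWindow("Sphere", 100, 100, width, height, SDL_WINDOW_OPENGL);
gl_context = SDL_GL_CreateContext(window);

SDL_GL_SetSwapInterval(1);
glewExperimental = GL_TRUE;
glewInit();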
The reason this incorrect code appears to work on other machines is that different drivers provide different OpenGL versions depending on whether you ask for a compatibility or a core profile context:
Mesa will only provide 3.0 for compatibility contexts,
macOS will only provide 2.1 for compatibility contexts,
NVidia and AMD drivers on Windows will provide the latest version they support.
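To see which version and profile the driver actually handed you, query the created context. A minimal sketch, assuming a current context and an initialized loader (GL_CONTEXT_PROFILE_MASK is only available on GL 3.2+ contexts):

GLint major = 0, minor = 0, mask = 0;
glGetIntegerv(GL_MAJOR_VERSION, &major);
glGetIntegerv(GL_MINOR_VERSION, &minor);
glGetIntegerv(GL_CONTEXT_PROFILE_MASK, &mask);
cout << "Context: " << major << "." << minor
     << ((mask & GL_CONTEXT_CORE_PROFILE_BIT) ? " (core)" : " (compatibility)")
     << endl;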
I am using Linux and trying to learn OpenGL. I am referring to the website learnopengl.com and have compiled the first program available here.
I seem to compile the program without problems with the command
g++ -Wall -lGLEW -lglfw -lGL -lX11 -lpthread -lXrandr -lXi ./foo.cpp
But when I run the program, the window shows whatever was behind it instead of a blank window, like this
Note, however, that the window is not transparent: if I move it, the background stays the same. But if I minimize and reopen the window, the background is captured anew. This happens with both the dedicated and the integrated GPU.
What do I need to do to make it work properly?
Here is some info
System: Host: aditya-pc Kernel: 4.4.13-1-MANJARO x86_64 (64 bit gcc: 6.1.1)
Desktop: KDE Plasma 5.6.4 (Qt 5.6.0) Distro: Manjaro Linux
Machine: System: HP (portable) product: HP Notebook v: Type1ProductConfigId
Mobo: HP model: 8136 v: 31.36
Bios: Insyde v: F.1F date: 01/18/2016
CPU: Dual core Intel Core i5-6200U (-HT-MCP-) cache: 3072 KB
flags: (lm nx sse sse2 sse3 sse4_1 sse4_2 ssse3 vmx) bmips: 9603
clock speeds: max: 2800 MHz 1: 583 MHz 2: 510 MHz 3: 670 MHz 4: 683 MHz
Graphics: Card-1: Intel Skylake Integrated Graphics bus-ID: 00:02.0
Card-2: Advanced Micro Devices [AMD/ATI] Sun XT [Radeon HD 8670A/8670M/8690M / R5 M330]
bus-ID: 01:00.0
Display Server: X.Org 1.17.4 drivers: ati,radeon,intel
Resolution: 1920x1080#60.06hz
GLX Renderer: Mesa DRI Intel HD Graphics 520 (Skylake GT2)
GLX Version: 3.0 Mesa 11.2.2 Direct Rendering: Yes
Here is the info for the two graphics cards.
AMD card
$ DRI_PRIME=1 glxinfo|grep OpenGL
OpenGL vendor string: X.Org
OpenGL renderer string: Gallium 0.4 on AMD HAINAN (DRM 2.43.0, LLVM 3.8.0)
OpenGL core profile version string: 4.1 (Core Profile) Mesa 11.2.2
OpenGL core profile shading language version string: 4.10
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.0 Mesa 11.2.2
OpenGL shading language version string: 1.30
OpenGL context flags: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.0 Mesa 11.2.2
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.00
OpenGL ES profile extensions:
Intel integrated
$ DRI_PRIME=0 glxinfo|grep OpenGL
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) HD Graphics 520 (Skylake GT2)
OpenGL core profile version string: 3.3 (Core Profile) Mesa 11.2.2
OpenGL core profile shading language version string: 3.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.0 Mesa 11.2.2
OpenGL shading language version string: 1.30
OpenGL context flags: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.1 Mesa 11.2.2
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.10
OpenGL ES profile extensions:
Technically, what you experience is undefined behaviour. Here's what's happening: your program asks the graphics system to create a new window, and because this window is going to be used with OpenGL, the library you're using (GLFW) asks the graphics system not to clear the window on its own. (The reason is that, when playing animations, having the graphics system clear the window between frame draws would cause flicker, which is unwanted.)
So that new window is created, and it needs graphics memory. These days there are two ways of presenting a window to the user: off-screen rendering with on-screen composition, or an on-screen framebuffer window with pixel ownership testing. In your case you are getting the latter, which means your window is essentially just a rectangle in the main screen framebuffer, and when it is newly created, whatever was there before will remain in the window until it gets drawn with the desired contents.
Of course this "pixels left behind" behaviour is specified nowhere; it's an artifact of how the most reasonable implementation behaves (it behaves the same on Microsoft Windows with Aero disabled).
What is specified is that when a window gets moved around, its contents move along with it. Graphics hardware, even as old as from the late 1980s, has special circuitry built in to do this efficiently. But this works only as long as the window stays within the screen framebuffer. If you drag the window partially outside of the screen area and back again, you'll see the border pixels replicated along the edges.
So what do you have to do to get a defined result? You'll have to actually draw something to the window (it might be as simple as clearing it), and if it's a double-buffered window, swap the buffers.
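A minimal sketch of such a render loop with GLFW, assuming the window from the tutorial has already been created and its context made current:

// Clear the window every frame and swap buffers, so its contents are
// always defined instead of showing leftover framebuffer pixels.
while (!glfwWindowShouldClose(window))
{
    glClearColor(0.2f, 0.3f, 0.3f, 1.0f); // any colour will do
    glClear(GL_COLOR_BUFFER_BIT);

    // ... draw your scene here ...

    glfwSwapBuffers(window); // present the freshly cleared back buffer
    glfwPollEvents();
}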
I am trying to run the GS demo code of Mesa from here: http://cgit.freedesktop.org/mesa/demos/tree/src/gs, obtained by git cloning this: http://cgit.freedesktop.org/mesa/demos
However, I get the error "needs GL_ARB_geometry_shader4 extension".
I am not that up to date with how Mesa development is going, but it seems that the GL_ARB_geometry_shader4 extension is not available in Mesa?
As per this link: http://lists.freedesktop.org/archives/mesa-dev/2014-August/065692.html, geometry shader support has been added for the Intel Sandy Bridge platform.
I also came across this link: http://dri.freedesktop.org/wiki/MissingFunctionality/, which lists the GL_ARB_geometry_shader4 extension under "Missing Functionality".
Considering all of this, how should I proceed to write my applications with geometry shaders using Mesa?
I believe this extension is only supported on Nvidia GPUs, which is why you can't use it.
Edit: You don't need this extension to use geometry shaders. The example here, http://ogldev.atspace.co.uk/www/tutorial27/tutorial27.html, should work perfectly fine on Intel GPUs.
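For reference, a minimal sketch of compiling a pass-through geometry shader through the core GL 3.2+ API, with no GL_ARB_geometry_shader4 involved. It assumes a core profile context and an initialized loader such as GLEW; the function name compile_geometry_shader is illustrative:

#include <GL/glew.h>

// A pass-through geometry shader: emits the incoming triangle unchanged.
static const char* gs_src =
    "#version 150 core\n"
    "layout(triangles) in;\n"
    "layout(triangle_strip, max_vertices = 3) out;\n"
    "void main() {\n"
    "    for (int i = 0; i < 3; ++i) {\n"
    "        gl_Position = gl_in[i].gl_Position;\n"
    "        EmitVertex();\n"
    "    }\n"
    "    EndPrimitive();\n"
    "}\n";

GLuint compile_geometry_shader()
{
    GLuint gs = glCreateShader(GL_GEOMETRY_SHADER); // core enum since GL 3.2
    glShaderSource(gs, 1, &gs_src, nullptr);
    glCompileShader(gs);

    GLint ok = GL_FALSE;
    glGetShaderiv(gs, GL_COMPILE_STATUS, &ok);
    // On failure, glGetShaderInfoLog(gs, ...) reports the driver's message.
    return ok ? gs : 0;
}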
I'm learning GLSL to do some computer graphics experiments. While trying to compile shaders from some tutorials, I ran into a problem. Here it is:
GL version: 3.0.0 - Build 8.15.10.2291
Error compiling shader type 35633: 'ERROR: 0:1: '' : Version number not supported by OGL driver
The video card of my laptop is a GT 520M, but I've already updated my video card driver to version 340.82, which is said to support OpenGL 4.5 and GLSL 4.50.