I've been struggling to set up OpenGL on Ubuntu.
My glxinfo output:
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) Ironlake Mobile
OpenGL version string: 2.1 Mesa 9.1.3
OpenGL shading language version string: 1.20
OpenGL extensions:
I have an NVIDIA GT 520M card; how can I use the OpenGL library provided by NVIDIA? I installed nvidia-current and nvidia-current-dev, but nothing changed. I need a newer version of OpenGL, but it seems my Intel integrated card only supports OpenGL 2.1.
My laptop:
Acer 4743G; 2 GB DDR3 memory; GT 520M graphics card.
Go to System Settings > Software & Updates > Additional Drivers tab.
Select the NVIDIA driver and reboot.
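Once the NVIDIA driver is active, it is worth double-checking which driver actually serves your contexts. Below is a minimal sketch that assumes GLFW 3 is installed (GLFW is not part of the original question); with the NVIDIA driver in use, GL_VENDOR should report "NVIDIA Corporation" instead of the Intel/Mesa strings shown above. Build with something like: gcc probe.c -lglfw -lGL

/* probe.c: sketch that prints which OpenGL driver provides the context.
   Assumes GLFW 3. */
#include <stdio.h>
#include <GLFW/glfw3.h>

int main(void) {
    if (!glfwInit()) return 1;
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);   /* no visible window needed */
    GLFWwindow *win = glfwCreateWindow(64, 64, "probe", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    /* With the NVIDIA driver in use, GL_VENDOR should read
       "NVIDIA Corporation" instead of the Intel/Mesa strings above. */
    printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}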
As part of my studies, I have to be able to compile and run OpenGL 4.5-based programs, and I only have a Mac to do so. I am using Ubuntu in a VM to have a more programming-friendly environment, but my software and hardware (macOS Sierra 10.12.6, Intel HD Graphics 4000) are limited to 3.3 contexts. Here is additional info (run from the Ubuntu VM):
$ glxinfo | grep 'OpenGL core'
OpenGL core profile version string: 3.3 (Core Profile) Mesa 18.0.5
OpenGL core profile shading language version string: 3.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
The assignments use GLFW. I've been trying to get it to use Mesa with software rendering (which it does, with llvmpipe) so that it is independent of the hardware capabilities (if I understand correctly), but window creation always fails when I try to create a 4.5 context. I looked around and tried using export LIBGL_ALWAYS_SOFTWARE=1, to no avail. The build chain uses CMake to build both the assignments and the GLFW library itself, for what it's worth (I noticed that it uses X11 with the xorg-dev packages).
Does Mesa implement any version of OpenGL 4? If it does, can I force software rendering on GLFW's part so that I can create an OpenGL 4.x context?
Does Mesa implement any version of OpenGL 4?
Their Intel & Radeon hardware drivers do.
... can I force software rendering on GLFW's part so that I can create an OpenGL 4.x context?
Nope, all of Mesa's software renderers (softpipe, llvmpipe, & swr) top out at OpenGL 3.3.
As of Mesa 20.2, llvmpipe supports OpenGL 4.5.
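So if you can get a Mesa 20.2 or newer stack in the VM, one option is to force llvmpipe and request a 4.5 core context. The following is only a minimal sketch, not the assignment's actual code; it assumes GLFW 3 and sets LIBGL_ALWAYS_SOFTWARE from inside the program (equivalent to the export above) before the driver is loaded. Build with something like: gcc gl45_soft.c -lglfw -lGL

/* gl45_soft.c: sketch of forcing Mesa's software rasterizer (llvmpipe)
   and requesting an OpenGL 4.5 core profile context via GLFW 3.
   Assumes Mesa >= 20.2; on older Mesa the window creation will fail. */
#include <stdio.h>
#include <stdlib.h>
#include <GLFW/glfw3.h>

int main(void) {
    /* Same effect as `export LIBGL_ALWAYS_SOFTWARE=1`; must be set
       before the GL driver is loaded, i.e. before glfwInit(). */
    setenv("LIBGL_ALWAYS_SOFTWARE", "1", 1);

    if (!glfwInit()) return 1;

    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 5);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

    GLFWwindow *win = glfwCreateWindow(640, 480, "GL 4.5 on llvmpipe", NULL, NULL);
    if (!win) {
        fprintf(stderr, "could not create a 4.5 core context\n");
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);

    printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}

If this succeeds, GL_RENDERER should mention llvmpipe; expect it to be slow, since everything is rasterized on the CPU.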
I have this laptop:
http://shop.lenovo.com/us/en/laptops/thinkpad/11e-series/11e-3rd-gen-intel/?menu-id=thinkpad_11e_3rd_gen_windows
on which I have installed Ubuntu. I am trying to determine what version of OpenGL the laptop supports, so I run:
glxinfo|more
which gives:
Extended renderer info (GLX_MESA_query_renderer):
Vendor: Intel Open Source Technology Center (0x8086)
Device: Mesa DRI Intel(R) HD Graphics 520 (Skylake GT2) (0x1916)
Version: 12.0.3
Accelerated: yes
Video memory: 3072MB
Unified memory: yes
Preferred profile: core (0x1)
Max core profile version: 4.3
Max compat profile version: 3.0
Max GLES1 profile version: 1.1
Max GLES[23] profile version: 3.1
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) HD Graphics 520 (Skylake GT2)
OpenGL core profile version string: 4.3 (Core Profile) Mesa 12.0.3
OpenGL core profile shading language version string: 4.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
GL_3DFX_texture_compression_FXT1, GL_AMD_conservative_depth,
GL_AMD_draw_buffers_blend, GL_AMD_seamless_cubemap_per_texture,
GL_AMD_shader_stencil_export, GL_AMD_shader_trinary_minmax,
From that, it looks like the graphics card in the laptop supports this OpenGL version:
Max core profile version: 4.3
But when I run:
glxinfo | grep "OpenGL version"
OpenGL version string: 3.0 Mesa 12.0.3
so maybe only version 3.0?
From this website:
https://learnopengl.com/#!Getting-started/OpenGL
it seems some pretty significant architectural changes were introduced in version 3.3, so it would be great if I could use that.
Which part of the above output tells me the correct version, and can I use OpenGL 3.3 on this machine?
Don't filter the output through grep; read all of it.
glxinfo reports separately:
the highest OpenGL Core Profile version available (in your case: 4.3)
the highest non-Core / Compatibility / < 3.2 OpenGL version available (in your case: 3.0)
the highest OpenGL ES 1 version available (1.1)
the highest OpenGL ES 2/3 version available (3.1)
The reason why Core and non-Core are reported separately is because drivers are allowed to not implement the Compatibility profile for OpenGL >= 3.2. That's precisely your case: Core gives you 4.3, non-Core only 3.0.
(Basically, OpenGL made a colossal mess of its versioning around 3.0 and 3.1, and nobody really talks about those versions. For mental simplicity, you can split the versioning between 3.2+ Core and pre-3.0.)
Similarly, OpenGL ES 1 and ES 2/3 are not compatible with each other, so you need to query both to figure out the respective supported versions. (ES 2 and 3 are compatible with each other, so they share one line.)
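If you want to see the same split programmatically rather than through glxinfo, here is a minimal sketch (it assumes GLFW 3, which is not part of the question) that creates one context with default hints and one with an explicit 3.2+ core profile request, and prints the version each context reports. Build with something like: gcc profiles.c -lglfw -lGL

/* profiles.c: sketch comparing the version reported by a default
   (legacy/compatibility-style) context and an explicit core profile
   context. Assumes GLFW 3. */
#include <stdio.h>
#include <GLFW/glfw3.h>

static void report(const char *label) {
    GLFWwindow *win = glfwCreateWindow(64, 64, label, NULL, NULL);
    if (!win) { printf("%s: context creation failed\n", label); return; }
    glfwMakeContextCurrent(win);
    printf("%s: %s\n", label, (const char *)glGetString(GL_VERSION));
    glfwDestroyWindow(win);
}

int main(void) {
    if (!glfwInit()) return 1;
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);   /* contexts only, no windows shown */

    /* Default hints: drivers hand back a legacy/compatibility context. */
    report("default context");

    /* Explicit 3.2+ core profile request; the driver may return a higher
       core version than the one asked for. */
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    report("core profile");

    glfwTerminate();
    return 0;
}

On the hardware above, the first line should match the 3.0 compatibility figure and the second the 4.3 core figure that glxinfo reports.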
Running the command glxinfo | grep OpenGL shows
OpenGL vendor string: VMware, Inc.
OpenGL renderer string: Gallium 0.4 on SVGA3D; build: RELEASE;
OpenGL version string: 2.1 Mesa 10.7.0-devel
OpenGL shading language version string: 1.20
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 2.0 Mesa 10.7.0-devel
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 1.0.16
I have an Intel HD 4000 graphics card, which supports OpenGL 3.3 according to sources on the internet.
Simply put, what should I do so that glxinfo shows version 3.3, so that I can proceed to learn modern graphics programming?
You're running in a VM. GPUs usually are not passed through to the VM; all you get is a shim driver that supports only a lower OpenGL version and forwards its commands from the VM to the host.
Solution: Run Linux natively on your box.
I'm having trouble running an OpenGL 3.3 program over SSH.
When I run:
glxinfo | grep -i opengl
on my own computer (Ubuntu 12.04), I get:
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: Quadro FX 580/PCIe/SSE2
OpenGL version string: 3.3.0 NVIDIA 304.116
OpenGL shading language version string: 3.30 NVIDIA via Cg compiler
OpenGL extensions:
When I SSH to the remote computer (Ubuntu 10.04) and run the same command, I get:
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: Quadro FX 580/PCIe/SSE2
OpenGL version string: 2.1.2 NVIDIA 304.116
OpenGL shading language version string: 1.20 NVIDIA via Cg compiler
OpenGL extensions:
For some reason I'm not getting the more up-to-date OpenGL version (3.3) when SSHing to the remote computer. Is there a workaround for this problem that doesn't require admin privileges on the remote computer?
OpenGL over SSH means using an indirect rendering context, which uses the GLX protocol to send OpenGL commands to the X server. The GLX protocol only goes up to OpenGL 2.1; there is no support for OpenGL 3 or later in GLX so far. Essentially, you're out of luck until someone finally gets around to specifying and implementing GLX support for OpenGL 3.
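A quick way to confirm this from the remote session is to ask GLX whether the context you get is direct or indirect. Here is a minimal sketch using plain Xlib/GLX (the file name is made up; build with gcc check_direct.c -lGL -lX11):

/* check_direct.c: sketch that asks GLX whether the context is direct.
   Over `ssh -X` this typically reports an indirect context, which is
   limited to the GLX protocol's OpenGL 2.1. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void) {
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open X display\n"); return 1; }

    static int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (!vi) { fprintf(stderr, "no suitable visual\n"); return 1; }

    /* Ask for a direct context; with a remote X server this silently
       falls back to an indirect one. */
    GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
    printf("direct rendering: %s\n", glXIsDirect(dpy, ctx) ? "yes" : "no");

    glXDestroyContext(dpy, ctx);
    XFree(vi);
    XCloseDisplay(dpy);
    return 0;
}

Over ssh -X this will usually print "no", which is exactly why the version string drops to 2.1.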
I have a late 2009 MacBook: http://support.apple.com/kb/SP579
I upgraded the operating system every time a new one came out, so I'm using 10.8 Mountain Lion now. I was trying to get a program to work when the developer suggested I check my OpenGL version with glxinfo.
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: NVIDIA GeForce 9400M OpenGL Engine
OpenGL version string: 2.1 NVIDIA-8.12.47 310.40.00.05f01
OpenGL shading language version string: 1.20
OpenGL extensions:
According to this https://developer.apple.com/graphicsimaging/opengl/capabilities/
I should have OpenGL version 3.2, but I don't. I've also looked for newer NVIDIA drivers, but the NVIDIA driver download website doesn't show any for OS X 10.8 (Mountain Lion) and the GeForce 9400M. The OpenGL website says that, unlike on other operating systems, OpenGL on OS X is updated along with OS updates.
What should I do? How do I get OpenGL 3.2?
I check my OpenGL version with glxinfo
glxinfo goes through the X11 server. The OpenGL support of the X11 server that ships with OS X is rather limited. You must check with a tool that queries the native OpenGL API of OS X, not the GLX emulation layer.
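For example, here is a minimal sketch that bypasses X11/GLX entirely and asks CGL, the native OS X OpenGL API, for a 3.2 Core Profile context (the file name is made up; build with clang check_gl32.c -framework OpenGL). Whether the 3.2 context is actually granted depends on the OS version and GPU:

/* check_gl32.c: sketch that queries the native OS X OpenGL stack (CGL)
   for a 3.2 Core Profile context, bypassing X11/GLX entirely. */
#include <stdio.h>
#include <OpenGL/OpenGL.h>   /* CGL */
#include <OpenGL/gl3.h>      /* core profile entry points */

int main(void) {
    CGLPixelFormatAttribute attrs[] = {
        kCGLPFAOpenGLProfile, (CGLPixelFormatAttribute)kCGLOGLPVersion_3_2_Core,
        (CGLPixelFormatAttribute)0
    };
    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;
    if (CGLChoosePixelFormat(attrs, &pix, &npix) != kCGLNoError || pix == NULL) {
        fprintf(stderr, "no 3.2 Core Profile pixel format available\n");
        return 1;
    }

    CGLContextObj ctx = NULL;
    if (CGLCreateContext(pix, NULL, &ctx) != kCGLNoError) {
        fprintf(stderr, "context creation failed\n");
        return 1;
    }
    CGLSetCurrentContext(ctx);

    printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));

    CGLSetCurrentContext(NULL);
    CGLDestroyContext(ctx);
    CGLDestroyPixelFormat(pix);
    return 0;
}

The key point is the same one the answer makes: the OpenGL 2.1 figure from glxinfo describes the X11/GLX path, not what the native stack can provide when you explicitly request a core profile.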