I recently switched my laptop's OS from Windows 10 to Fedora Linux. After this I tried to load and run my current C++ SFML project. However, when it attempts to load my geometry shader I just get this:
Failed to create a shader: your system doesn't support geometry shaders (you should test Shader::isGeometryAvailable() before trying to use geometry shaders)
I know my system should support geometry shaders, as it worked just fine previously on Windows. My laptop has a quad-core AMD Ryzen 5 2500U with Radeon Vega Mobile Gfx, using the amdgpu driver. Is this an issue with drivers? Could my software have caused this issue? (If so, let me know and I will edit this post.)
The result of sf::Shader::isAvailable() is true. The result of sf::Shader::isGeometryAvailable() is false.
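For context, this is roughly the check I'm doing (a minimal sketch; the shader file name here is just a placeholder, not my actual asset):

    #include <SFML/Graphics.hpp>
    #include <iostream>

    int main()
    {
        std::cout << std::boolalpha;
        std::cout << "Shaders available:          " << sf::Shader::isAvailable() << "\n";
        std::cout << "Geometry shaders available: " << sf::Shader::isGeometryAvailable() << "\n";

        sf::Shader shader;
        // "geometry.geom" stands in for my real geometry shader file.
        if (sf::Shader::isGeometryAvailable() &&
            !shader.loadFromFile("geometry.geom", sf::Shader::Geometry))
        {
            std::cerr << "Failed to load geometry shader\n";
        }
    }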
If anyone knows how to fix this issue it would be excellent.
I ported a DirectX 11 application to UWP to deploy it on Xbox Series X|S, and hardware tessellation shaders are not working when running the app on Xbox (tested on retail Xbox Series X and Series S in devmode). The rendered geometry doesn't show up in the viewport, but no errors are thrown. Running the same app locally on my PC renders the tessellated geometry without any issues. After reading through this blog post and the follow-up, I made sure my application is running in game mode and that a DirectX 11 Feature Level 11.0 context is created (creating a 10.1 context produces errors when trying to use tessellation primitives, as expected). Rendering statistics from the app suggest that there are vertex shader invocations but no hull shader invocations afterwards, and hull shader invocations but no domain shader invocations afterwards.
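For reference, the device creation is essentially the standard D3D11CreateDevice call with an explicit 11.0 feature level; this is a minimal sketch of that pattern, not my exact code:

    #include <d3d11.h>
    #include <wrl/client.h>

    bool CreateDevice11_0(Microsoft::WRL::ComPtr<ID3D11Device>& device,
                          Microsoft::WRL::ComPtr<ID3D11DeviceContext>& context)
    {
        // Request Feature Level 11.0 explicitly so hull/domain shaders
        // (tessellation) should be usable.
        const D3D_FEATURE_LEVEL requested[] = { D3D_FEATURE_LEVEL_11_0 };
        D3D_FEATURE_LEVEL obtained = D3D_FEATURE_LEVEL_9_1;

        HRESULT hr = D3D11CreateDevice(
            nullptr,                          // default adapter
            D3D_DRIVER_TYPE_HARDWARE,
            nullptr,
            D3D11_CREATE_DEVICE_BGRA_SUPPORT, // typical for UWP interop
            requested, ARRAYSIZE(requested),
            D3D11_SDK_VERSION,
            device.GetAddressOf(), &obtained, context.GetAddressOf());

        // obtained reports the feature level actually granted.
        return SUCCEEDED(hr) && obtained >= D3D_FEATURE_LEVEL_11_0;
    }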
Because I assumed there to be some sort of subtle bug in the tessellation implementation of my app, I next tried the SimpleBezierUWP sample app from Microsoft. The result is the same: On PC, it renders just fine but when running on Xbox, the geometry is missing. This applies to both the DX11 and DX12 version of that sample app.
To recreate this bug just download the SimpleBezierUWP sample, build it and deploy it to a retail Xbox Series X or S in devmode.
So has anyone successfully used tessellation shaders in a UWP application on Xbox Series X or S? Is it not supported after all, even if DirectX Feature Level 11.0 is? Or are there special requirements for writing hull and domain shaders for this specific hardware that I was not able to find out about from publicly available sources?
Thanks!
Yesterday my OpenGL classes started, and my classmates and I downloaded a sample project made by my teacher to learn from. On their PCs it worked perfectly, but all I'm seeing is white: the objects are white and the ground is white, while on my classmates' PCs the objects are rendered correctly.
Does anyone know why it won't render the textures on my PC?
I've tried a few things, such as turning my Intel graphics card off and running it on my Nvidia card, and turning my Nvidia card off to run it on the Intel, but neither worked.
My Nvidia card is an Nvidia GeForce 710M, which supports OpenGL 4.5, and I have OpenGL version 4.5.
I'm not getting any errors in the code.
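(By "no errors" I mean nothing is reported; OpenGL errors do have to be polled explicitly, so a helper roughly like this is what I'd still use to check around the texture calls. The helper name is made up for this sketch, and it assumes GLEW or a similar loader is already set up:)

    #include <GL/glew.h>
    #include <cstdio>

    // Hypothetical helper: drain and print any pending OpenGL errors.
    // Call after suspect calls, e.g. checkGlErrors("after texture upload").
    static void checkGlErrors(const char* label)
    {
        for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
            std::fprintf(stderr, "GL error 0x%04X at %s\n", err, label);
    }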
I am able to see shadows: I've just tried a different project and only saw shadows; the rest was all white, as it was with the first project.
I've re-installed the required libraries, then rebooted my PC, and it worked :D Thank you for all your help.
I'm developing a 3D stereoscopic OpenGL app specifically for Windows 7 and nVidia Quadro K5000 cards. Rendering the scene from left and right-eye perspectives using glDrawBuffer(GL_BACK_LEFT) and glDrawBuffer(GL_BACK_RIGHT) works fine, and the 3D effect is displayed nicely.
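The per-frame structure is essentially this (a minimal sketch of what I described; the drawScene functions and the assumption of a PFD_STEREO pixel format stand in for my actual setup):

    #include <windows.h>
    #include <GL/gl.h>

    // Placeholders for the real scene rendering; names are made up for this sketch.
    void drawSceneFromLeftEye();
    void drawSceneFromRightEye();

    // One frame of quad-buffered stereo rendering.
    void renderStereoFrame(HDC hdc)
    {
        glDrawBuffer(GL_BACK_LEFT);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        drawSceneFromLeftEye();

        glDrawBuffer(GL_BACK_RIGHT);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        drawSceneFromRightEye();

        SwapBuffers(hdc);   // both back buffers are presented together
    }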
While this works, I'd like to use nVidia's nSight Graphics local debugging. However, I get the error "Cannot enter frame debugging. nSight only supports frame debugging for ... OpenGL 4.2. Reason: glDrawBuffer(bufs[i] = 0x00000402)"
If the calls to glDrawBuffer are removed, nSight local debugging works.
Going through the OpenGL 4.2 spec, DrawBuffer is described in section 4.2.1.
So, two questions:
1) Is there some other way (besides DrawBuffer) to specify BACK_RIGHT or BACK_LEFT buffers for drawing to quad-buffers?
2) Is nSight capable of doing frame-level debugging on quad-buffered stereoscopic setups? If so, how?
I am having trouble getting valid values in my fragment shader's gl_PointCoord variable. I use libgdx, a cross-platform Java framework that allows the same application to run on the desktop as well as on Android. The shader works fine with OpenGL ES on Android; only the desktop seems not to provide a correctly interpolated value, but always zero.
Could this be an issue with libgdx or with the graphics driver?
NVidia Quadro 3000M
Driver 275.33
Win 7 64-bit (Service Pack 1)
libgdx-0.9.6
FYI: I haven't done much research yet, but it seems to be a bug in LWJGL or the driver that gl_PointCoord is only available when enabling point-sprite mode via
Gdx.gl20.glEnable(GL11.GL_POINT_SPRITE_OES);
This is not available in OpenGL 4.2 or OpenGL ES 2.0, but it apparently needs to be set explicitly on the desktop.
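In raw desktop GL terms the workaround amounts to something like the sketch below. The enum values are written out because they come from the compatibility profile (GL_POINT_SPRITE_OES has the same value as desktop GL_POINT_SPRITE); the extra GL_VERTEX_PROGRAM_POINT_SIZE enable is an assumption on my part, since desktop GL commonly needs it for gl_PointSize from the vertex shader to take effect:

    // Desktop-only workaround, sketched in raw GL calls.
    #ifndef GL_POINT_SPRITE
    #define GL_POINT_SPRITE              0x8861
    #endif
    #ifndef GL_VERTEX_PROGRAM_POINT_SIZE
    #define GL_VERTEX_PROGRAM_POINT_SIZE 0x8642
    #endif

    void enableDesktopPointSprites()
    {
        glEnable(GL_POINT_SPRITE);               // so gl_PointCoord is interpolated
        glEnable(GL_VERTEX_PROGRAM_POINT_SIZE);  // so gl_PointSize from the vertex shader is honored
    }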
I compiled Joe Groff's "An intro to modern OpenGL: Hello World: The Slideshow".
I have compiled it using MinGW-w64 with freeglut, GLUT 3.7, and a version that creates my own context.
However, when I run the program, the image doesn't fade back and forth like it's supposed to, and I can't figure out why (I've spent a whole day on it).
Also, I have examined most of the inputs and outputs except for the shaders and can't find anything wrong; does anyone have any ideas?
Most likely, your OpenGL version doesn't support shaders. Are you by any chance running in a virtual machine or via remote desktop? These tend to only support OpenGL 1.1 even if the graphics card/drivers are much more recent, and OpenGL 1.1 does not support shaders. It's also possible that if you're using an older laptop with an integrated Intel GPU that shaders are not (properly) supported.
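A quick way to confirm what you actually got is to print the context strings right after the context is created; a minimal sketch, assuming a current GL context exists at that point (e.g. after glutCreateWindow):

    #include <GL/freeglut.h>   // or whatever GL header your build already uses
    #include <cstdio>

    // Print what the driver actually gave us; under a VM or remote desktop
    // this often reports a generic software renderer and version 1.1.
    void printGlInfo()
    {
        std::printf("GL_VENDOR:   %s\n", (const char*)glGetString(GL_VENDOR));
        std::printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
        std::printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));
    }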