Failed to create D3D shaders - WebGL GLSL - glsl

I've been checking out the cool animations on GLSL Sandbox, but one of the demos isn't running for me, this one. The error isn't in compilation, though, but at runtime: it says "Failed to create D3D shaders". I looked into the sandbox source and I can see that it directly passes on the exception from WebGL, so it's not a problem with the website but with the code.
I walked through the code, but it's rather long and I see no mention of D3D shaders. It looks much like the other demos, which are working fine.
So...
What part of the code causes the problem? Could I comment something out to get it to work?
Is the problem browser-dependent? Or does it depend on my OpenGL version?

Related

Having trouble adding more Texture2D's as Pixel Shader resources in DirectX 11

I'm having a really annoying problem with my application, which is a school project. I'm using deferred rendering, and I'm trying to add positions from the light's point of view as a new g-buffer texture, and the depth buffer texture as a shader resource in the light pass. I handle all of the g-buffer textures in exactly the same way.
My problem is that these new shader resources are nowhere to be found on the GPU!
I'm using RenderDoc to debug my application; there I can see everything being written to these new resources just fine, and the call to bind them as shader resources looks good as well, but I still only have the 4 resources in the light pass that I had before.
My code is an absolute mess, and there's a lot(!) of it. So if you want to see something specific to be able to help me, I can post it.
I'd be really happy if I just got some tips as to how you go about debugging this kind of problem, and even happier if someone knows what the problem is.
Thanks in advance!
There are two essential first steps for debugging DirectX applications:
Make sure that you check, at runtime, the return value of every function that returns an HRESULT. If it were safe to ignore the return value, the function would return void. When checking the HRESULT, don't use == S_OK; use the FAILED or SUCCEEDED macro instead. A nice option for 'fast fail' errors is to use C++ exception handling via a ThrowIfFailed helper (see the sketches after these two steps).
Enable the DirectX Debug Device and look for errors, warnings, and other messages in the debug output window. For details on enabling it, see Anatomy of Direct3D 11 Create Device and Direct3D SDK Debug Layer Tricks.
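For the first point, a minimal helper might look like the sketch below. ThrowIfFailed is a widespread convention rather than part of the D3D API, and the exception type used here is just an assumption:

    #include <windows.h>
    #include <stdexcept>

    // Convert a failed HRESULT into a C++ exception ('fast fail').
    inline void ThrowIfFailed(HRESULT hr)
    {
        if (FAILED(hr))   // use FAILED/SUCCEEDED, never == S_OK
        {
            throw std::runtime_error("Direct3D call failed");
        }
    }

You would then wrap every D3D call that returns an HRESULT, e.g. ThrowIfFailed(device->CreateShaderResourceView(texture, &srvDesc, &srv)); with your own texture, descriptor, and view variables.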
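For the second point, here is a sketch of turning on the debug layer at device creation time. The wrapper function name is made up for illustration; the D3D11_CREATE_DEVICE_DEBUG flag and the D3D11CreateDevice call are the real API:

    #include <d3d11.h>

    // Sketch: create the device with the debug layer enabled in debug builds,
    // so errors and warnings show up in the Visual Studio output window.
    HRESULT CreateDeviceWithDebugLayer(ID3D11Device** device, ID3D11DeviceContext** context)
    {
        UINT flags = 0;
    #if defined(_DEBUG)
        flags |= D3D11_CREATE_DEVICE_DEBUG;   // the debug layer flag
    #endif

        D3D_FEATURE_LEVEL featureLevel;
        return D3D11CreateDevice(
            nullptr,                          // default adapter
            D3D_DRIVER_TYPE_HARDWARE,
            nullptr,                          // no software rasterizer module
            flags,
            nullptr, 0,                       // default feature-level list
            D3D11_SDK_VERSION,
            device, &featureLevel, context);
    }

Combined with the helper above: ThrowIfFailed(CreateDeviceWithDebugLayer(&device, &context));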
This may not solve your problem, but it can really help you avoid silly mistakes that would cost you a lot of time to track down.
If you are new to DirectX development, take a look at the DirectX Tool Kit tutorials.
For debugging DirectX 11 applications, the Visual Studio Graphics Diagnostics feature in VS 2012 or later is a good option. You likely meet the license requirements to use the VS 2013 or VS 2015 Community edition, which includes all features, including graphics diagnostics. See MSDN.

Modern OpenGL rendering pipeline

Okay, I have been studying OpenGL online, but most tutorials I have seen only cover the fixed pipeline. I am trying to add rendering to an object-oriented project, but I am not quite sure of the modern process with shaders and such. Is the process as easy as binding a buffer as well as a shader? And what exactly are handles used for? I have added GLEW and GLFW, though now my log says GLEW failed to initialize with error 1282; that's a whole different topic, unless GLEW and GLFW are incompatible. Can anyone shine a light on this subject?
The handles in OpenGL are just GLuint IDs, which, for example, are used to work with a VBO, a VAO, and similar objects.
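A minimal sketch of what that looks like in C/C++ terms (assuming a context has already been created and GLEW initialized; the function name and vertex data are made up for illustration):

    #include <GL/glew.h>

    // Handles are just GLuint names handed back by glGen* calls.
    void CreateTriangleBuffers(GLuint& vao, GLuint& vbo)
    {
        glGenVertexArrays(1, &vao);          // ask GL for a vertex-array object name
        glGenBuffers(1, &vbo);               // ask GL for a buffer object name

        glBindVertexArray(vao);              // later calls refer to the objects by handle
        glBindBuffer(GL_ARRAY_BUFFER, vbo);

        const float verts[] = { -0.5f, -0.5f,  0.5f, -0.5f,  0.0f, 0.5f };
        glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
    }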
As for shaders, they are written in the GLSL shading language; OpenGL then provides the functions to compile them and link them into a program for your OpenGL context.
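The compile-and-link step boils down to a handful of calls, roughly as in this sketch (error-log checks are left as comments, and BuildProgram is just an illustrative name):

    #include <GL/glew.h>

    // Compile a vertex and fragment shader and link them into a program object.
    GLuint BuildProgram(const char* vertexSrc, const char* fragmentSrc)
    {
        GLuint vs = glCreateShader(GL_VERTEX_SHADER);
        glShaderSource(vs, 1, &vertexSrc, nullptr);
        glCompileShader(vs);                       // check GL_COMPILE_STATUS / info log in real code

        GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(fs, 1, &fragmentSrc, nullptr);
        glCompileShader(fs);

        GLuint program = glCreateProgram();
        glAttachShader(program, vs);
        glAttachShader(program, fs);
        glLinkProgram(program);                    // check GL_LINK_STATUS in real code

        glDeleteShader(vs);                        // the program keeps what it needs
        glDeleteShader(fs);
        return program;                            // another GLuint handle, used via glUseProgram
    }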
Asking how shaders, handles, and environment setup work in OpenGL is a very broad question; you would be better off following a tutorial. A good one is OpenglDev, which covers all the basic concepts as well as some advanced ones. It's not OpenGL ES, but if you understand those tutorials, transitioning to OpenGL ES should be no problem. The Visual Studio solution is available for download here, and it comes with the project already set up with the required libraries.

Is There A Way I Can Debug A GLSL Shader?

Is there a way I can debug a GLSL shader, including breakpoints and data tracking?
I've seen simple tools that let me see which shaders make up my shader programs, but nothing I can put breakpoints in.
I need to check the values of matrices, and just writing values out through gl_FragColor won't work, since there are so many values to compare and check.
Is there a simple way of doing this, besides writing down what values I think I have and doing the math out elsewhere?
It's really annoying when I'm trying to understand all of OpenGL while already knowing how to navigate DirectX. I can see why some people get scared away from OpenGL when resources are hard to find.
According to the development updates for NVIDIA Nsight, they recently added features for GLSL debugging (https://developer.nvidia.com/nsight-visual-studio-edition-3_0-new-features). I would look there first. glslDevil (http://www.vis.uni-stuttgart.de/glsldevil/index.html) also looks good. I haven't tried either program myself, so I can't give first-hand experience of their quality. I have been impressed by NVIDIA's support for debugging in CUDA, though, so I have high expectations.

Tessellation in Go-GL

I'm trying to tessellate a simple triangle using the Golang OpenGL bindings.
The library doesn't claim support for the tessellation shaders, but I looked through the source code, and adding the correct bindings didn't seem terribly tricky. So I branched it and tried adding the correct constants in gl_defs.go.
The bindings still compile just fine, and so does my program; it's when I actually try to use the new bindings that things go strange. The program goes from displaying a nicely circling triangle to a black screen whenever I try to include the tessellation shaders.
I'm following along with the OpenGL Superbible (6th edition) and using their shaders for this project, so I don't imagine I'm using broken shaders (they don't spit out an error log, anyway). But in case the shaders themselves could be at fault, they can be found in the setupProgram() function here.
I'm pretty sure my graphics card supports tessellation, because printing the OpenGL version returns 4.4.0 NVIDIA 331.38.
So my questions:
Is there any reason adding Go bindings for tessellation wouldn't work? The bindings seem quite straightforward.
Am I adding the new bindings incorrectly?
If it should work, why is it not working for me?
What am I doing wrong here?
Steps that might be worth taking:
Your driver and video card may support tessellation shaders, but the GL context that your binding returns for you might be for an earlier version of OpenGL. Try glGetString(GL_VERSION) and see what you get.
Are you calling glGetError basically everywhere and actually checking its values? Does this binding provide error return values? If so, are you checking those? (See the sketch below.)
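Expressed against the raw GL entry points (this is a C/C++ sketch, not go-gl code; the binding should expose equivalents under similar names, and the function name here is made up), those checks look roughly like this:

    #include <GL/glew.h>
    #include <cstdio>

    // Confirm what the context actually provides before blaming the new bindings.
    void CheckTessellationSupport()
    {
        // What version did the context actually come back as?
        std::printf("GL_VERSION: %s\n", (const char*)glGetString(GL_VERSION));

        // Drain any pending errors (glGetError returns one code per call).
        for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
            std::printf("GL error: 0x%04X\n", err);

        // If the context doesn't really support tessellation, creating the new
        // stage will typically fail: glCreateShader returns 0 and sets GL_INVALID_ENUM.
        GLuint tcs = glCreateShader(GL_TESS_CONTROL_SHADER);
        if (tcs == 0 || glGetError() != GL_NO_ERROR)
            std::printf("tessellation control shaders not available in this context\n");
        else
            glDeleteShader(tcs);
    }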

Broken ANGLE-generated HLSL from WebGL shader

I have an issue with a WebGL shader that I've determined is related to ANGLE, because it only occurs on Windows in Firefox or Chrome, and it doesn't happen if I force desktop OpenGL (chrome --use-gl=desktop).
I've created a jsfiddle that shows the ANGLE-generated HLSL of my custom shader. (For the HLSL conversion to work in this jsfiddle, you must run Chrome with --enable-privileged-webgl-extensions, or just see my gist of the output.)
So I have working GLSL, and the generated HLSL compiles but doesn't do the same thing. The symptom is that on Windows the vertices appear in the correct initial locations but do not move when I change the uniform jed. I can't find the bug in the generated code.
Any tips for debugging problems like this?
Hard to say based on your info (which doesn't include the original GLSL). It's not hard to imagine this being fixed by the July 4 revisions to ANGLE, however. I would say update first.