I have an issue with a WebGL shader that I've determined is related to ANGLE, because it only occurs on Windows in Firefox or Chrome, and it doesn't happen if I force desktop OpenGL (chrome --use-gl=desktop).
I've created a jsfiddle that shows the ANGLE-generated HLSL of my custom shader. (For the HLSL conversion to work in this jsfiddle, you must run Chrome with --enable-privileged-webgl-extensions, or just see my gist of the output.)
So I have working GLSL, and the generated HLSL compiles but doesn't do the same thing. The symptom is that on Windows the vertices appear in their correct initial locations, but they do not move even though I change the uniform jed. I can't find the bug in the generated code.
Any tips for debugging problems like this?
Hard to say based on the information given (which doesn't include the original GLSL). It's not hard to imagine this being fixed by the July 4 revisions to ANGLE, however, so I would say update first.
Related
I've been checking out the cool animations on GLSL Sandbox, but one of the demos isn't running for me: this one. The error isn't in compilation, though, but at runtime: it says it "Failed to create D3D shaders". I looked into the sandbox source and can see that it directly passes on the exception from WebGL, so it's not a problem with the website but with the code.
I walked through the code, but it's rather long and I see no mention of D3D shaders. It looks much like the other demos, which are working fine.
So...
What part of the code causes the problem? Could I comment something out to get it to work?
Is the problem browser-dependent? Or does it depend on my OpenGL version?
Whenever I try to render my terrain with point lights, it only works on my NVIDIA GPU and driver, not on the Intel integrated GPU and its driver. I believe the problem is in my code and that the NVIDIA GPU is letting it slide, since I've heard NVIDIA's OpenGL implementations are lenient and will let you get away with things you're not supposed to do. And since I get no errors, I need help debugging my shaders.
Link:
http://pastebin.com/sgwBatnw
Note:
I use OpenGL 2 and GLSL Version 120
Edit:
I was able to fix the problem on my own. To anyone with similar problems: it's not because I used the regular transformation matrix for the normals, because when I did that I set the normal's w value to 0.0. The problem was that with the Intel integrated graphics there is apparently a maximum number of array elements in a uniform (or a maximum uniform size in general), and I was going over that limit, but the driver wasn't reporting it. Another thing wrong with the code was that I was relying on implicit type conversion (dividing vec3s by floats), so I corrected those things and it started to work. Here's my updated code.
Link: http://pastebin.com/zydK7YMh
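For anyone hitting the same wall, here is a minimal sketch (C++ with GLEW, assuming a current GL context; this is not taken from the code above) of how you can query the driver's uniform and varying limits before relying on large uniform arrays:

    // Query the implementation limits instead of assuming the NVIDIA driver's
    // generous values also hold on Intel integrated graphics.
    #include <GL/glew.h>
    #include <cstdio>

    void printShaderLimits()
    {
        GLint vertUniforms = 0, fragUniforms = 0, varyings = 0;
        glGetIntegerv(GL_MAX_VERTEX_UNIFORM_COMPONENTS, &vertUniforms);
        glGetIntegerv(GL_MAX_FRAGMENT_UNIFORM_COMPONENTS, &fragUniforms);
        glGetIntegerv(GL_MAX_VARYING_FLOATS, &varyings);
        std::printf("max vertex uniform components:   %d\n", vertUniforms);
        std::printf("max fragment uniform components: %d\n", fragUniforms);
        std::printf("max varying floats:              %d\n", varyings);
    }

If a uniform array pushes past these limits, some drivers fail silently rather than reporting a link error, which matches the behaviour described above.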
I'm trying to tessellate a simple triangle using the Golang OpenGL bindings.
The library doesn't claim support for tessellation shaders, but I looked through the source code, and adding the correct bindings didn't seem terribly tricky. So I branched it and tried adding the correct constants in gl_defs.go.
The bindings still compile just fine, and so does my program; it's when I actually try to use the new bindings that things go strange. The program goes from displaying a nicely circling triangle to a black screen whenever I try to include the tessellation shaders.
I'm following along with the OpenGL Superbible (6th edition) and using their shaders for this project, so I don't imagine I'm using broken shaders (they don't spit out an error log, anyway). But in case the shaders themselves could be at fault, they can be found in the setupProgram() function here.
I'm pretty sure my graphics card supports tessellation, because printing the OpenGL version returns 4.4.0 NVIDIA 331.38.
So my questions:
Is there any reason adding Go bindings for tessellation wouldn't work? The bindings seem quite straightforward.
Am I adding the new bindings incorrectly?
If it should work, why is it not working for me?
What am I doing wrong here?
Steps that might be worth taking:
Your driver and video card may support tessellation shaders, but the GL context that your binding returns might be for an earlier version of OpenGL. Try glGetString(GL_VERSION) and see what you get.
Are you calling glGetError basically everywhere and actually checking its values? Does this binding provide error return values? If so, are you checking those?
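A minimal sketch of both checks (shown here as C++ against the raw GL API, assuming a current context; the equivalent calls exist in the Go bindings):

    #include <GL/gl.h>
    #include <cstdio>

    // 1. Confirm the context you were actually given is 4.0+ (required for tessellation).
    void printContextInfo()
    {
        std::printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
        std::printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
    }

    // 2. Drain the GL error queue; call this after each suspect call.
    void checkGLError(const char *label)
    {
        for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
            std::fprintf(stderr, "GL error 0x%04x after %s\n", err, label);
    }

If the reported context version is below 4.0, the new constants will still compile into your program, but the driver can reject the tessellation stages at runtime, which could produce exactly the silent black screen you describe.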
I'm working on an OpenGL project on Windows, using GLEW to provide the functionality the provided Windows headers lack. For shader support, I'm using NVIDIA's Cg. All the documentation and code samples I have read indicate that the following is the correct method for loading and using shaders, and I've implemented things this way in my code:
Create a Cg context with cgCreateContext.
Get the latest vertex and pixel shader profiles using cgGLGetLatestProfile with CG_GL_VERTEX and CG_GL_FRAGMENT, respectively. Use cgGLSetContextOptimalOptions to create the optimal setup for both profiles.
Using these profiles and shaders you have written, create shader programs using cgCreateProgramFromFile.
Load the shader programs using cgGLLoadProgram.
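In code, that setup looks roughly like this (a sketch only; the file name "shader.cg" and the entry points "mainVS"/"mainPS" are placeholders, and I've written it with the cgGLSetOptimalOptions variant):

    #include <Cg/cg.h>
    #include <Cg/cgGL.h>

    CGcontext context;
    CGprofile vertProfile, fragProfile;
    CGprogram vertProgram, fragProgram;

    void setupCg()
    {
        context = cgCreateContext();

        // Latest profiles the current GL driver supports, with optimal compile options.
        vertProfile = cgGLGetLatestProfile(CG_GL_VERTEX);
        fragProfile = cgGLGetLatestProfile(CG_GL_FRAGMENT);
        cgGLSetOptimalOptions(vertProfile);
        cgGLSetOptimalOptions(fragProfile);

        // Compile from source, then hand the compiled programs to the driver.
        vertProgram = cgCreateProgramFromFile(context, CG_SOURCE, "shader.cg",
                                              vertProfile, "mainVS", nullptr);
        fragProgram = cgCreateProgramFromFile(context, CG_SOURCE, "shader.cg",
                                              fragProfile, "mainPS", nullptr);
        cgGLLoadProgram(vertProgram);
        cgGLLoadProgram(fragProgram);
    }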
Then, each frame, for an object that uses a given shader:
Bind the desired shader(s) (vertex and/or pixel) using cgGLBindProgram.
Enable the profile(s) for the desired shader(s) using cgGLEnableProfile.
Retrieve and set any needed uniform shader parameters using cgGetNamedParameter and the various parameter setting functions.
Render your object normally.
Clean up the shader by calling cgGLDisableProfile.
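And the per-frame path, continuing from the setup sketch above, again only as a sketch (drawObject() stands in for the normal GL draw calls, and "modelViewProj" is just an example parameter name):

    void drawObject();  // placeholder for the ordinary GL rendering calls

    void drawWithShader()
    {
        // Bind the programs, then enable their profiles (the order I currently use).
        cgGLBindProgram(vertProgram);
        cgGLBindProgram(fragProgram);
        cgGLEnableProfile(vertProfile);
        cgGLEnableProfile(fragProfile);

        // Look up and set any uniform parameters the shader needs.
        CGparameter mvp = cgGetNamedParameter(vertProgram, "modelViewProj");
        cgGLSetStateMatrixParameter(mvp, CG_GL_MODELVIEW_PROJECTION_MATRIX,
                                    CG_GL_MATRIX_IDENTITY);

        drawObject();

        // Disable the profiles so later objects aren't affected.
        cgGLDisableProfile(vertProfile);
        cgGLDisableProfile(fragProfile);
    }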
However, things start getting strange. When using a single shader everything works just fine, but the act of loading a second shader with cgGLLoadProgram seems to make objects using the first one cease to render. Switching the draw order seems to resolve the issue, but that's hardly a fix. This problem occurs on both my and my partner's laptops (fairly recent machines with Intel integrated chipsets).
I tested the same code on my desktop with a GeForce GTX 260, and everything worked fine. I would just write this off as my laptop GPU not getting along with Cg, but I've successfully built and run programs that use several Cg shaders simultaneously on my laptop using the OGRE graphics engine (unfortunately the assignment I'm currently working on is for a computer graphics class, so I can't just use OGRE).
In conclusion, I'm stumped. What is OGRE doing that my code is not? Am I using Cg improperly?
You have to call cgGLEnableProfile before you call cgGLBindProgram. From your question it appears you do it the other way around.
From the Cg documentation for cgGLBindProgram:
cgGLBindProgram binds a program to the current state. The program must have been loaded with cgGLLoadProgram before it can be bound. Also, the profile of the program must be enabled for the binding to work. This may be done with the cgGLEnableProfile function.
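So the per-frame sequence should look something like this (a sketch, reusing the placeholder names from the question):

    // Enable each profile first, then bind its program, per the documentation above.
    cgGLEnableProfile(vertProfile);
    cgGLBindProgram(vertProgram);
    cgGLEnableProfile(fragProfile);
    cgGLBindProgram(fragProgram);

    // ... set parameters and draw as before ...

    cgGLDisableProfile(vertProfile);
    cgGLDisableProfile(fragProfile);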
I'm having trouble developing an OpenGL application.
The weird thing is that a friend of mine and I are developing a 3D scene with OpenGL under Linux, with the code in a shared repository. If we both check out the same latest version, that is, the SAME code, this happens: on his computer, after he compiles, he can see the full lighting model, whilst on mine only the ambient lights are active, not the diffuse or specular ones.
Could it be a problem with the drivers (since he uses an ATI card and I use an NVIDIA one)?
Or with the static libraries?
I repeat: it is the same code, compiled on different machines. That's the strange thing; it should look the same.
Thanks for any help or tip given.
This can very easily be a driver problem, or one card supporting extensions that the other does not.
Try his binaries on your machine. If it continues to fail, either your drivers are whack or you're using a command that isn't supported by your card. On the other hand, if your screen looks right when running the binaries compiled on his machine, then your static libraries have a problem.
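A quick way to compare the two environments is to log what each driver actually reports, for example (a sketch, assuming a current pre-3.0-style GL context):

    #include <GL/gl.h>
    #include <cstdio>

    void dumpGLInfo()
    {
        std::printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
        std::printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
        std::printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
        // On a pre-3.0 context the extension list is one space-separated string.
        std::printf("GL_EXTENSIONS:\n%s\n", (const char *)glGetString(GL_EXTENSIONS));
    }

Diffing this output between the two machines will quickly show whether one card is missing an extension or GL version that the lighting path depends on.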