Mixing the fixed-function pipeline and the programmable pipeline in OpenGL - C++

Okay, so here is my problem. I have a framework used by my school for visualizations, and I've been trying to set it up to do 3D graphics. The problem is that the framework currently only uses the fixed-function pipeline to draw. Without breaking that, I've been working around the old code, which still needs the fixed-function pipeline, while setting up facilities for creating shaders and shader programs. I've got a simple color shader to compile, and I've also made a test vertex array (a green triangle).
Now when I try to render it, the screen goes black. Beforehand, there were a lot of 2D sprites and whatnot moving about the screen, but stepping through the code I added to the render function, I found that the screen goes black the moment I call glUseProgram. If I comment out the glUseProgram call, and the parts where I set the uniforms and draw, everything works normally. Does glUseProgram disable the fixed-function pipeline? If so, is there any way to reactivate it?

The moment you call glUseProgram with a program object, the fixed-function pipeline is replaced by the programmable pipeline for the stages that program provides. You can't have fixed-function and programmable processing for the same stage at the same time. For example, suppose your scene contains fog: if you haven't taken care of that in your fragment shader, you won't see it in the final output.
In your render/draw function, however, you can switch back and forth:
void draw()
{
    glUseProgram(program);
    // render stuff with the shader
    glUseProgram(0);
    // render stuff with the fixed-function pipeline
}
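Fleshed out a little, that same draw function might look like the sketch below. The names program, colorLocation and triangleVao are hypothetical (they stand in for whatever your framework created), and a compatibility-profile context is assumed so that glUseProgram(0) really does fall back to fixed function:

void draw()
{
    // Programmable path: while a program is bound, it replaces fixed-function
    // vertex and fragment processing.
    glUseProgram(program);                                // hypothetical linked program
    glUniform4f(colorLocation, 0.0f, 1.0f, 0.0f, 1.0f);  // hypothetical uniform location
    glBindVertexArray(triangleVao);                       // hypothetical VAO with the test triangle
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glBindVertexArray(0);

    // Program 0 restores the fixed-function pipeline (only legal in a
    // compatibility profile), so the legacy 2D sprite code keeps working.
    glUseProgram(0);
    glBegin(GL_QUADS);
    // ... existing fixed-function 2D sprite drawing ...
    glEnd();
}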

Related

OpenGL GLSL multiple shaders freeze screen

I'm writing a little OpenGL program with GLSL.
Now I have two objects that I need to draw, each with its own shader.
Normally, I think I should do something like this in my draw() method:
void draw() {
    shaderObjektOne.bind();
    glBegin(xxx);
    // draw object one
    ...
    glEnd();

    shaderObjektTwo.bind();
    glBegin(xxx);
    // draw object two
    ...
    glEnd();
}
If I do it this way, my screen freezes.
Binding the shader for only one object works without problems.
I've been looking around, but I couldn't find a real explanation of why this error occurs.
Is it because a render target can only be rendered with one shader?
How can I avoid a giant shader file or having multiple render targets?
Thanks.
EDIT:
I want to have separately compiled shader programs for each object. These will be bound right before I draw the vertices for the object. I want to avoid one big shader in which I need to set specific parameters to choose the functionality for each object. I use GLUT, and currently all drawing is done before glutSwapBuffers().
'Freezing' means that there is actually something visible on the screen (the last object I've drawn with the last bound shader), but my input isn't working anymore. That means I cannot move the camera in the world, but the program is still running normally (tested with a debugger).
Got it.
It was a problem with my program design. I had accidentally created a copy of the shader that I wanted to bind.
Every time I tried to bind the shader, it bound the copy instead.
Thanks for your help, guys :)
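For anyone hitting the same symptom, a minimal sketch of per-object program binding under GLUT could look like this. programOne and programTwo are hypothetical GLuint handles, each linked once at startup and never copied, which is exactly what avoids the accidental-copy problem above:

void draw()
{
    glUseProgram(programOne);      // object one's own linked program
    glBegin(GL_TRIANGLES);
    // ... vertices of object one ...
    glEnd();

    glUseProgram(programTwo);      // object two's own linked program
    glBegin(GL_TRIANGLES);
    // ... vertices of object two ...
    glEnd();

    glUseProgram(0);               // back to fixed function for anything else
    glutSwapBuffers();
}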

Can you use glColor3f() whilst bypassing the fragment shader? [duplicate]


How do you use GLSL shaders in an OGRE application without using a material script?

I'm developing an application for a virtual reality environment using OGRE, Bullet and Equalizer. My rendering function looks like this:
root->_fireFrameStarted();
root->_fireFrameRenderingQueued();
root->_fireFrameEnded();
_window->update(false);
The window does not do the buffer swap, because Equalizer does that. This function works fine; we can even use particle systems and all the other fancy stuff OGRE offers.
However, since the projection area in our lab is curved, we have a GLSL module (let's call it Warp) we use in all our applications to distort the rendering output so that it fits our projection wall. We accomplish this by creating a texture, copying the contents of the back buffer to it and applying our warping shader when rendering the distorted texture covering the entire window.
You can find the source code here: pastebin.com/TjNJuHtC
We call this module in eq::Window::frameDrawFinish() and it works well with pure OpenGL applications but not with OGRE.
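For reference, a stripped-down sketch of such a copy-and-warp pass could look roughly like this. warpTexture and warpProgram are assumed names, the texture is assumed to have been allocated at window size with glTexImage2D beforehand, and identity modelview/projection matrices are assumed; the real code is in the pastebin above:

void warpPass(GLsizei width, GLsizei height)
{
    // Copy the back buffer into the previously allocated texture.
    glBindTexture(GL_TEXTURE_2D, warpTexture);   // hypothetical texture id
    glReadBuffer(GL_BACK);
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, width, height);

    // Draw a full-screen quad through the distortion shader.
    glUseProgram(warpProgram);                   // hypothetical warp program
    glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
    glEnd();
    glUseProgram(0);
    glBindTexture(GL_TEXTURE_2D, 0);
}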
This is the output without the module and its shader:
http://s1.directupload.net/images/130620/amr5qh3x.png
If I enable the module, this is the rather strange output:
http://s14.directupload.net/images/130620/74qiwgyd.png
And if I disable the two calls to glBindTexture, the texture used by the sun particle effect (flare.png) is rendered onto the screen (I would show it to you but I can only use two links).
Every GL state variable I consider relevant has the same value as in our other applications.
I checked GL_READ_BUFFER, GL_DRAW_BUFFER, GL_RENDER_MODE, GL_CULL_FACE_MODE and GL_DOUBLEBUFFER.
This raises some questions: why does my call to glCopyTexSubImage2D() seem to have no effect at all? And why does my shader not do anything, even if I tell it to just make every fragment red?
Supplemental: Putting the entire shader into a material script and letting OGRE handle it is not an option.
Solved the problem: I was creating my shaders before Ogre::Root created my windows. I changed the order and now it works.

OpenGL3 two sets of shaders, texture showing black

I've recently succeeded at making a small test app with a GL_TEXTURE_RECTANGLE. Now I'm trying to integrate it into my larger project, but when I call glBindTexture(GL_TEXTURE_RECTANGLE, _tex_id[0]) inside the render function, it causes a GL_INVALID_OPERATION error. The texture image sometimes shows for a fraction of a second, then turns black and stays black.
I am trying to do this by using two sets of vertex and fragment shaders, one set for the 3D scene, and one set for the 2D overlay, but I've never tried this before so I don't know if that's what's causing the error, or if I should be going about this a different way. The shaders are all compiling and linking fine.
Any insight would be much appreciated, and if it would help to see some code, let me know and I'll post some of it (although I think it may be too much for anyone to reasonably look through).
Edit: gDEBugger breaks at the call to glBindTexture(), and when I click on the breakpoint, the properties window shows a picture of one of my other textures (one that's being loaded by the 3D scene's shaders). It shows that it's trying to load texture number 1, but I know this number is already being used to draw the 3D scene's texture shown in the properties window... why would glGenTextures() give me overlapping texture id numbers? Is this normal, or maybe part of the problem?
The black texture was due to me not forwarding some vertex shader inputs (normals) through to the fragment shader, even though I'm not using normals for anything in the 2D overlay shaders. As soon as I added outputs for all the inputs and forwarded them along to the fragment shader, the texture was no longer black, but it was still disappearing after a fraction of a second.

That was because I was calling glBindTexture(GL_TEXTURE_RECTANGLE, 0) at the end of the render function in the hope that it would clean up some state... this was clearly the wrong thing to do, because removing that call made the 2D texture stay on-screen.

Furthermore, calling glBindTexture() with the GL_TEXTURE_RECTANGLE target seems to work during the texture setup stage, but during rendering the GL_TEXTURE_RECTANGLE target was causing the GL_INVALID_OPERATION error. Changing the target to GL_TEXTURE_2D only in the render function made the error go away, and everything seems to work nicely now.
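Put together, a minimal sketch of the working overlay draw might look like this; overlayProgram, overlayTexture and overlayVao are assumed names, not the poster's actual code:

void drawOverlay()
{
    glUseProgram(overlayProgram);                    // the 2D overlay shader pair
    glBindTexture(GL_TEXTURE_2D, overlayTexture);    // bound as GL_TEXTURE_2D at draw time
    glBindVertexArray(overlayVao);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    // Deliberately no glBindTexture(GL_TEXTURE_2D, 0) here - unbinding at the
    // end of the frame is what made the overlay disappear.
}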

DirectX post-processing shader

I have a simple application in which I need to let the user select a shader (an .fx HLSL or assembly file, possibly with multiple passes, but containing only pixel shaders) and preview it.
The application runs, the list of shaders comes up, and a button launches the "preview window."
From this preview window (which has a DirectX viewport in it), the user selects an image, and the shader is run on that image and displayed. Only one frame needs to be rendered (not real-time).
I have a vertex/pixel shader combination set up that takes a quad and renders it to the screen, textured with the chosen image. This works perfectly.
I need to then run another effect, purely pixel shader, on the output from the first effect, and display the final image (post-processed) to the screen. This doesn't work at all.
I've tried for the past few days to get it working, but for no apparent reason, the identical code blocks used to render each effect only render the first one. I can add the second shader file as a second pass in the first shader file and it runs perfectly (although that completely defeats my goal of previewing user-created shaders). When I try to use a second effect (which loads and compiles just fine), it does nothing.
I've taken the results of the first shader (with GetRenderTargetData) and placed them in a texture & surface (destTex and destSur), then set that texture as the input for the second pass (using dev->SetTexture and later effect->SetTexture("thisframe", destTex)).
All calls succeed, effects compile, textures load, quads are drawn, but the effect is not visible.
I suspected at first the device (created with software vertex processing) was causing the issue, but that doesn't seem to be the case (I tried with hardware and mixed).
Additionally, the second shader isn't visible with either a HAL or a REF device (which isn't a problem to try, since the app isn't real-time anyway).
Everything is written in C++ for Direct3D 9.
Try clearing the depth-stencil buffer after each time you render the quad.
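In Direct3D 9 terms, that is a single Clear call between the two quad draws, something like the line below (dev being the device pointer):

// Clear the depth buffer between the two quad passes (add D3DCLEAR_STENCIL
// too if the depth-stencil surface actually has a stencil channel).
dev->Clear(0, NULL, D3DCLEAR_ZBUFFER, 0, 1.0f, 0);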
First create a texture, then render the first shader directly into that texture. Finally, render the second shader, with that texture as input, to the back buffer.
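A hedged sketch of that render-to-texture arrangement in Direct3D 9 follows; dev, firstEffect, secondEffect, width/height and the quad-drawing step are assumptions, not the poster's actual code:

// Create an intermediate render-target texture once, outside the draw loop.
IDirect3DTexture9* rtTex = NULL;
dev->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                   D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &rtTex, NULL);
IDirect3DSurface9* rtSurf = NULL;
rtTex->GetSurfaceLevel(0, &rtSurf);

// Pass 1: draw the textured quad with the first effect into the texture.
IDirect3DSurface9* backBuffer = NULL;
dev->GetRenderTarget(0, &backBuffer);
dev->SetRenderTarget(0, rtSurf);
dev->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER, 0, 1.0f, 0);
// ... BeginScene / draw the quad with firstEffect / EndScene ...

// Pass 2: switch back to the back buffer and feed the texture to the post effect.
dev->SetRenderTarget(0, backBuffer);
dev->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER, 0, 1.0f, 0);
secondEffect->SetTexture("thisframe", rtTex);
// ... BeginScene / draw the quad with secondEffect / EndScene ...

backBuffer->Release();  // GetRenderTarget added a reference
// keep rtTex/rtSurf around for reuse; release them at shutdown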
There must be some kind of vertex input and vertex processing (either fixed-function or shader) in order for the pixel shader to be run. Are you supplying the vertex shader, and if so are you sure it does what the pixel shader expects? What does your draw call look like?
It's probably worth looking at a PIX trace of your app to see what the device state is when trying to use the user effect.