How to apply a shader to a specific object - C++

I have several objects in my scene. I want to apply my shader to only one of them. Environment: OpenGL 2.0, C++, GLUT, GLEW.

The shader program is only in effect for as long as it is installed. Only the draw calls you make while the program is installed will use the shader. You must install your shader, draw your object, and then uninstall the shader.
Edit: By "installing" the shader I mean calling glUseProgram with your shader's handle. By "uninstalling" it I mean either installing another shader or calling glUseProgram with an argument of 0. See the glUseProgram documentation; my "install/uninstall" terminology comes from there.
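A minimal sketch of that pattern in a GLUT display callback, assuming shaderProgram is the handle returned by glCreateProgram/glLinkProgram and that drawMyObject()/drawOtherObjects() are your own drawing routines:

void display() {
    // Install the shader: every draw call from here on uses it.
    glUseProgram(shaderProgram);
    drawMyObject();

    // Uninstall it (back to the fixed-function pipeline in GL 2.0),
    // so the remaining objects are drawn without the shader.
    glUseProgram(0);
    drawOtherObjects();

    glutSwapBuffers();
}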

In your draw call, draw that object with the shader bound and draw the other ones without it; it can't really get any simpler than that. You could store in your object class which shader is enabled for that object, and only draw it through that shader when it is supposed to be used. Of course, if it's a full-screen pixel shader you're in trouble, because it processes every pixel and renders a new image to the display, unless you have a way of passing the object as a parameter and an algorithm that applies the alterations only at the location of that object.
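One way to express that per-object choice is a sketch like this, assuming an Object struct of your own with a shader member that is 0 when no shader should be used:

struct Object {
    GLuint shader;   // 0 = draw with the fixed-function pipeline
    // ... vertex data, transform, etc.
};

void drawObject(const Object& obj) {
    glUseProgram(obj.shader);   // binding program 0 disables shading
    // ... issue the draw calls for obj here ...
    glUseProgram(0);
}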

Related

OpenGL GLSL multiple shaders freeze screen

I'm writing a little OpenGL program with GLSL.
I have two objects that I need to draw, each with a different shader.
Normally, I think, I should do something like this in my draw() method:
void draw() {
    shaderObjektOne.bind();
    glBegin(xxx);
    // draw object one
    ...
    glEnd();

    shaderObjektTwo.bind();
    glBegin(xxx);
    // draw object two
    ...
    glEnd();
}
If I do it this way, my screen freezes.
Binding the shader for only one object works without problems.
I've been looking around, but I couldn't find a real explanation for why this error occurs.
Is it because a render target can only be rendered with one shader?
How can I avoid a giant shader file or having multiple render targets?
Thanks.
EDIT:
I want to have separate compiled shader programs for each object. These will be bound right before I draw the vertices for the object. I want to avoid one big shader in which I need to set specific parameters to choose the functionality for the object. I use GLUT, and currently all drawing is done before glutSwapBuffers().
"Freezing" means that there is actually something visible on the screen (the last object I drew with the last bound shader), but my input isn't working anymore. That means I cannot move the camera in the world, but the program is still running normally (tested with a debugger).
Got it.
It was a problem with my program design: I had accidentally added a copy of the shader I wanted to bind.
Every time I tried to bind the shader, it bound the copy instead.
Thanks for your help guys :)

How to apply a vertex shader to all vertices in a scene in OpenGL?

I'm working on a small engine in OpenTK right now, and I've got shaders working so far. I wonder, though, how it is possible to apply a shader to an entire scene. I've seen this done in Minecraft, for example, where someone created a shader that warped the entire scene. But since every object is rendered with its own shader active, how would I achieve this?
You seem to be referring to a technique called post processing. The way it works is that you first render the entire scene to a texture using the shaders you already have. You can then render this texture to the screen using a fragment shader to apply various effects like motion blur, warping or depth of field.
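A rough sketch of that setup in plain OpenGL (the OpenTK calls map one-to-one); width, height, sceneTexture, fbo, postShader and the drawSceneWithPerObjectShaders()/drawFullscreenQuad() helpers are assumptions standing in for your own code:

// One-time setup: a texture and an FBO to render the scene into.
GLuint sceneTexture, fbo;
glGenTextures(1, &sceneTexture);
glBindTexture(GL_TEXTURE_2D, sceneTexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, sceneTexture, 0);
glBindFramebuffer(GL_FRAMEBUFFER, 0);

// Every frame: render the scene into the texture, then post-process it.
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
drawSceneWithPerObjectShaders();
glBindFramebuffer(GL_FRAMEBUFFER, 0);

glUseProgram(postShader);            // the warping/blur/etc. fragment shader
glBindTexture(GL_TEXTURE_2D, sceneTexture);
drawFullscreenQuad();                // two triangles covering the screen
glUseProgram(0);

In practice you also attach a depth renderbuffer to the FBO so depth testing still works while the scene is drawn.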
"But since every object is rendered with its own shader active"
That's not how OpenGL works. In fact, there's no such thing as a "model" (which is probably what you mean by "object") in OpenGL. OpenGL draws primitives (points, lines and triangles) one at a time. Furthermore, there's no hard association between a set of primitives and the shaders being used.
It's trivial to just bind a single shader program at the beginning of a batch and every primitive of that batch is subjected to this shader. If the batch consists of the whole scene, then the whole scene uses that shader.
AFAIK, you can only bind one vertex shader at a time.
What you may want to try is to render to a texture first, then re-render that texture onto the screen while applying some changes to it (warping it, for example). You can also extract the depth buffer and use it if you have a more complex change you want to apply.
If you bind the shader you want before the render loop, it will affect all items until you unbind it (i.e. bind program id 0) or disable GL_TEXTURE_2D via glEnable()/glDisable().

OpenGL transform feedback definition completely inside shader

I'm trying to get my transform feedback running. I wanted to specify my buffer layout completely from the shaders, using core 4.4 or the GL_ARB_enhanced_layouts extension with layout (xfb_offset=xx) qualifiers. I assumed that after declaring these in a vertex shader I can call
GLint iTransformFeedbackVars;
glGetProgramiv(m_uProgramID, GL_TRANSFORM_FEEDBACK_VARYINGS, &iTransformFeedbackVars);
to get the number of potential variables that want to be written to a transform feedback buffer. But OpenGL keeps returning 0 for iTransformFeedbackVars. I tried calling the above command BEFORE and AFTER linking the program.
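For reference, the kind of declaration I mean looks roughly like this in the vertex shader (the variable names are just placeholders):

#version 440
layout (xfb_buffer = 0, xfb_offset = 0)  out vec3 outPosition;
layout (xfb_buffer = 0, xfb_offset = 12) out vec3 outNormal;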
Am I missing something here, or is it even possible to let the shader specify the variables it wants to write and have my code create the buffer(s) according to what the shader declares?

VBO with and without shaders OpenGL C++

I'm trying to implement modern OpenGL, but the problem is that most tutorials are based on 3.3+ and talk about GLSL 330, while I only have GLSL 130. Therefore, many things are apparently different, since my VBOs do not work.
Could you give me general hints or a tutorial that explains how to use GLSL 130 with VBOs? In my case, I have the VBO loaded, but when I use my shader program, only vertices issued with glVertex get rendered; it's as if the VBO gets ignored (no input). How can I solve this?
And can you use VBOs without shaders? I tried to do that, but it crashed...
Yes, VBOs can still be used with GLSL 130, and they can even be used without shaders. The purpose of a VBO is to hold the vertex attributes for drawing. Most up-to-date tutorials I've seen have you use the layout location specifier to indicate how to address the different attributes in your shader, i.e.
layout(location = 0) in vec3 Position;
This isn't supported in GLSL 130, so you need another way of relating the attribute to the VBO. It's pretty simple: you can use glBindAttribLocation or glGetAttribLocation. Calling glGetAttribLocation gives you the identifier you need to pass to glVertexAttribPointer to associate the VBO data with that particular attribute; you can call it at any time after the program has been linked. Alternatively, you can call glBindAttribLocation to set the identifier that will be associated with a given attribute name yourself, as long as you call it after you've created the program object but before you link it. This is handy because it lets you decide what the location should be, just as you would with the layout specifier.
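A sketch of the glGetAttribLocation route, assuming a linked program object named program, a VBO named vbo holding tightly packed positions, and an attribute declared as "in vec3 Position;" in the GLSL 1.30 vertex shader:

// After linking, ask the program where the attribute ended up.
GLint posLoc = glGetAttribLocation(program, "Position");   // -1 if not found

// Point that attribute at the data in the VBO and draw with it.
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glEnableVertexAttribArray(posLoc);
glVertexAttribPointer(posLoc, 3, GL_FLOAT, GL_FALSE, 0, (void*)0);

glUseProgram(program);
glDrawArrays(GL_TRIANGLES, 0, vertexCount);

glDisableVertexAttribArray(posLoc);

With glBindAttribLocation the drawing code is the same; you just call glBindAttribLocation(program, 0, "Position") before glLinkProgram and then use 0 as the attribute index directly.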
Finally, if you want to use a VBO without any shader at all, you still have to associate the data in the VBO with the various inputs the fixed-function pipeline expects. This is done with the now-deprecated glEnableClientState() and glVertexPointer() calls, which together let you tell OpenGL which fixed-function attribute you're going to populate and how to find its data in the VBO.
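And a minimal sketch of the no-shader case, again assuming a VBO of tightly packed positions:

glBindBuffer(GL_ARRAY_BUFFER, vbo);

// Tell the fixed-function pipeline that vertex positions
// come from the currently bound VBO, starting at offset 0.
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, (void*)0);

glDrawArrays(GL_TRIANGLES, 0, vertexCount);

glDisableClientState(GL_VERTEX_ARRAY);
glBindBuffer(GL_ARRAY_BUFFER, 0);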

How do you use GLSL shaders in an OGRE application without using a material script?

I'm developing an application for a virtual reality environment using OGRE, Bullet and Equalizer. My rendering function looks like this:
root->_fireFrameStarted();
root->_fireFrameRenderingQueued();
root->_fireFrameEnded();
_window->update(false);
The window does not do the buffer swap, because Equalizer does that. This function works fine, we can even use particle systems and all the other fancy stuff OGRE offers.
However, since the projection area in our lab is curved, we have a GLSL module (let's call it Warp) we use in all our applications to distort the rendering output so that it fits our projection wall. We accomplish this by creating a texture, copying the contents of the back buffer to it and applying our warping shader when rendering the distorted texture covering the entire window.
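Stripped down, the idea is roughly this (a sketch rather than the actual module; warpShader, screenTexture and drawWarpedGrid() stand in for the real code in the paste, and screenTexture was created earlier with glTexImage2D at the window size):

// Copy the back buffer into the texture.
glBindTexture(GL_TEXTURE_2D, screenTexture);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, windowWidth, windowHeight);

// Redraw the frame through the warping shader.
glUseProgram(warpShader);
glBindTexture(GL_TEXTURE_2D, screenTexture);
drawWarpedGrid();          // geometry that maps the texture onto the curved wall
glUseProgram(0);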
You can find the source code here: pastebin . com/ TjNJuHtC
We call this module in eq::Window::frameDrawFinish() and it works well with pure OpenGL applications but not with OGRE.
This is the output without the module and its shader:
http://s1.directupload.net/images/130620/amr5qh3x.png
If I enable the module, this is the rather strange output:
http://s14.directupload.net/images/130620/74qiwgyd.png
And if I disable the two calls to glBindTexture, the texture used by the sun particle effect (flare.png) is rendered onto the screen (I would show it to you but I can only use two links).
Every GL state variable I consider relevant has the same value as in our other applications.
I checked GL_READ_BUFFER, GL_DRAW_BUFFER, GL_RENDER_MODE, GL_CULL_FACE_MODE and GL_DOUBLEBUFFER.
This raises some questions: why does my call to glCopyTexSubImage2D() seem to have no effect at all? And why does my shader not do anything, even when I tell it to just make every fragment red?
Supplemental: Putting the entire shader into a material script and letting OGRE handle it is not an option.
Solved the problem: I had created my shaders before Ogre::Root created my windows. I changed the order and now it works.