Simple task: draw a fullscreen quad with a texture, nothing more, so we can be sure the texture fills the whole screen space. (We will do some more shader magic later.)
Drawing a fullscreen quad with a simple fragment shader was easy, but now we have been stuck for a whole day trying to make it textured. We read plenty of tutorials, but none of them helped us. Those about SDL mostly use OpenGL 1.x, and those about OpenGL 2.0 are not about texturing, or not about SDL. :(
The code is here. Everything is in colorLUT.c, and the fragment shader is in colorLUT.fs. The result is a window of the same size as the image, and if you comment out the last line in the shader, you get a nice red/green gradient, so the shader itself is fine.
Texture initialization hasn't changed compared to OpenGL 1.4, so tutorials for that version will work fine.
If the fragment shader works but you don't see the texture (and get a black screen), texture loading is broken or the texture hasn't been set up correctly. Disable the shader and try displaying a textured polygon with fixed-function functionality.
You may want to call glPixelStorei(GL_UNPACK_ALIGNMENT, 1) before initializing the texture. The default value is 4.
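For reference, a minimal texture-upload sketch; textureId and surface->pixels are placeholders for whatever your SDL loading code provides, and the format arguments should match your actual image data:

    GLuint textureId;
    glGenTextures(1, &textureId);
    glBindTexture(GL_TEXTURE_2D, textureId);

    /* Tightly packed rows; the default alignment of 4 can skew odd-width images. */
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

    /* Without mipmaps you must use a non-mipmap minification filter,
       otherwise the texture is incomplete and samples as black. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                 surface->w, surface->h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, surface->pixels);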
An easier way to align the texture to the screen is to add a vertex shader and pass texture coordinates, instead of trying to calculate them from gl_FragCoord.
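As a sketch of that approach in GLSL 1.20 (the attribute and varying names are made up; hook them up to your own vertex data):

    // vertex shader
    #version 120
    attribute vec2 position;   // fullscreen quad corners in NDC: (-1,-1)..(1,1)
    varying vec2 texCoord;
    void main()
    {
        texCoord = position * 0.5 + 0.5;        // map NDC to [0,1] texture space
        gl_Position = vec4(position, 0.0, 1.0);
    }

    // fragment shader
    #version 120
    uniform sampler2D tex;
    varying vec2 texCoord;
    void main()
    {
        gl_FragColor = texture2D(tex, texCoord);
    }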
You're passing the surface size into the "resolution" uniform. That is an error; you should be passing the viewport size instead.
You may want to generate mipmaps. Either generate them yourself, or use the GL_GENERATE_MIPMAP texture parameter, which is available in OpenGL 2 (but was deprecated in later versions).
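For instance, assuming the parameter is set on the bound texture before the glTexImage2D call (the driver rebuilds the chain whenever the base level is uploaded):

    /* Ask the driver to build the mipmap chain automatically on upload.
       Legal in OpenGL 1.4-2.1; removed from later core profiles. */
    glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);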
OpenGL.org has specifications for OpenGL 2.0 and the matching GLSL versions (1.10/1.20). Download them and use them as a reference when in doubt.
The NVIDIA OpenGL SDK has examples you may want to check; they cover shaders.
And there's the OpenGL "Orange Book" (OpenGL Shading Language), which deals specifically with shaders.
Next time, include the code in the question.
Related
Is there any way to draw a texture to the screen without using a shader? Something like the immediate mode (glBegin, glTexCoord, etc.) that was removed in GL 3.1. I know that vertex arrays and index buffers are necessary, but what about the shader? I just need to show a simple texture fullscreen.
So basically no. The proper OpenGL 3 way is to make a shader.
I'm currently learning to create shadows using GLSL, but I have some troubles here:
1. In GLSL 3.3, we can use this statement in the fragment shader:
layout(location = 0) out float fragmentdepth;
to write only a 16-bit depth out to a texture (set up as GL_DEPTH_COMPONENT16 beforehand), but how can I do something like that in OpenGL 2.1 (GLSL 1.20)?
2. As far as I know, for rendering the depth buffer, we only need to change the camera position to the light position and the camera direction to the light direction, and change them back when drawing the real scene. Is that right?
In GLSL 3.3, we can use this statement in the fragment shader ... to write only a 16-bit depth out to a texture (set up as GL_DEPTH_COMPONENT16 beforehand)
No, you can't.
That will set fragmentdepth to write to a color buffer. And you cannot attach an image with the GL_DEPTH_COMPONENT16 image format to a GL_COLOR_ATTACHMENTi attachment point of an FBO. Attempting to do so will give you a GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT error.
In all versions of GLSL, the only way to write to the attached depth buffer is to write to gl_FragDepth. And in all versions of GLSL, you can only have one depth buffer attached to the FBO. Image Load/Store does allow you to work around that, but you lose depth testing and such.
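In GLSL 1.20 terms, a depth-only pass can be as simple as the sketch below; writing gl_FragDepth explicitly is actually optional here, since gl_FragCoord.z is used when you don't write it:

    // fragment shader for a depth-only pass (GLSL 1.20)
    #version 120
    void main()
    {
        // No color output needed; the depth attachment receives the fragment depth.
        gl_FragDepth = gl_FragCoord.z;
    }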
As stated here, you need to use the FBO extension (EXT_framebuffer_object) if you are targeting OpenGL 2.1. Also, why on earth are you still using the fixed pipeline? It is deprecated. If your hardware allows it, use OpenGL 3.3+ and leverage (core) FBOs with texture attachments (or renderbuffers) to which you can render depth buffer data. Yes, you can still do the same with the deprecated profile, but modern OpenGL has made a huge step forward since then.
Anyway, here is what you need for your version.
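A rough sketch of a depth-only FBO with the EXT entry points (completeness checks and error handling omitted; shadowMapSize is a placeholder for your shadow map resolution):

    /* Depth texture that the shadow pass renders into. */
    GLuint depthTex, fbo;
    glGenTextures(1, &depthTex);
    glBindTexture(GL_TEXTURE_2D, depthTex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT16,
                 shadowMapSize, shadowMapSize, 0,
                 GL_DEPTH_COMPONENT, GL_UNSIGNED_SHORT, NULL);

    /* FBO with only a depth attachment; no color buffer is drawn or read. */
    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                              GL_TEXTURE_2D, depthTex, 0);
    glDrawBuffer(GL_NONE);
    glReadBuffer(GL_NONE);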
I've been having a problem trying to get OpenGL 3.2 to work and after spending a few hours trying to figure out what was wrong I realized that it does not support glBegin. I use that command probably about 50-100 times in my engine to draw full screen quads and GUI elements. So what is a simple way to just draw a rectangle with OpenGL 3.2? Do I actually have to create a vertex buffer, fragment shader, and vertex shader to do something so simple?!
Do I actually have to create a vertex buffer, fragment shader, and vertex shader to do something so simple?!
Yep, no freebies in Core profile.
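For illustration, a bare-bones sketch of what that boils down to in a 3.2 core context; shader compile/link error checks are left out, and vertexSrc/fragmentSrc are just a trivial solid-color placeholder pair:

    /* Two triangles covering the screen in NDC. */
    static const float quad[] = {
        -1.f, -1.f,   1.f, -1.f,   1.f,  1.f,
        -1.f, -1.f,   1.f,  1.f,  -1.f,  1.f,
    };

    static const char *vertexSrc =
        "#version 150\n"
        "in vec2 position;\n"
        "void main() { gl_Position = vec4(position, 0.0, 1.0); }\n";

    static const char *fragmentSrc =
        "#version 150\n"
        "out vec4 color;\n"
        "void main() { color = vec4(1.0, 0.0, 0.0, 1.0); }\n";

    /* One-time setup: VAO + VBO + shader program. */
    GLuint vao, vbo, vs, fs, prog;
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(quad), quad, GL_STATIC_DRAW);

    vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &vertexSrc, NULL);
    glCompileShader(vs);
    fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &fragmentSrc, NULL);
    glCompileShader(fs);
    prog = glCreateProgram();
    glAttachShader(prog, vs);
    glAttachShader(prog, fs);
    glBindAttribLocation(prog, 0, "position");
    glLinkProgram(prog);

    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, (void *)0);

    /* Per frame: */
    glUseProgram(prog);
    glBindVertexArray(vao);
    glDrawArrays(GL_TRIANGLES, 0, 6);

In practice you wrap this in a small helper once and reuse it for every GUI element, so the boilerplate only hurts the first time.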
Question: How do I render points in OpenGL using GLSL?
Info: a while back I made a gravity simulation in Python and used Blender to do the rendering. It looked something like this. As an exercise I'm porting it over to OpenGL and OpenCL. I actually already have it working in OpenCL, I think. It wasn't until I spent a fair bit of time working in OpenCL that I realized that it is hard to know whether it is right without being able to see the result. So I started playing around with OpenGL. I followed the OpenGL GLSL tutorial on Wikibooks, which was very informative, but it didn't cover points or particles.
I'm at a loss for where to start. Most tutorials I find are for the default OpenGL pipeline; I want to do it using GLSL. I'm still very new to all this, so forgive my potential idiocy if the answer is right beneath my nose. What I'm looking for is how to make halos around the points that blend into each other. I have a rough idea of how to do this in the fragment shader, but as far as I'm aware I can only grab the pixels that are enclosed by polygons created by my points. I'm sure there is a way around this, it would be crazy for there not to be, but in my newbishness I'm clueless. Can someone give me some direction here? Thanks.
I think what you want is to render the particles as GL_POINTS with GL_POINT_SPRITE enabled, then use your fragment shader to either map a texture in the usual way, or generate the halo gradient procedurally.
When you are rendering in GL_POINTS mode, set gl_PointSize in your vertex shader to set the size of the particle. The vec2 variable gl_PointCoord will give you the coordinates of your fragment in the fragment shader.
EDIT: Setting gl_PointSize will only take effect if GL_PROGRAM_POINT_SIZE has been enabled. Alternatively, just use glPointSize to set the same size for all points. Also, as of OpenGL 3.2 (core), the GL_POINT_SPRITE flag has been removed and is effectively always on.
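Putting that together, a sketch of such a shader pair; pointScale is an invented uniform name, and the halo here is a simple procedural radial falloff rather than a texture:

    // vertex shader
    #version 120
    uniform float pointScale;          // assumed uniform: sprite size in pixels
    void main()
    {
        gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
        gl_PointSize = pointScale;     // only honored with program point size enabled (or use glPointSize)
    }

    // fragment shader
    #version 120
    void main()
    {
        // gl_PointCoord runs 0..1 across the sprite; build a soft radial halo from it.
        float d = length(gl_PointCoord - vec2(0.5));
        float halo = clamp(1.0 - d * 2.0, 0.0, 1.0);
        gl_FragColor = vec4(1.0, 0.9, 0.6, halo * halo);
    }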
Simply draw point sprites (using GL_POINT_SPRITE) and use the blending functions GL_SRC_ALPHA and GL_ONE; then the "halos" should be visible. Blending is responsible for the "halos", so look for some more info about that topic.
Also, you have to disable depth writes.
Here is a link about that: http://content.gpwiki.org/index.php/OpenGL:Tutorials:Tutorial_Framework:Particles
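The corresponding state setup might look roughly like this (additive blending plus depth writes off, so overlapping halos accumulate instead of cutting each other off; particleCount is a placeholder):

    glEnable(GL_POINT_SPRITE);                  /* point sprite rasterization (pre-3.2) */
    glEnable(GL_VERTEX_PROGRAM_POINT_SIZE);     /* honor gl_PointSize from the vertex shader */

    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE);          /* additive blending for glowing halos */

    glDepthMask(GL_FALSE);                      /* keep depth testing, but don't write depth */
    glDrawArrays(GL_POINTS, 0, particleCount);
    glDepthMask(GL_TRUE);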
I'm having a problem getting accurate primitive colours when I'm using multitexturing elsewhere in the scene. Basically, I have some lines and polygons that I am trying to render over a video texture (I'm using 3-stage multitexturing to create the video texture)... Anyhow, I know the problem is not alpha related... In fact, I know that in my texture update function, if I just comment out the calls to glBindTexture() for texture units 1 and 2 (leaving texture unit 0), the primitive colour is fine... Is it trying to multitexture the primitives too (even though I'm obviously not setting texture coordinates for primitives)?
Make sure to disable multitexturing when not using it. OpenGL uses a state machine, so if you turn on a texture it will stay on until you explicitly turn it off.
Just because you're not setting coordinates doesn't mean OpenGL will assume you're not using the texture.
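A rough sketch of what "turning it off" looks like with fixed-function multitexturing (adjust the number of units to your own setup):

    /* Before drawing the untextured lines/polygons: switch off the extra units. */
    glActiveTexture(GL_TEXTURE2);
    glDisable(GL_TEXTURE_2D);
    glActiveTexture(GL_TEXTURE1);
    glDisable(GL_TEXTURE_2D);
    glActiveTexture(GL_TEXTURE0);
    glDisable(GL_TEXTURE_2D);

    /* ... draw the coloured primitives ... */

    /* Re-enable whatever units the video texture pass needs afterwards. */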