Currently I can load a model of earth from a DAE file without a texture just to show that something is on the screen. To do this, I used GLU.
But now I'm trying to use GLM and shaders to load the model with a texture. The problem I'm having is getting them both to work together properly to display the model.
REVISED
After it was pointed out that I needed VAOs to render a scene, I decided to go with the previously mentioned example from ogldev's tutorial 32 on Vertex Array Objects. I compiled the tutorial and it works as it should, but it uses GLUT. So I extracted the vital parts pertaining to loading the models and implemented them in my project. After some debugging, I've managed to get further than previously described, but still nothing shows. And after looking back and forth for what I may have missed, I can't pinpoint the exact problem. I made sure that the program is reading values and doing proper checks showing that the procedure is valid with no hiccups, but for some reason I can't get anything to show up. I've changed the background color to check whether there was just a silhouette of the earth model, but still nothing rendered to the screen.
I've provided a copy of what I've done below. Everything is showing an appropriate value. If you compare the tutorial against mine, you will see that it's the same thing; aside from my hard-coding the camera position and target to point at where the model should be, there is really nothing different. But I don't know what I may have missed or overlooked in the process.
game.cpp
Everything within OnInit() passes. But when it comes to OnRender(), something isn't right.
PipeLine.cpp
No different from the tutorial.
model.cpp
Other than the class name, no different from the tutorial.
Technique.h & cpp
Shaders and Shader Program. No different from the tutorial.
Lighting.h & cpp
Child of Technique class. Gets uniforms, etc.
The main problem seems to be in your fragment shader definition:
fragData = color * texture(tex, TexCoord);
where color is defined as a uniform. In your main program you never assign a value to it, so it is initialized to zero, which makes every fragment black, the same color as your framebuffer background.
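A minimal sketch of what is missing, assuming the uniform is a vec4 and your linked program object is called shaderProgram (both assumptions on my part):

// Query the uniform location once after linking (uniform name taken from your shader).
GLint colorLoc = glGetUniformLocation(shaderProgram, "color");

// Bind the program and give the uniform a non-zero value, e.g. opaque white,
// so the texture sample is no longer multiplied by black.
glUseProgram(shaderProgram);
glUniform4f(colorLoc, 1.0f, 1.0f, 1.0f, 1.0f);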
Besides this, there are other issues with your code. First, you are using a lot of deprecated functions that have no effect in the core profile (3.3).
Second, you need to allocate a VAO (Vertex Array Object) to be able to see anything.
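In a 3.3 core context that usually looks something like this (a minimal sketch; vbo is assumed to be a buffer you have already filled):

GLuint vao;
glGenVertexArrays(1, &vao);
glBindVertexArray(vao);

// With the VAO bound, the buffer binding and attribute layout are recorded in it.
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, nullptr);

// At draw time, binding the VAO again is enough before glDrawArrays/glDrawElements.
glBindVertexArray(vao);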
And, finally, some calls seem to have no effect, for example:
glm::translate(view, glm::vec3(0.f, 0.f, -20.f)); // <-----------
// Draw model
m_model.Render();
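Here glm::translate() returns the translated matrix rather than modifying its argument, so when the return value is discarded the view matrix never changes. A small sketch of the presumably intended call:

// Keep the return value; glm::translate does not modify `view` in place.
view = glm::translate(view, glm::vec3(0.f, 0.f, -20.f));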
It would be easier to help you if you provided the implementation of your render function.
Related
I'm writing a little OpenGL program with GLSL.
Now I have two objects which I need to draw. Both have different shaders.
Normally I think I should do something like that in my draw() method:
void draw() {
    shaderObjektOne.bind();
    glBegin(xxx);
    // draw object one
    ...
    glEnd();

    shaderObjektTwo.bind();
    glBegin(xxx);
    // draw object two
    ...
    glEnd();
}
If I do it this way, my screen freezes.
Binding the shader for only one object is working without problems.
I've been looking around, but I couldn't find a real explanation of why this error occurs.
Is it because a render target can only be rendered with one shader?
How can I avoid a giant shader file or having multiple render targets?
Thanks.
EDIT:
I want to have a separately compiled shader program for each object. These will be bound right before I draw the vertices for the object. I want to avoid one big shader in which I need to set specific parameters to choose the functionality for the object. I use GLUT, and currently all drawing is done before glutSwapBuffers().
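What I assume bind() boils down to is roughly this (a sketch with hypothetical program handles progOne and progTwo and placeholder draw helpers):

void draw() {
    // Bind the first compiled program, then issue the draw calls for object one.
    glUseProgram(progOne);
    drawObjectOne();

    // Switch programs and draw object two with the second one.
    glUseProgram(progTwo);
    drawObjectTwo();

    // One buffer swap at the end of the frame.
    glutSwapBuffers();
}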
'Freezing' means that there is actually something visible on the screen (the last object I've drawn with the last bound shader), but my input isn't working anymore. That is, I cannot move the camera in the world, but the program is still running normally (tested with a debugger).
Got it.
It was a problem with my program design. I accidentally added a copy of the shader which I wanted to bind.
Every time I tried to bind the shader, it bound the copy instead.
Thanks for your help guys :)
I'm loading some scenes/objects from files using assimp, and I had them displaying properly earlier, but then I rewrote my MVP matrix setup (which had been terribly written and was incomprehensible).
Now, most primitives which I draw in the standard rendering pipeline seem to be appearing just fine. I have a wireframe cube around the origin and can also put in a triangle. But no matter what I do, my ASSIMP-loaded object refuses to be rendered, as a wireframe or as a solid.
I suspect the mistake I'm making is terribly obvious. I've tried to reduce the code to a minimal example.
The object should look like a rock and it should show up within the wireframe box.
Since I haven't much altered the mesh code, I'm guessing the problem is in scene.h or main.cpp.
The old version had GLSL programs, but I eliminated all mention of those here. My understanding from the OpenGL Superbible is that shaders aren't required, though. So that can't be it, right?
The old version had GLSL programs, but I eliminated all mention of those here. My understanding from the OpenGL Superbible is that shaders aren't required, though.
They are if you want to use generic vertex attributes via glVertexAttribPointer(). Without a shader, OpenGL has no way of knowing that attribute 0 is a position or that attribute 1 contains a texture coordinate.
Use glVertexPointer() and friends if you don't want to use shaders.
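A minimal sketch of the fixed-function path, assuming a tightly packed position buffer vbo, an index buffer ibo, and an index count indexCount that you have already set up:

// Fixed-function attribute setup: no shader program required.
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, nullptr);   // 3 floats per position, tightly packed

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, nullptr);

glDisableClientState(GL_VERTEX_ARRAY);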
I'm developing an application for a virtual reality environment using OGRE, Bullet and Equalizer. My rendering function looks like this:
root->_fireFrameStarted();
root->_fireFrameRenderingQueued();
root->_fireFrameEnded();
_window->update(false);
The window does not do the buffer swap, because Equalizer does that. This function works fine, we can even use particle systems and all the other fancy stuff OGRE offers.
However, since the projection area in our lab is curved, we have a GLSL module (let's call it Warp) we use in all our applications to distort the rendering output so that it fits our projection wall. We accomplish this by creating a texture, copying the contents of the back buffer to it and applying our warping shader when rendering the distorted texture covering the entire window.
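In broad strokes the module does something like this each frame (a rough sketch; the names are mine and the texture/shader creation is omitted):

// Copy the finished back buffer into a texture of matching size.
glBindTexture(GL_TEXTURE_2D, warpTexture);
glReadBuffer(GL_BACK);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, windowWidth, windowHeight);

// Then draw a full-screen quad with the warp shader sampling that texture.
glUseProgram(warpProgram);
drawFullScreenQuad();   // hypothetical helper that renders the distorted quad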
You can find the source code here: pastebin.com/TjNJuHtC
We call this module in eq::Window::frameDrawFinish() and it works well with pure OpenGL applications but not with OGRE.
This is the output without the module and its shader:
http://s1.directupload.net/images/130620/amr5qh3x.png
If I enable the module, this is the rather strange output:
http://s14.directupload.net/images/130620/74qiwgyd.png
And if I disable the two calls to glBindTexture, the texture used by the sun particle effect (flare.png) is rendered onto the screen (I would show it to you but I can only use two links).
Every GL state variable I consider relevant has the same value as in our other applications.
I checked GL_READ_BUFFER, GL_DRAW_BUFFER, GL_RENDER_MODE, GL_CULL_FACE_MODE and GL_DOUBLEBUFFER.
This raises some questions: why does my call to glCopyTexSubImage2D() seem to have no effect at all? And why does my shader not do anything even when I tell it to just make every fragment red?
Supplemental: Putting the entire shader into a material script and letting OGRE handle it is not an option.
Solved the problem: I created my shaders before Ogre::Root created my windows. I have changed the order and now it works.
I'm currently writing a puzzle game in C++ with DirectX 9. Not much of it has been a problem, but some of the .x files I am using (via a mesh class that reads them in, etc.) seem to overwrite the colours of other objects.
For example, I have a green floor and a white pointer. On a level with a Diglett-looking character that was made in 3ds, textured, then exported to .x using the Panda plugin, unrelated items start to change colour: the green floor is now a lot darker and the white pointer is brown.
Anyone have any ideas? I'm not sure if it's texture overflow or something.
The most likely explanation, given the information here, is that the mesh changes some state (such as shaders, the diffuse color, or render/texture-stage states) when it is drawn, and your other geometry is then affected by those states. Make sure that any state your geometry depends on is set to what you want before rendering, so it isn't affected by previously changed state.
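For example, before drawing the floor and pointer you could re-establish the state they rely on (a rough sketch for a fixed-function D3D9 setup; device is your IDirect3DDevice9*, and the exact states depend on what the mesh's draw code touches):

// Unbind whatever texture the mesh left in stage 0.
device->SetTexture(0, NULL);

// Restore the stage/render states that the simple coloured geometry expects.
device->SetTextureStageState(0, D3DTSS_COLOROP, D3DTOP_MODULATE);
device->SetRenderState(D3DRS_LIGHTING, TRUE);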
I am trying to switch my OpenGL application from the old fixed function system to using Vertex Buffer Objects. However, with my current setup nothing is displaying on the screen. I'm sure I'm making some simple error, but I can't see it.
gltest.h
gltest.cpp
model and index hold the IDs for my VBO and IBO respectively. The buffer objects are set up in the GLTest::makeModel method. The struct I'm using to store vertex data has 3 floats for the position, followed by 4 unsigned chars for the color.
It creates three vertices arranged in a triangle, and the index buffer simply contains the numbers 0, 1, and 2. I call the method with a QRgb object containing the color blue, so with this setup I would expect to see a blue triangle displayed on screen. Instead, nothing is displayed.
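The setup I'm describing looks roughly like this (a simplified sketch of my actual code; model and index are the member variables mentioned above):

struct Vertex {
    float x, y, z;            // position
    unsigned char r, g, b, a; // color
};

Vertex verts[3] = { /* three corners of the triangle, all set to blue */ };
GLuint indices[3] = { 0, 1, 2 };

glGenBuffers(1, &model);
glBindBuffer(GL_ARRAY_BUFFER, model);
glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);

glGenBuffers(1, &index);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, index);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);

// Client-state pointers with the interleaved stride.
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
glVertexPointer(3, GL_FLOAT, sizeof(Vertex), nullptr);
glColorPointer(4, GL_UNSIGNED_BYTE, sizeof(Vertex), (const GLvoid*)(3 * sizeof(float)));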
A full Qt project which shows the error is available here. You will need GLEW installed.
I never programmed with the fixed-pipeline versions of OpenGL, but I've been doing lots of work in v3.0+, so take my advice carefully!
You seem to be mixing old and new together; for example, you don't have a vertex or fragment shader loaded. glEnableClientState(), glMatrixMode(), glLoadIdentity(), glVertexPointer(), and glColorPointer() are deprecated in modern OpenGL, having been replaced by shader functionality.
Also, whenever I get stuck with this sort of thing, I litter my GL calls with glGetError() checks; you only have one.
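Something like this sprinkled after each group of GL calls usually pinpoints the failing call quickly (a small sketch, not taken from your project):

#include <cstdio>
#include <GL/glew.h>

// Print any pending GL errors together with a label identifying the call site.
static void checkGL(const char* where)
{
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR)
        std::fprintf(stderr, "GL error 0x%04X at %s\n", err, where);
}

// Usage: checkGL("after makeModel"); checkGL("after glDrawElements");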