Texturing Not Working - C++

I am using code from this site:
http://www.spacesimulator.net/tut4_3dsloader.html
It works in their example project but when I placed the code into a class for easier and more modular use, the texture fails to appear on the object.
I've double checked to make sure the texture ID is correct by debugging them side by side.
On my project I get a blank white object while the example works fine.
Is there ANY way to tell what is going on under the hood? Any error functions I can call that can give me some hint as to what's going on? Right now I am just guessing. (Yes, I have enabled 2D textures.)
Thanks SO!

Check glGetError() after your GL calls - it's the main built-in diagnostic, so call it liberally while hunting this down.
Make sure you've called glEnable(GL_TEXTURE_2D).
Make sure your texture is bound with glBindTexture() before drawing.
Make sure texture coordinates are being supplied and that they are correct (if they are all the same, or all the same uninitialized value, you will get one colour across the whole thing).
Make sure your texture matrix isn't screwed up - reset it if you're not using it:
glMatrixMode(GL_TEXTURE);
glLoadIdentity();
Make sure the data being loaded in when you load the texture is right.
Make sure, if you have mipmapping on, that you are loading in the mipmaps too; otherwise, with the object at a different zoom you might not get any texture at all.
That's all I can think of off the top of my head.
EDIT: I just remembered one that caught me out once: by changing the structure, you may have changed the initialization order of the app. MAKE SURE you aren't trying to load textures BEFORE you initialize OpenGL (with the device contexts or whatever - I was under Windows).

Make sure you're uploading a complete texture.
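Expanding on that (a sketch, assuming you already have an RGBA image in pixels with dimensions width × height): the default minification filter is GL_NEAREST_MIPMAP_LINEAR, so a texture uploaded without mipmaps is "incomplete" and in fixed-function GL typically samples as plain white - exactly the symptom described. Either switch to a non-mipmap filter or build the full mipmap chain:

```cpp
GLuint texId;
glGenTextures(1, &texId);
glBindTexture(GL_TEXTURE_2D, texId);

// Without this (or without mipmaps), the default minification filter
// leaves the texture incomplete and the object renders untextured/white.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);

// Alternatively, keep the default mipmap filter and build the chain:
// gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGBA, width, height,
//                   GL_RGBA, GL_UNSIGNED_BYTE, pixels);
```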

Related

OpenGL render to texture - resize rtt-texture

I'm just pulling my hair out and can't find a hint: my app resizes its RTT texture when needed via glTexImage2D with the new texture resolution.
When upsizing it all looks good. When downsizing, it looks like the TexCoord mapping of [1.0;1.0] maps to [oldRes.width; oldRes.height]. I'm sure I'm missing something vital, but cannot find it right now. Any ideas?
EDIT: oops, that wasn't it either. My state cache simply didn't enable the correct texturing unit on bind when that texture was already bound (this fix also cured several other problems).
I just found it - almost too simple: evidently (I'm on NVidia) the texturing unit our RTT texture is bound to needs to be reinitialized after a resize (NOT on the initial sizing). Unbinding the texture and rebinding it when needed again did the job.
P.S.: I'm working on a state cache that uses all available texturing units - that's why this popped up: the texture was never unbound, as my examples use far fewer textures than there are units (so no texture gets unbound unless deleted).
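The workaround described above could look roughly like this (a sketch only; rttTex, unit, and the new dimensions are assumed names standing in for your state-cache bookkeeping):

```cpp
// Resize the RTT texture's storage to the new dimensions...
glBindTexture(GL_TEXTURE_2D, rttTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, newWidth, newHeight, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, nullptr);

// ...then force the texturing unit to pick up the resized storage by
// unbinding now and rebinding on next use, instead of leaving the old
// binding cached on that unit.
glActiveTexture(GL_TEXTURE0 + unit);
glBindTexture(GL_TEXTURE_2D, 0);

// Later, when the texture is needed again:
glActiveTexture(GL_TEXTURE0 + unit);
glBindTexture(GL_TEXTURE_2D, rttTex);
```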

Render model at center position not displaying (revised)

Currently I can load a model of Earth from a DAE file without a texture, just to show that something is on the screen. To do this, I used GLU.
But now I'm trying to use GLM and shaders to load the model with a texture. The problem I'm having is getting them both to work together to display the model.
REVISED
After it was pointed out that I needed VAOs to render a scene, I decided to go with the previously mentioned example from ogldev's tutorial 32 about Vertex Array Objects. I compiled the tutorial and it works as it should, but it uses GLUT. So I extracted the vital parts pertaining to loading the models and implemented them into my project. After some debugging, I've managed to get further than previously described, but still nothing shows. And after looking back and forth at what I may have missed, I can't pinpoint the exact problem. I made sure that the program is reading values and doing proper checks showing that the procedure is valid with no hiccups, but for some reason I can't get anything to show up. I've changed the background color to check if there was just a silhouette of the earth model, but still nothing rendered to the screen.
What I have done is provide a copy of what I've done. Everything is showing an appropriate value. If you compare the tutorial against mine, you will see that it's the same thing; aside from mine hard-coding the camera position and target to point at where the model should be, there is really nothing different. But I don't know what I may have missed/overlooked in the process.
game.cpp
Everything within OnInit() passes. But when it comes to OnRender(), something isn't right.
PipeLine.cpp
No different from the tutorial.
model.cpp
Other than the class name, no different than the tutorial.
Technique.h & cpp
Shaders and Shader Program. No different than the tutorial.
Lighting.h & cpp
Child of Technique class. Gets uniforms, etc.
The main problem seems to be in your fragment shader:
fragData = color * texture(tex, TexCoord);
where color is defined as a uniform. In your main program you never assign a value to it, so it is initialized to zero, which makes every fragment black - the same color as your framebuffer background.
Besides this, there are other issues with your code. First, you are using a lot of deprecated functions that have no effect in the core profile (v3.3).
Second, you need to allocate a VAO (Vertex Array Object) to be able to see anything.
And, finally, some calls seem not to have any effect - for example:
glm::translate(view, glm::vec3(0.f, 0.f, -20.f)); // <-----------
// Draw model
m_model.Render();
Here glm::translate returns the translated matrix rather than modifying view in place, so discarding the return value does nothing.
It would be easier to help you if you provided the implementation of your render function.
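A minimal sketch of the two fixes, assuming the uniform is declared as "uniform vec4 color;" in the fragment shader and program is your linked shader-program handle (both names are assumptions about your code):

```cpp
// Assign a value to the "color" uniform; white (1,1,1,1) leaves the
// sampled texture color unmodified by the multiply in the shader.
glUseProgram(program);
GLint colorLoc = glGetUniformLocation(program, "color");
glUniform4f(colorLoc, 1.0f, 1.0f, 1.0f, 1.0f);

// glm::translate returns the translated matrix - assign the result,
// don't discard it:
view = glm::translate(view, glm::vec3(0.f, 0.f, -20.f));
```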

OpenGL3 two sets of shaders, texture showing black

I've recently succeeded at making a small test app with a GL_TEXTURE_RECTANGLE. Now I'm trying to integrate it into my larger project, but when I call glBindTexture(GL_TEXTURE_RECTANGLE, _tex_id[0]) inside the render function, it's causing the GL_INVALID_OPERATION error. The texture image sometimes shows for a fraction of a second, then turns black and stays black.
I am trying to do this by using two sets of vertex and fragment shaders, one set for the 3D scene, and one set for the 2D overlay, but I've never tried this before so I don't know if that's what's causing the error, or if I should be going about this a different way. The shaders are all compiling and linking fine.
Any insight would be much appreciated, and if it would help to see some code, let me know and I'll post some of it (although I think it may be too much for anyone to reasonably look through).
Edit: gDEBugger breaks at the call to glBindTexture(), and when clicking on the breakpoint, the properties window shows a picture of one of my other textures (one that's being loaded by the 3D scene's shaders), it shows that it's trying to load texture number 1, but I know this number is already being used to draw the same 3D scene's texture shown in the properties window... why would glGenTextures() be giving me overlapping texture id numbers? Is this normal or maybe part of the problem?
The black texture was due to me not forwarding some vertex shader inputs (normals) through to the fragment shader, even though I'm not using normals for anything in the 2D overlay shaders. As soon as I added outputs for all the inputs and forwarded them along to the fragment shader, the texture was no longer black - but it was still disappearing after a fraction of a second.

That was because I was calling glBindTexture(GL_TEXTURE_RECTANGLE, 0) at the end of the render function in the hope that it would clean up some state. This was clearly the wrong thing to do, because removing that call made the 2D texture stay on-screen.

Furthermore, calling glBindTexture() with the GL_TEXTURE_RECTANGLE target seems to work during the texture setup stage, but during rendering the GL_TEXTURE_RECTANGLE target was causing the GL_INVALID_OPERATION error. Changing the target to GL_TEXTURE_2D only in the render function made the error go away, and everything seems to work nicely now.

OpenGL texture mapping on sides cube using GL_QUADS

I am trying to map a different texture onto each side of a cube using GL_QUADS. My first problem is that I cannot even get a texture to display on the side of a GL_QUADS. I can, however, get a texture to display using GL_TRIANGLES, but I do not understand how to draw things very well using triangles and I want to use QUADS. I can also only use GLUT for this. I need an example that works, because I do not know enough about OpenGL for someone to simply explain this to me. Someone please help. Thanks!
Oops didn't realize I forgot to use glTexCoord2f. It works now.
If you post the code that you are having trouble with, perhaps I can help you. Most likely, you need to set the appropriate texture coordinates per-vertex.
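For reference, a minimal sketch of one textured GL_QUADS face: the key is a glTexCoord2f call before each glVertex3f, exactly the detail that was missing. texId is assumed to be a texture you have already loaded and configured.

```cpp
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, texId); // bind this face's texture

glBegin(GL_QUADS);
    // Texture coordinate first, then the vertex it applies to.
    glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f,  1.0f, 1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f,  1.0f, 1.0f);
glEnd();

// Repeat with a different glBindTexture() for each of the six faces
// to get a different texture per side.
```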

How do I give GLUT cubes different colors?

Just what it says: I have some code that's drawing GLUT cubes, but they're all grey. How do I make them different colors?
Everything I've tried so far has failed. I think my problem is that I'm trying to use OpenGL functions to change their colors, but GLUT is maintaining its own internal color or material data and I don't know how to make it change that data.
This is just filler graphics for a test-client for an online game, so they don't have to look good, I just need to be able to tell things apart. I know GLUT isn't considered great, so if anyone wants to post an example of drawing a cube with plain OpenGL instead of glutCube I'm all ears. I don't really care how I get the cubes on the screen, and it's not a part of the code I want to spend a lot of time on. I have a partner who's doing the real graphics; I just need to get something showing so that I can visualize what my code is doing.
The language I'm using OpenGL/GLUT from is called Io, but the API it exposes should be the same as if I were calling it from C.
It turns out that if I just do:
glEnable(GL_COLOR_MATERIAL)
then it makes the material track whatever color I set with glColor, even when lighting is enabled.
Just set the color beforehand with glColor(). If you're using lighting (i.e. GL_LIGHTING is enabled), though, then you'll instead have to use glMaterial() to set the cube's color.
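Both approaches side by side, as a sketch (using GLUT's standard glutSolidCube; the colors are arbitrary examples):

```cpp
// Option 1: with lighting on, let glColor drive the material.
glEnable(GL_LIGHTING);
glEnable(GL_COLOR_MATERIAL);
glColorMaterial(GL_FRONT, GL_AMBIENT_AND_DIFFUSE);
glColor3f(1.0f, 0.0f, 0.0f);   // red cube
glutSolidCube(1.0);

// Option 2: set the material explicitly instead.
GLfloat green[] = { 0.0f, 1.0f, 0.0f, 1.0f };
glMaterialfv(GL_FRONT, GL_AMBIENT_AND_DIFFUSE, green);
glutSolidCube(1.0);
```

With lighting disabled, a plain glColor3f() before the draw call is enough on its own.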