OpenGL SOIL error - C++

I'm working on a university project and it requires us to load up a cube model, texture it and do some other stuff with it.
We have been provided with a basic framework that uses SOIL to load up textures into OpenGL.
However, when I call the function:
SOIL_load_OGL_texture("Barren Reds.JPG", SOIL_LOAD_AUTO, SOIL_CREATE_NEW_ID, SOIL_FLAG_MIPMAPS);
I get the following error:
OpenGL Debug Output: Source(OpenGL), Type(Error), Priority(High), Error has been generated. GL error GL_INVALID_ENUM in GetString: (ID: 491340553) Generic error
OpenGL Debug Output: Source(OpenGL), Type(Error), Priority(High), Error has been generated. GL error GL_INVALID_ENUM in TexParameteri: (ID: 2102148481) Generic error
OpenGL Debug Output: Source(OpenGL), Type(Error), Priority(High), Error has been generated. GL error GL_INVALID_ENUM in TexParameteri: (ID: 2102148481) Generic error
The thing is, I have another framework that also uses SOIL, and when I run the same function with the same texture there, it works fine. So I figured my SOIL build was bad and copied the working SOIL build into my project, but I still get the same error.
I get these three error lines every time I call the function, so if I call it to create three textures I get them three times.

If you're using a Core context, be aware that SOIL's query_tex_rectangle_capability() unconditionally calls glGetString(GL_EXTENSIONS). GL_EXTENSIONS is not a valid argument for glGetString() in a Core context and generates GL_INVALID_ENUM; in Core profiles you instead have to use glGetStringi() to iterate over the extension strings one at a time.
Your options are:
Fix SOIL, or
Use stb_image.h directly and handle texture upload yourself, or
Use a Compatibility context (where glGetString(GL_EXTENSIONS) is still valid usage)
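For reference, here is a sketch of the Core-profile-safe extension check that a fixed SOIL would need to perform. It assumes a GL 3.0+ context and a loader such as GLEW or glad that provides glGetStringi; the helper name is mine, not SOIL's:

```cpp
#include <GL/glew.h>  // assumed loader; any loader exposing glGetStringi works
#include <cstring>

// Core-profile-safe replacement for scanning glGetString(GL_EXTENSIONS):
// query the extension count, then compare each extension string by index.
bool hasExtension(const char *name)
{
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (GLint i = 0; i < count; ++i)
    {
        const char *ext = reinterpret_cast<const char *>(
            glGetStringi(GL_EXTENSIONS, static_cast<GLuint>(i)));
        if (ext && std::strcmp(ext, name) == 0)
            return true;
    }
    return false;  // also reached on GL < 3.0, where glGetStringi is unavailable
}
```

Unlike a substring search over one big string, exact per-token comparison also avoids false positives such as matching GL_ARB_texture_rectangle inside a longer extension name.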

Related

glGenTextures gives GL_INVALID_OPERATION despite valid OpenGL context

I get a GL_INVALID_OPERATION error when calling glGenTextures, and I have no idea what could be responsible for it.
I am using QOpenGLWidget to retrieve the context, and it looks valid at the time I call glGenTextures(): at least glGetString(GL_VERSION) and glXGetCurrentContext() both return sensible values.
The faulty code is called from the QOpenGLWidget::resizeGL() method. In the QOpenGLWidget::initializeGL() I compile successfully some shader programs and I create / upload data to VAO / VBOs.
So my questions are:
What are the common failure cases of glGenTextures(), other than not having an OpenGL context at all?
Can an OpenGL context be invalid or messed up, and in such a case,
How can I check that my OpenGL context is valid?
EDIT: Since I strongly believe this is related to the fact that my machine has no proper GPU, here is the output of glxinfo.
I don't think it is the glGenTextures() call that causes the error. That call can only generate GL_INVALID_VALUE (if its count argument is negative). Most likely some other call is wrong, and the bind() of the new texture is the invalid call.
Try to localise your offending call with glGetError().
You can check the documentation for possible failures of gl calls here:
https://www.opengl.org/sdk/docs/man4/html/glCreateTextures.xhtml
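To localise the offending call, something like this helper works (the name and format string are mine). Because GL can queue several errors, it loops until the queue is drained, so stale errors from earlier calls don't get blamed on the wrong function:

```cpp
#include <GL/gl.h>
#include <cstdio>

// Drain the GL error queue, reporting where each error was detected.
void checkGLError(const char *where)
{
    for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
        std::fprintf(stderr, "GL error 0x%04X after %s\n", err, where);
}

// Usage: bracket the suspect call.
// checkGLError("before glGenTextures");  // flush stale errors first
// glGenTextures(1, &tex);
// checkGLError("glGenTextures");
```

Calling it before the suspect call is as important as after: the error you see at glGenTextures() may have been raised much earlier.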
OK, found the problem. A shader was silently failing to compile (a hardcoded shader with an incorrect #version that was never checked for successful compilation), and the resulting error was picked up by the next OpenGL error check, which happened to be the one after glGenTextures().
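The general lesson: check compile status immediately after glCompileShader so a bad shader cannot fail silently and pollute later error checks. A sketch (helper name is mine; glGetShaderiv and friends come from your loader, e.g. GLEW or glad):

```cpp
#include <GL/glew.h>  // assumed loader exposing GL 2.0+ entry points
#include <cstdio>
#include <vector>

// Returns true if the shader compiled; otherwise prints the info log.
bool shaderCompiled(GLuint shader)
{
    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (ok != GL_TRUE)
    {
        GLint len = 0;
        glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &len);
        std::vector<char> log(len > 1 ? len : 1);
        glGetShaderInfoLog(shader, static_cast<GLsizei>(log.size()),
                           nullptr, log.data());
        std::fprintf(stderr, "shader compile failed:\n%s\n", log.data());
    }
    return ok == GL_TRUE;
}
```

The same pattern with glGetProgramiv(GL_LINK_STATUS) catches silent link failures.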

Failed to create D3D shaders - webGL GLSL

I've been checking out the cool animations on GLSL Sandbox, but one of the demos isn't running for me, this one. The error isn't in compilation, though, but at runtime: it says "Failed to create D3D shaders". I looked into the sandbox source and I can see that it directly passes on the exception from WebGL, so it's not a problem with the website but with the code.
I walked through the code, but it's rather long and I see no mention of D3D shaders. It looks much like the other demos, which work fine.
So...
What part of the code causes the problem? Could I comment something out to get it to work?
Is the problem browser dependent? Or does it depend on my OpenGL version?

Qt 5.5 QOpenGLTexture copying data issue

I'm working with the Qt 5.5 OpenGL wrapper classes, specifically trying to get QOpenGLTexture working. Here I am creating a 1x1 2D white texture for masking purposes. This works:
void Renderer::initTextures()
{
    QImage white(1, 1, QImage::Format_RGBA8888);
    white.fill(Qt::white);

    m_whiteTexture.reset(new QOpenGLTexture(QOpenGLTexture::Target2D));
    m_whiteTexture->setSize(1, 1);
    m_whiteTexture->setData(white);
    //m_whiteTexture->allocateStorage(QOpenGLTexture::RGBA, QOpenGLTexture::UInt32);
    //m_whiteTexture->setData(QOpenGLTexture::RGBA, QOpenGLTexture::UInt8, white.bits());

    // Print any errors
    QList<QOpenGLDebugMessage> messages = m_logger->loggedMessages();
    if (messages.size())
    {
        qDebug() << "Start of texture errors";
        foreach (const QOpenGLDebugMessage &message, messages)
            qDebug() << message;
        qDebug() << "End of texture errors";
    }
}
However, I am now trying to do two things:
Use an allocate + setData sequence as separate commands (the commented-out lines), e.g.
m_whiteTexture->allocateStorage(QOpenGLTexture::RGBA, QOpenGLTexture::UInt32);
m_whiteTexture->setData(QOpenGLTexture::RGBA, QOpenGLTexture::UInt8, white.bits());
for the purpose of more complicated rendering later, where I just update part of the data rather than reallocating. Related to this is (2), where I want to move to Target2DArray and push/pop textures in this array.
Create a Target2DArray texture and populate layers using QImages. Eventually I will be pushing/popping textures up to some max size available on the hardware.
Regarding (1), I get these errors from QOpenGLDebugMessage logger:
Start of texture errors
QOpenGLDebugMessage("APISource", 1280, "Error has been generated. GL error GL_INVALID_ENUM in TextureImage2DEXT: (ID: 2663136273) non-integer <format> 0 has been provided.", "HighSeverity", "ErrorType")
QOpenGLDebugMessage("APISource", 1280, "Error has been generated. GL error GL_INVALID_ENUM in TextureImage2DEXT: (ID: 1978056088) Generic error", "HighSeverity", "ErrorType")
QOpenGLDebugMessage("APISource", 1281, "Error has been generated. GL error GL_INVALID_VALUE in TextureImage2DEXT: (ID: 1978056088) Generic error", "HighSeverity", "ErrorType")
QOpenGLDebugMessage("APISource", 1281, "Error has been generated. GL error GL_INVALID_VALUE in TextureSubImage2DEXT: (ID: 1163869712) Generic error", "HighSeverity", "ErrorType")
End of texture errors
My mask works with the original code, but I can't get it to work in either scenario (1) or (2). For (2) I change the target to Target2DArray, change the size to include a depth of 1, adjust my shaders to use vec3 texture coordinates and sampler3D for sampling, etc. I can post a more complete (2) example if that helps. I also don't understand these error codes, and it is obviously difficult to debug on the GPU if that is where things are going wrong. I've tried all sorts of PixelType and PixelFormat combinations.
Thanks!
This question is very old, but I just came across a similar problem myself. For me the solution was to call setFormat before allocating the storage:
m_whiteTexture->setFormat(QOpenGLTexture::RGBA8_UNorm);
As I found out here: https://www.khronos.org/opengl/wiki/Common_Mistakes#Creating_a_complete_texture
The issue with the original code is that the texture is not complete.
As mentioned by @flaiver, using QOpenGLTexture::RGBA8_UNorm works, but only because Qt uses a different kind of storage for this format (it effectively uses glTexStorage2D, which is even better, per the OpenGL documentation); that is not the case for QOpenGLTexture::RGBA.
To make the texture work even if you specifically require QOpenGLTexture::RGBA (or some other format, e.g. QOpenGLTexture::AlphaFormat), you need to either set texture data for each mipmap level (which you don't really need in your case) or disable mipmapping:
// the default is `QOpenGLTexture::NearestMipMapLinear`/`GL_NEAREST_MIPMAP_LINEAR`,
// but it doesn't work, if you set data only for level 0
// alternatively use QOpenGLTexture::Nearest if that suits your needs better
m_whiteTexture->setMagnificationFilter(QOpenGLTexture::Linear);
m_whiteTexture->setMinificationFilter(QOpenGLTexture::Linear);
// optionally, it is good practice to explicitly set the wrap mode:
// m_whiteTexture->setWrapMode(QOpenGLTexture::ClampToEdge);
right after you allocate the storage for texture data.
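Putting the two answers together, a sketch of the separate allocate + upload path for the 1x1 white texture (reusing the white QImage from the question; untested against the asker's exact Qt build):

```cpp
m_whiteTexture.reset(new QOpenGLTexture(QOpenGLTexture::Target2D));
m_whiteTexture->setSize(1, 1);
// Set an explicit, sized internal format before allocating storage.
m_whiteTexture->setFormat(QOpenGLTexture::RGBA8_UNorm);
// Data will only be uploaded for mipmap level 0, so don't sample other levels.
m_whiteTexture->setMinificationFilter(QOpenGLTexture::Linear);
m_whiteTexture->setMagnificationFilter(QOpenGLTexture::Linear);
m_whiteTexture->setWrapMode(QOpenGLTexture::ClampToEdge);
m_whiteTexture->allocateStorage();
// Later (e.g. per frame) only the upload needs to be repeated:
m_whiteTexture->setData(QOpenGLTexture::RGBA, QOpenGLTexture::UInt8, white.bits());
```

The same ordering (size, format, filters, allocateStorage, then setData) carries over to Target2DArray; only the size gains a layer count and setData gains a layer argument.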

Cg problems with OpenGL

I'm working on an OpenGL project on Windows, using GLEW to provide the functionality the provided Windows headers lack. For shader support, I'm using NVIDIA's Cg. All the documentation and code samples I have read indicate that the following is the correct method for loading and using shaders, and I've implemented things this way in my code:
Create a Cg context with cgCreateContext.
Get the latest vertex and pixel shader profiles using cgGLGetLatestProfile with CG_GL_VERTEX and CG_GL_FRAGMENT, respectively. Use cgGLSetContextOptimalOptions to create the optimal setup for both profiles.
Using these profiles and shaders you have written, create shader programs using cgCreateProgramFromFile.
Load the shader programs using cgGLLoadProgram.
Then, each frame, for an object that uses a given shader:
Bind the desired shader(s) (vertex and/or pixel) using cgGLBindProgram.
Enable the profile(s) for the desired shader(s) using cgGLEnableProfile.
Retrieve and set any needed uniform shader parameters using cgGetNamedParameter and the various parameter setting functions.
Render your object normally
Clean up the shader by calling cgGLDisableProfile
However, things start getting strange. When using a single shader everything works just fine, but the act of loading a second shader with cgGLLoadProgram seems to make objects using the first one cease to render. Switching the draw order seems to resolve the issue, but that's hardly a fix. The problem occurs on both my laptop and my partner's (fairly recent machines with Intel integrated chipsets).
I tested the same code on my desktop with a GeForce GTX 260, and everything worked fine. I would just write this off as my laptop GPU not getting along with Cg, but I've successfully built and run programs that use several Cg shaders simultaneously on my laptop using the OGRE graphics engine (unfortunately the assignment I'm currently working on is for a computer graphics class, so I can't just use OGRE).
In conclusion, I'm stumped. What is OGRE doing that my code is not? Am I using Cg improperly?
You have to call cgGLEnableProfile before you call cgGLBindProgram. From your question it appears you do it the other way around.
From the Cg documentation for cgGLBindProgram:
cgGLBindProgram binds a program to the current state. The program must have been loaded with cgGLLoadProgram before it can be bound. Also, the profile of the program must be enabled for the binding to work. This may be done with the cgGLEnableProfile function.
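So the per-frame sequence from the question, reordered per the documentation, would look roughly like this (variable names are illustrative, assuming programs were already created with cgCreateProgramFromFile and loaded with cgGLLoadProgram):

```cpp
// Enable the profiles first, then bind; binding requires an enabled profile.
cgGLEnableProfile(vertexProfile);    // e.g. from cgGLGetLatestProfile(CG_GL_VERTEX)
cgGLEnableProfile(fragmentProfile);  // e.g. from cgGLGetLatestProfile(CG_GL_FRAGMENT)

cgGLBindProgram(vertexProgram);
cgGLBindProgram(fragmentProgram);

// ... set uniforms via cgGetNamedParameter and the cgGLSetParameter* family,
// then render the object normally ...

// Restore fixed-function state for anything drawn without these shaders.
cgGLDisableProfile(fragmentProfile);
cgGLDisableProfile(vertexProfile);
```
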

glColor, glMatrixMode mysteriously giving "Invalid operation" errors

Recently my game-engine-in-progress has started throwing OpenGL errors in places where they shouldn't be possible. After rendering a few frames, I suddenly start getting errors from glColor:
print(gl.GetError()) --> nil
gl.Color(1, 1, 1, 1)
print(gl.GetError()) --> INVALID_OPERATION
If I don't call glColor here, I later get an invalid operation error from glMatrixMode.
According to the GL manual, glColor should never raise an error, and glMatrixMode only if it is called between glBegin and glEnd, which I've checked is not the case. Are there any other reasons these functions can raise an error that I'm not aware of? Maybe something related to the render-to-texture/renderbuffer extensions? I've been debugging like mad and can't find anything that should cause such failures. The whole program is a bit too large and complex to post here. It's using luagl, which is just a thin wrapper around the OpenGL API, and SDL. The reported version is: 2.1 Mesa 7.10.2
glColor will result in an error if there is no active OpenGL context. If you are using multiple contexts or glBindFramebuffer, check that you always switch to ones that are valid. Also remember that issuing OpenGL calls from multiple threads requires special attention.
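Since the engine uses SDL, one cheap sanity check before issuing GL calls is to confirm that a context is actually current on the calling thread. A sketch, assuming the SDL 2 API (SDL 1.2 has no equivalent query):

```cpp
#include <SDL.h>
#include <cstdio>

// Returns false (and logs) if no GL context is current on this thread,
// which would make any subsequent GL call invalid.
bool glContextIsCurrent()
{
    SDL_GLContext ctx = SDL_GL_GetCurrentContext();
    if (ctx == nullptr)
    {
        std::fprintf(stderr, "no GL context current on this thread\n");
        return false;
    }
    return true;
}
```
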
https://bugs.freedesktop.org/show_bug.cgi?id=48535
Looks like this was actually a driver bug. >.>