glVertexAttribDivisor not present - c++

When I use glVertexAttribDivisor in a project built with CMake, compilation fails with the following error, which suggests the function is not being imported:
error: 'glVertexAttribDivisor' was not declared in this scope; did you mean 'glVertexAttrib4iv'
I am currently using GLAD to load OpenGL, like below:
if (!gladLoadGLLoader((GLADloadproc) glfwGetProcAddress)) {
    std::cout << "Failed to initialize OpenGL context" << std::endl;
    return nullptr;
}

int gladInitRes = gladLoadGL();
if (!gladInitRes) {
    fprintf(stderr, "Unable to initialize glad\n");
    glfwDestroyWindow(window);
    glfwTerminate();
    return nullptr;
}
And printf("OpenGL version %i.%i\n", GLVersion.major, GLVersion.minor); prints out OpenGL version 4.2
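For reference, here is a minimal sketch of how glVertexAttribDivisor is typically used for instanced rendering, assuming the glad header was generated for OpenGL 3.3 or newer (the version in which the function entered core); the attribute location and layout here are hypothetical:

// Hypothetical setup: attribute location 2 holds a per-instance offset.
glEnableVertexAttribArray(2);
glVertexAttribPointer(2, 3, GL_FLOAT, GL_FALSE, 3 * sizeof(float), (void*)0);
// Advance attribute 2 once per instance instead of once per vertex.
glVertexAttribDivisor(2, 1);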

Related

glfwInit() freezes program

The program gets stuck in the glfwInit() function. There are no error messages, and glfwInit() never returns; it's as if the main thread is sleeping or has run into an infinite loop. The problem does not seem to be project-specific, because a program I hadn't changed in days (and which used to run fine) shows the same behavior. I am using Visual Studio 2019.
Here is the code of my main function, which should initialize GLFW:
int main() {
    int width, height;
    // Nothing past this call executes; the program never returns from glfwInit().
    if (!glfwInit()) {
        std::cout << "ERROR::GLFW::Could not be initialized!" << std::endl;
    }
    setUpWindow();
    GLFWwindow* window = createWindow(800, 800, "OpenGL Advanced");
    if (window == nullptr) {
        glfwTerminate();
        return 2;
    }
    glfwMakeContextCurrent(window);
    setUpListener(window);
    bool gladIsLoaded = loadGlad();
    if (!gladIsLoaded) {
        glfwTerminate();
        return 3;
    }
    setUpOpenGL();
    startRenderLoop(&width, &height, window);
    glfwTerminate();
    return 1;
}
This is the call stack:
ntdll.dll!NtDeviceIoControlFile() Unknown
KernelBase.dll!DeviceIoControl() Unknown
kernel32.dll!DeviceIoControlImplementation() Unknown
hid.dll!00007ff86b631c2b() Unknown
hid.dll!00007ff86b631a1b() Unknown
dinput8.dll!00007ff83367492b() Unknown
dinput8.dll!00007ff833674648() Unknown
dinput8.dll!00007ff833674401() Unknown
dinput8.dll!00007ff833671f87() Unknown
dinput8.dll!00007ff83367424d() Unknown
dinput8.dll!00007ff833671037() Unknown
dinput8.dll!00007ff833678f1f() Unknown
dinput8.dll!00007ff8336790c6() Unknown
OpenglAdvanced.exe!_glfwInitJoysticksWin32() C
OpenglAdvanced.exe!_glfwPlatformInit() C
OpenglAdvanced.exe!glfwInit() C
OpenglAdvanced.exe!main() Line 48 C++
OpenglAdvanced.exe!invoke_main() Line 79 C++
OpenglAdvanced.exe!__scrt_common_main_seh() Line 288 C++
OpenglAdvanced.exe!__scrt_common_main() Line 331 C++
OpenglAdvanced.exe!mainCRTStartup() Line 17 C++
kernel32.dll!BaseThreadInitThunk() Unknown
ntdll.dll!RtlUserThreadStart() Unknown
I had the same issue; updating GLFW to the latest version (from 3.3.2 to 3.3.6) fixed the problem.
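Independent of the version bump, GLFW can report initialization problems through an error callback, which is one of the few GLFW functions that may be called before glfwInit(). A minimal sketch (not from the original post; note a hard freeze like this one produces no error, so this is only a general diagnostic aid):

#include <GLFW/glfw3.h>
#include <cstdio>

// GLFW invokes this with an error code and a human-readable description.
static void onGlfwError(int code, const char* description) {
    std::fprintf(stderr, "GLFW error %d: %s\n", code, description);
}

int main() {
    glfwSetErrorCallback(onGlfwError); // legal before glfwInit()
    if (!glfwInit()) return 1;
    // ...
    glfwTerminate();
}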

Error GLSL incorrect version 450

I have an OpenGL application that I used to be able to compile on this machine, but now I can't. The problem seems to be that the fragment shader is not compiling properly.
I'm using:
GLEW 2.1.0
GLFW 3.2.1
All the necessary context is also created at the beginning of the program. Here is what my program-creation function looks like:
std::string vSource, fSource;
try
{
    vSource = getSource(vertexShader, "vert");
    fSource = getSource(fragmentShader, "frag");
}
catch (std::runtime_error& e)
{
    std::cout << e.what() << std::endl;
}

GLuint vsID, fsID;
try
{
    vsID = compileShader(vSource.c_str(), GL_VERTEX_SHADER); // Source char* was checked and looks good
    fsID = compileShader(fSource.c_str(), GL_FRAGMENT_SHADER);
}
catch (std::runtime_error& e)
{
    std::cout << e.what() << std::endl; // "incorrect glsl version 450" thrown here
}

GLuint programID;
try
{
    programID = createProgram(vsID, fsID); // Debugging fails here
}
catch (std::runtime_error& e)
{
    std::cout << e.what() << std::endl;
}

glDeleteShader(vsID);
glDeleteShader(fsID);
return programID;
My main:
/* ---------------------------- */
/* OPENGL CONTEXT SET WITH GLEW */
/* ---------------------------- */
static bool contextFlag = initializer::createContext(vmath::uvec2(1280, 720), "mWs", window);
std::thread* checkerThread = new std::thread(initializer::checkContext, contextFlag);

/* --------------------------------- */
/* STATIC STATE SINGLETON DEFINITION */
/* --------------------------------- */
Playing Playing::playingState; // The failure comes from here, which tries to create a program

/* ---- */
/* MAIN */
/* ---- */
int main(int argc, char** argv)
{
    checkerThread->join();
    delete checkerThread;
    Application* app = new Application();
    ...
    return 0;
}
Here is an example of the fragment shader file:
#version 450 core

out vec4 fColor;

void main()
{
    fColor = vec4(0.5, 0.4, 0.8, 1.0);
}
And these are the errors I catch:
[Engine] Glew initialized! Using version: 2.1.0
[CheckerThread] Glew state flagged as correct! Proceeding to mainthread!
Error compiling shader: ERROR: 0:1: '' : incorrect GLSL version: 450
ERROR: 0:7: 'fColor' : undeclared identifier
ERROR: 0:7: 'assign' : cannot convert from 'const 4-component vector of float' to 'float'
My specs are the following:
Intel HD 4000
Nvidia GeForce 840M
I should note that I have compiled shaders on this same machine before; I only lost the ability after a disk format. However, every driver is up to date.
As stated in the comments, the problem turned out to be a faulty graphics-card selection for the IDE. Windows defaults to the integrated Intel HD 4000; making the NVIDIA card the one preferred by the OS fixed the problem.
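If the wrong GPU is ever suspected again, here is a minimal sketch (not part of the original answer) of logging which adapter actually created the context, placed right after GLEW initialization:

// Print the vendor/renderer strings so it is obvious whether the
// Intel HD 4000 or the GeForce 840M is providing the context.
std::cout << "Vendor:   " << glGetString(GL_VENDOR)   << '\n'
          << "Renderer: " << glGetString(GL_RENDERER) << '\n'
          << "Version:  " << glGetString(GL_VERSION)  << std::endl;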

Why is OpenGL version 0.0?

I was troubleshooting an OpenGL application on a new computer when I discovered that GLFW could not create a window with the specified version of OpenGL. I created a minimal version of the application to test the version of OpenGL created, and no matter what version I hint, the version I get is 0.0. Do I simply not have OpenGL? This seems impossible, since glxgears runs and glxinfo suggests that I have version 2.1.
#include <iostream>
#include <GLFW/glfw3.h>

int main(int argc, const char *argv[]) {
    if (!glfwInit()) {
        return 1;
    }
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 2);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 1);
    auto win = glfwCreateWindow(640, 480, "", NULL, NULL);
    if (!win) {
        return 1;
    }
    int major = 0, minor = 0;
    glfwMakeContextCurrent(win);
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    glGetIntegerv(GL_MINOR_VERSION, &minor);
    std::cout << "Initialized with OpenGL "
              << major << "." << minor << std::endl;
    glfwDestroyWindow(win);
    glfwTerminate();
}
The output of the application is "Initialized with OpenGL 0.0". A window briefly opens and closes and the application terminates without errors.
The GL_MAJOR_VERSION and GL_MINOR_VERSION queries were introduced in GL 3.0. Prior to that, they just generate a GL_INVALID_ENUM error during the glGetIntegerv call and leave your variables untouched.
You have to use glGetString(GL_VERSION) to reliably get the version number if you can't be sure you are on a >= 3.0 context. If you need the numbers, you'll have to parse the string manually.
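A minimal sketch of that fallback, assuming a desktop GL context whose version string begins with "major.minor":

#include <cstdio>

// glGetString(GL_VERSION) works on every context version; parse the
// leading "major.minor" pair out of the returned string.
const char* version = reinterpret_cast<const char*>(glGetString(GL_VERSION));
int major = 0, minor = 0;
if (version && std::sscanf(version, "%d.%d", &major, &minor) == 2) {
    std::cout << "Initialized with OpenGL " << major << "." << minor << std::endl;
}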

SDL2 Wrapper class complains `Invalid Renderer`

I'm trying to write a C++ wrapper class around some SDL2 classes.
Now I have this working code, which displays a red screen for 5 seconds (as you can see, my wrapper classes are in namespace sdl2cc):
int main(void)
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0) return 1;

    sdl2cc::Window window{"SDL_RenderClear"s,
                          sdl2cc::Rect{sdl2cc::Point{SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED},
                                       sdl2cc::Dimension{512, 512}},
                          {}};
    sdl2cc::Renderer renderer{window, {}};

    renderer.draw_color(sdl2cc::Color{sdl2cc::RGB{255,0,0}, sdl2cc::Alpha{255}});

    SDL_RenderClear(renderer.data());
    // renderer.clear();
    SDL_RenderPresent(renderer.data());
    // renderer.present();

    SDL_Delay(5000);
    SDL_Quit();
}
In the wrapper class of SDL2's SDL_Renderer I have a std::unique_ptr data member renderer_ pointing to an actual SDL_Renderer.
renderer.data() exposes this pointer (return this->renderer_.get();).
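For context, a minimal sketch of how such a wrapper might be declared (hypothetical; the post only describes it in prose), with a custom deleter so the unique_ptr destroys the SDL_Renderer correctly:

#include <memory>
#include <SDL2/SDL.h>

namespace sdl2cc {
    struct RendererDeleter {
        void operator()(SDL_Renderer* r) const { SDL_DestroyRenderer(r); }
    };

    class Renderer {
        std::unique_ptr<SDL_Renderer, RendererDeleter> renderer_;
    public:
        SDL_Renderer* data() const { return renderer_.get(); }
        // clear() and present() as shown below
    };
}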
I want to get the member functions renderer.clear() and renderer.present() to work; sadly, neither does. This is how they look:
void sdl2cc::Renderer::clear(void)
{
    if (SDL_RenderClear(this->data()) < 0)
    {
        std::cerr << "Couldn't clear rendering target with drawing color:" << ' ' << SDL_GetError() << '\n';
    }
}

void sdl2cc::Renderer::present(void)
{
    SDL_RenderPresent(this->data());
}
If I just use renderer.clear(), it will print my error message + Invalid renderer.
If I just use renderer.present(), it will show a black screen.
What is wrong?
Why are my own functions and the SDL functions not equivalent?
The problem seems to lie in the function call:
SDL_RenderClear(renderer.data()); // works

// somewhere else:
void sdl2cc::Renderer::clear(SDL_Renderer* r)
{
    if (SDL_RenderClear(r) < 0)
    {
        std::cerr << "Couldn't clear rendering target with drawing color:" << ' ' << SDL_GetError() << '\n';
    }
}

renderer.clear(renderer.data()); // doesn't work: Invalid renderer
But I still don't understand where the problem lies. To me both seem to accomplish the same thing, yet one throws an error and the other doesn't.
EDIT:
Another interesting thing: trying to step into renderer.clear() with lldb goes straight to the next line without actually stepping in... I don't even.
The problem had to do with the SDL2 libraries being linked in more than once: I compiled my own library against the SDL2 libraries and then compiled my executable against both my library and the SDL2 libraries again.
That gives the program two copies of SDL2's internal state, so a renderer created through one copy is reported as invalid by the other.

libGL errors when executing OpenGL program

I get this error when I try to execute my program:
libGL error: unable to load driver: i965_dri.so
libGL error: driver pointer missing
libGL error: failed to load driver: i965
libGL error: unable to load driver: swrast_dri.so
libGL error: failed to load driver: swrast
X Error of failed request: GLXBadFBConfig
Major opcode of failed request: 154 (GLX)
Minor opcode of failed request: 34 ()
Serial number of failed request: 42
Current serial number in output stream: 41
My code (I took it from the "OpenGL Development Cookbook" book):
#include <GL/glew.h>
#include <GL/freeglut.h>
#include <iostream>

const int WIDTH = 640;
const int HEIGHT = 480;

void OnInit()
{
    glClearColor(1, 0, 0, 0);
    std::cout << "Initialization successful" << std::endl;
}

void OnShutdown()
{
    std::cout << "Shutdown successful" << std::endl;
}

void OnResize(int nw, int nh)
{
}

void OnRender()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
    glutInitContextVersion(3, 3);
    glutInitContextFlags(GLUT_CORE_PROFILE | GLUT_DEBUG);
    glutInitContextProfile(GLUT_FORWARD_COMPATIBLE);
    glutInitWindowSize(WIDTH, HEIGHT);
    glutCreateWindow("OpenGL");

    glewExperimental = GL_TRUE;
    GLenum err = glewInit();
    if (GLEW_OK != err) {
        std::cerr << "Error: " << glewGetErrorString(err) << std::endl;
    } else if (GLEW_VERSION_3_3) {
        std::cout << "Driver supports OpenGL 3.3\n Details: " << std::endl;
    }
    std::cout << "\tUsing glew: " << glewGetString(GLEW_VERSION) << std::endl;
    std::cout << "\tVendor: " << glGetString(GL_VENDOR) << std::endl;
    std::cout << "\tRenderer: " << glGetString(GL_RENDERER) << std::endl;
    std::cout << "\tGLSL: " << glGetString(GL_SHADING_LANGUAGE_VERSION) << std::endl;

    OnInit();
    glutCloseFunc(OnShutdown);
    glutDisplayFunc(OnRender);
    glutReshapeFunc(OnResize);
    glutMainLoop();
    return 0;
}
I verified that my driver supports the OpenGL version I am using with the glxinfo | grep "OpenGL" command:
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) Sandybridge Mobile
OpenGL core profile version string: 3.3 (Core Profile) Mesa 10.5.9
OpenGL core profile shading language version string: 3.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.0 Mesa 10.5.9
OpenGL shading language version string: 1.30
OpenGL context flags: (none)
OpenGL extensions:
I am using Ubuntu 14.04.3.
I'm not sure, but I think I get this error because I am using Intel rather than NVIDIA graphics.
It's hard to tell from a distance, but the errors you have there look like a damaged OpenGL client library installation. glxinfo queries the GLX driver loaded into the Xorg server, which is somewhat independent from the installed libGL (as long as only indirect rendering calls are made). The errors indicate that the installed libGL either doesn't match the DRI drivers or the DRI libraries are damaged.
Either way, the best course of action is to do a clean reinstall of everything related to OpenGL on your system. I.e. do a forced reinstall of xorg-server, xf86-video-…, mesa, libdri… and so on.
I faced a very similar error:
X Error of failed request: GLXBadFBConfig
Major opcode of failed request: 154 (GLX)
Minor opcode of failed request: 34 ()
Serial number of failed request: 42
Current serial number in output stream: 41
Removing the following line solved it:
glutInitContextVersion(3, 3);
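A side note on the snippet in the question: the arguments to glutInitContextFlags and glutInitContextProfile appear swapped (GLUT_CORE_PROFILE belongs to glutInitContextProfile, while GLUT_FORWARD_COMPATIBLE and GLUT_DEBUG are context flags). Since the glxinfo output shows Mesa exposing 3.3 only as a core profile, a hedged alternative to removing the version hint is to request the core profile through the correct calls:

// Sketch: keep the 3.3 request, but ask for the core profile properly.
// The compatibility context on this Mesa install stops at 3.0.
glutInitContextVersion(3, 3);
glutInitContextFlags(GLUT_FORWARD_COMPATIBLE | GLUT_DEBUG);
glutInitContextProfile(GLUT_CORE_PROFILE);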