My code is below; it fails with the error: NSGL: Failed to create OpenGL pixel format.
The error callback is the standard GLFW error callback.
int main(int argc, const char * argv[]) {
    glfwSetErrorCallback(error_callback);
    if (!glfwInit()) {
        fprintf(stderr, "ERROR: could not start GLFW3\n");
        return 1;
    }

    GLFWwindow* window = glfwCreateWindow(640, 480, "Hello Triangle", NULL, NULL);
    if (!window) {
        fprintf(stderr, "\nERROR: could not open window with GLFW3\n");
        glfwTerminate();
        return -1;
    }
    glfwMakeContextCurrent(window);

    // start GLEW extension handler
    glewExperimental = GL_TRUE;
    glewInit();

    // get version info
    const GLubyte* renderer = glGetString(GL_RENDERER); // get renderer string
    const GLubyte* version = glGetString(GL_VERSION);   // version as a string
    printf("Renderer: %s\n", renderer);
    printf("OpenGL version supported %s\n", version);

    // tell GL to only draw onto a pixel if the shape is closer to the viewer
    glEnable(GL_DEPTH_TEST); // enable depth-testing
    glDepthFunc(GL_LESS);    // depth-testing interprets a smaller value as "closer"

    /* OTHER STUFF GOES HERE NEXT */

    // close GL context and any other GLFW resources
    glfwTerminate();
    return 0;
}
Does anyone know what the problem is?
On my OS X machine, this issue arose from the stencil buffer size hint, which was set to 16 bits. OS X (or the built-in graphics hardware) apparently can only handle 8 bits. Since I'm not deep into OpenGL yet, I can't fully explain this, but I will update the answer as soon as I have a better understanding.
The code responsible for setting the stencil buffer size is the following (corrected version):
glfwWindowHint(GLFW_STENCIL_BITS, 8);
which must be executed before creating the window with glfwCreateWindow(...), as in the sketch below.
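For reference, this is the ordering I mean (window size and title are just taken from the question's code):
glfwWindowHint(GLFW_STENCIL_BITS, 8);   // must come before glfwCreateWindow()

GLFWwindow* window = glfwCreateWindow(640, 480, "Hello Triangle", NULL, NULL);
if (!window) {
    fprintf(stderr, "ERROR: could not open window with GLFW3\n");
    glfwTerminate();
    return -1;
}
glfwMakeContextCurrent(window);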
Hope this helps :)
Related
There is something strange happening with gl3w's gl3wIsSupported function. When I call gl3wIsSupported(4, 0) it returns false, meaning OpenGL 4.0 isn't supported. However, when I call glGetString(GL_VERSION) it says the OpenGL version is 4.0.
Does this mean I can use OpenGL 4.0 functions?
I'm using gl3w in C++ with Visual Studio 2017.
#include <GL/gl3w.h>
#include <GLFW/glfw3.h>

int main(int argc, char** argv) {
    if (!glfwInit()) {
        FATAL_ERROR("Failed to initialise GLFW");
    }
    glfwSetErrorCallback(glfwErrorCallback);

    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 2);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 0);

    GLFWwindow* window = glfwCreateWindow(640, 480, "OpenGL", nullptr, nullptr);
    // If I put glfwMakeContextCurrent here, gl3wInit fails
    //glfwMakeContextCurrent(window);
    if (!window) {
        glfwTerminate();
        FATAL_ERROR("Window creation failed");
    }

    if (gl3wInit()) { /* handle the error */ }
    glfwMakeContextCurrent(window);

    bool support = gl3wIsSupported(4, 0);                       // returns false
    const char* version = (const char*)glGetString(GL_VERSION); // returns "4.0.0"
}
You have to make a GL context current before you call gl3wInit() or any regular OpenGL function; otherwise they won't do anything useful.
In the OpenGL wiki you can read:
The GL3W library focuses on the core profile of OpenGL 3 and 4. It only loads the core entrypoints for these OpenGL versions. It supports Windows, Mac OS X, Linux, and FreeBSD.
Note: GL3W loads core OpenGL only by default. All OpenGL extensions will be loaded if the --ext flag is specified to gl3w_gen.py.
And this is confirmed by looking inside the gl3w code:
int gl3wIsSupported(int major, int minor)
{
    if (major < 3)  // <<<<=========== SEE THIS
        return 0;
    if (version.major == major)
        return version.minor >= minor;
    return version.major >= major;
}
You are asking with glfwWindowHint for an old 2.0 version. Thus gl3wIsSupported will return false and gl3wInit will return GL3W_ERROR_OPENGL_VERSION.
As for glGetString(GL_VERSION) returning "4.0": yes, that means the driver gave you a 4.0 context and you can use it. Request it explicitly with glfwWindowHint.
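A minimal sketch of the fixed ordering, assuming a 4.0 core profile is what you want (FATAL_ERROR is the asker's own macro; error handling kept terse): make the context current first, then initialise gl3w, then query.
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 0);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

GLFWwindow* window = glfwCreateWindow(640, 480, "OpenGL", nullptr, nullptr);
if (!window) {
    glfwTerminate();
    FATAL_ERROR("Window creation failed");
}

glfwMakeContextCurrent(window);   // context must be current...
if (gl3wInit()) {                 // ...before gl3w loads the function pointers
    glfwTerminate();
    FATAL_ERROR("Failed to initialise gl3w");
}

bool support = gl3wIsSupported(4, 0);  // should now report true on a 4.0 context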
I fixed it by switching over to glad instead:
if (!glfwInit()) {
    FATAL_ERROR("Failed to initialise GLFW");
}
glfwSetErrorCallback(glfwErrorCallback);

glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);

GLFWwindow* window = glfwCreateWindow(640, 480, "OpenGL", nullptr, nullptr);
if (!window) {
    glfwTerminate();
    FATAL_ERROR("Window creation failed");
}

glfwMakeContextCurrent(window);
if (!gladLoadGLLoader((GLADloadproc)glfwGetProcAddress)) {
    glfwDestroyWindow(window);
    glfwTerminate();
    FATAL_ERROR("Failed to initialise OpenGL context");
}

PRINT("OpenGL Version: " << GLVersion.major << "." << GLVersion.minor);
I am new to the ImGui library and recently I've been trying out the included examples. Everything worked like a charm until I changed the include (and functions) from gl3w to glad (the loader I would like to use). The moment I swapped the two loaders I got a segmentation fault inside imgui_impl_glfw_gl3.cpp. I found a post suggesting that this may happen because some functions fail to "bind" and end up as null pointers.
I have located the error at line 216 of imgui_impl_glfw_gl3.cpp.
This is the code at line 216:
glGetIntegerv(GL_TEXTURE_BINDING_2D, &last_texture);
I have also changed the include in imgui_impl_glfw_gl3.cpp from gl3w to glad, with no results.
This is the main function I am executing (it's the basic OpenGL 3 example from ImGui, using glad):
#include "gui/imgui.h"
#include "gui/imgui_impl_glfw_gl3.h"
#include <stdio.h>
#include <glad/glad.h> // This example is using gl3w to access OpenGL functions (because it is small). You may use glew/glad/glLoadGen/etc. whatever already works for you.
#include <GLFW/glfw3.h>
static void error_callback(int error, const char* description)
{
    fprintf(stderr, "Error %d: %s\n", error, description);
}

int main(int, char**)
{
    // Setup window
    glfwSetErrorCallback(error_callback);
    if (!glfwInit())
        return 1;

    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    GLFWwindow* window = glfwCreateWindow(1280, 720, "ImGui OpenGL3 example", NULL, NULL);
    glfwMakeContextCurrent(window);
    glfwSwapInterval(1); // Enable vsync

    // Setup ImGui binding
    ImGui_ImplGlfwGL3_Init(window, true);

    // Setup style
    //ImGui::StyleColorsDark();
    ImGui::StyleColorsClassic();

    bool show_demo_window = true;
    bool show_another_window = false;
    bool algo = true;
    ImVec4 clear_color = ImVec4(0.45f, 0.55f, 0.60f, 1.00f);

    // Main loop
    while (!glfwWindowShouldClose(window))
    {
        glfwPollEvents();
        ImGui_ImplGlfwGL3_NewFrame();

        // 1. Show a simple window.
        // Tip: if we don't call ImGui::Begin()/ImGui::End() the widgets automatically appear in a window called "Debug".
        {
            static float f = 0.0f;
            static int counter = 0;
            ImGui::Text("Hello, world!");                         // Display some text (you can use a format string too)
            ImGui::SliderFloat("float", &f, 0.0f, 1.0f);          // Edit 1 float using a slider from 0.0f to 1.0f
            ImGui::ColorEdit3("COLORINES", (float*)&clear_color); // Edit 3 floats representing a color
            ImGui::Checkbox("Demo Window", &show_demo_window);    // Edit bools storing our windows open/close state
            ImGui::Checkbox("Booleanooooo", &algo);
            ImGui::Checkbox("Another Window", &show_another_window);
            if (ImGui::Button("Button"))                          // Buttons return true when clicked (NB: most widgets return true when edited/activated)
                counter++;
            ImGui::SameLine();
            ImGui::Text("counter = %d", counter);
            ImGui::Text("pues se ve que hay texto: %d", algo);
            ImGui::Text("Application average %.3f ms/frame (%.1f FPS)", 1000.0f / ImGui::GetIO().Framerate, ImGui::GetIO().Framerate);
        }

        {
            ImGui::Begin("VENTANA WAPA");
            ImGui::Text("POS SA QUEDAO BUENA VENTANA");
            static float yee = 0.0f;
            ImGui::SliderFloat("lel", &yee, 1.0f, 0.5f);
            ImGui::End();
        }

        // 2. Show another simple window. In most cases you will use an explicit Begin/End pair to name your windows.
        if (show_another_window)
        {
            ImGui::Begin("Another Window", &show_another_window);
            ImGui::Text("Hello from another window!");
            if (ImGui::Button("Close Me"))
                show_another_window = false;
            ImGui::End();
        }

        // 3. Show the ImGui demo window. Most of the sample code is in ImGui::ShowDemoWindow(). Read its code to learn more about Dear ImGui!
        if (show_demo_window)
        {
            ImGui::SetNextWindowPos(ImVec2(650, 20), ImGuiCond_FirstUseEver); // Normally user code doesn't need/want to call this because positions are saved in .ini file anyway. Here we just want to make the demo initial state a bit more friendly!
            ImGui::ShowDemoWindow(&show_demo_window);
        }

        // Rendering
        int display_w, display_h;
        glfwGetFramebufferSize(window, &display_w, &display_h);
        glViewport(0, 0, display_w, display_h);
        glClearColor(clear_color.x, clear_color.y, clear_color.z, clear_color.w);
        glClear(GL_COLOR_BUFFER_BIT);
        ImGui::Render();
        glfwSwapBuffers(window);
    }

    // Cleanup
    //ImGui_ImplGlfwGL3_Shutdown();
    glfwTerminate();
    return 0;
}
I have no clue why this is happening, and I'm pretty new to OpenGL and ImGui, so... any ideas? :(
Glad & gl3w are both extension loader libraries. They generally need to be initialized on a current GL context before use.
The original code called gl3wInit(). Yours is missing any sort of glad init.
Make sure you initialize glad (gladLoadGLLoader((GLADloadproc) glfwGetProcAddress)) after glfwMakeContextCurrent() and before you call any OpenGL functions.
Otherwise all the OpenGL function pointers glad declares will remain NULL. Trying to call NULL function pointers generally doesn't go well for a process.
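A minimal sketch of where the glad call would go in the posted main (same window and hints as in the question; error handling kept terse):
GLFWwindow* window = glfwCreateWindow(1280, 720, "ImGui OpenGL3 example", NULL, NULL);
glfwMakeContextCurrent(window);
glfwSwapInterval(1); // Enable vsync

// Load the OpenGL function pointers *after* the context is current,
// and before ImGui_ImplGlfwGL3_Init() or any other GL call.
if (!gladLoadGLLoader((GLADloadproc)glfwGetProcAddress)) {
    fprintf(stderr, "Failed to initialise glad\n");
    return 1;
}

// Setup ImGui binding
ImGui_ImplGlfwGL3_Init(window, true);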
I tried with the following methods:
1. using GLEW
2. using GLUT
Both in almost the same way, as follows:
#include <stdio.h>
#include <stdlib.h>
#include <GL/glew.h>
#include <GLFW/glfw3.h>

int main(int argc, char **argv)
{
    // do windowing related stuff here
    if (!glfwInit())
    {
        printf("Error: Failed to initialize GLFW\n");
        return -1;
    }

    GLFWwindow* window = glfwCreateWindow(800, 600, "Triangle", NULL, NULL);
    if (window == NULL)
    {
        printf("Failed to create GLFW window\n");
        glfwTerminate();
        return -1;
    }
    glfwMakeContextCurrent(window);

    glewExperimental = GL_TRUE;
    if (glewInit() != GLEW_OK)
    {
        printf("Error: Failed to initialize GLEW\n");
        return -1;
    }

    printf("GL version: %s\n", glGetString(GL_VERSION));
    printf("GL shading language version: %s\n",
           glGetString(GL_SHADING_LANGUAGE_VERSION));

    return 0;
}
Question: Is it possible to check the GL and GLSL version without creating a native window?
As I understand it, a GL context is required, and that is usually obtained by creating a window. Please tell me an alternative that does not require creating a window.
According to the OpenGL Wiki FAQ (emphasis mine):
You must create a GL context in order for your GL function calls to make sense. You can't just write a minimal program such as this:
int main(int argc, char **argv)
{
    char *GL_version  = (char *)glGetString(GL_VERSION);
    char *GL_vendor   = (char *)glGetString(GL_VENDOR);
    char *GL_renderer = (char *)glGetString(GL_RENDERER);
    return 0;
}
In the above, the programmer simply wants to get information about this system (without rendering anything) but it simply won't work because no communication has been established with the GL driver. The GL driver also needs to allocate resources with respect to the window such as a backbuffer. Based on the pixelformat you have chosen, there can be a color buffer with some format such as GL_BGRA8. There may or may not be a depth buffer. The depth might contain 24 bits. There might be a 8 bit stencil. There might be an accumulation buffer. Perhaps the pixelformat you have chosen can do multisampling. Up until now, no one has introduced a windowless context.
You must create a window. You must select a pixelformat. You must create a GL context. You must make the GL context current (wglMakeCurrent for Windows and glXMakeCurrent for *nix).
That said, if you're just looking to avoid a temporary window popping up, create a non-visible window, so the end-user doesn't have any idea that you're creating the window. In GLFW, it appears you can do this by setting the window hint GLFW_VISIBLE to false with glfwWindowHint before creating the window. All other windowing systems I've worked with have a similar concept of setting the visibility of a window.
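A minimal sketch of that approach, reusing the asker's GLEW/GLFW setup (GLFW_FALSE requires GLFW 3.2+; on older versions use GL_FALSE):
#include <stdio.h>
#include <GL/glew.h>
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return -1;

    // Ask GLFW for an invisible window; the end-user never sees it.
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);

    GLFWwindow* window = glfwCreateWindow(64, 64, "version probe", NULL, NULL);
    if (!window)
    {
        glfwTerminate();
        return -1;
    }
    glfwMakeContextCurrent(window);

    glewExperimental = GL_TRUE;
    if (glewInit() != GLEW_OK)
    {
        glfwTerminate();
        return -1;
    }

    printf("GL version: %s\n", glGetString(GL_VERSION));
    printf("GL shading language version: %s\n",
           glGetString(GL_SHADING_LANGUAGE_VERSION));

    glfwDestroyWindow(window);
    glfwTerminate();
    return 0;
}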
I've been trying to start a new SDL + GLEW + OpenGL project, and the setup has been difficult (MinGW-w64-32 on Windows 8 64-bit with an Intel i7-4700MQ CPU and NVidia GTX 745M GPU).
If I set the GL attributes to be used for context creation to use OpenGL version 4.2, the color and depth bit sizes get set to 0. However, if I request a 2.1 context (which is also the default), I can get the requested bit depths (8 bits for each color, 24 bits for depth). In either case, however, glClearColor has no effect (just a black background).
In both cases, the result of a few glGetString calls is the same: a 4.2 context, suggesting SDL's output is far from correct.
The entirety of the code can be found here; it's mostly boilerplate for a larger project for now. The relevant sections are most likely these:
if (SDL_Init(SDL_INIT_EVERYTHING) != 0) {
    std::cerr << "Error initializing SDL.\n";
    exit(1);
}

SDL_GL_SetAttribute(SDL_GL_ACCELERATED_VISUAL, 1);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 2);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);
//SDL_GL_SetAttribute(SDL_GL_CONTEXT_FLAGS, SDL_GL_CONTEXT_FORWARD_COMPATIBLE_FLAG);
SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

SDL_Window* window = SDL_CreateWindow("RenderSystem", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                      640, 480, SDL_WINDOW_OPENGL | SDL_WINDOW_RESIZABLE);
if (!window) {
    std::cerr << "Window creation failed.\n";
    SDL_Quit();
    exit(2);
}

SDL_GLContext context = SDL_GL_CreateContext(window);
if (!context) {
    std::cerr << "OpenGL Context creation failed.\n";
    SDL_DestroyWindow(window);
    SDL_Quit();
    exit(3);
}
SDL_GL_MakeCurrent(window, context);
and
glClearColor(1.0f, 1.0f, 1.0f, 1.0f);

SDL_Event evt;
bool run = true;
while (run) {
    SDL_PollEvent(&evt);
    switch (evt.type) {
        case SDL_KEYDOWN:
            if (evt.key.keysym.sym == SDLK_ESCAPE) {
                run = false;
            }
            break;
        case SDL_QUIT:
            run = false;
            break;
    }
    SDL_GL_SwapWindow(window);
}
You wrote: "I'm using SDL's built-in function SDL_GL_GetAttribute which returns the values that SDL uses to create the context, not the actual context attributes (as I understand it)."
This is incorrect. I took a look at the SDL implementation of SDL_GL_GetAttribute (...) (see src/video/SDL_video.c) and it does what I described: it queries the actual GL context. You cannot query those values on a core profile context because they are not defined for the default framebuffer.
Here is where the problem comes from:
int
SDL_GL_GetAttribute(SDL_GLattr attr, int *value)
{
    // ...
    switch (attr) {
    case SDL_GL_RED_SIZE:
        attrib = GL_RED_BITS;
        break;
    case SDL_GL_BLUE_SIZE:
        attrib = GL_BLUE_BITS;
        break;
    case SDL_GL_GREEN_SIZE:
        attrib = GL_GREEN_BITS;
        break;
    case SDL_GL_ALPHA_SIZE:
        attrib = GL_ALPHA_BITS;
        break;
    }
    // ...
    glGetIntegervFunc(attrib, (GLint *) value);
    error = glGetErrorFunc();
}
That code actually generates a GL_INVALID_ENUM error on a core profile, and the return value of SDL_GL_GetAttribute (...) should be non-zero as a result.
If you must get meaningful values from SDL_GL_GetAttribute (...) for bit depths, then that means you must use a compatibility profile. SDL2 does not extract this information from the pixel format it selected (smarter frameworks like GLFW do this), but it naively tries to query it from GL.
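A sketch of that workaround, assuming you can live with a compatibility profile (SDL_GL_CONTEXT_PROFILE_COMPATIBILITY is a standard SDL2 enum; the rest follows the question's code):
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 4);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_COMPATIBILITY);

// ... create the window and context as before, then make them current ...

// On a compatibility profile the default framebuffer bit depths are defined,
// so these queries return meaningful values instead of 0.
int red_bits = 0, depth_bits = 0;
SDL_GL_GetAttribute(SDL_GL_RED_SIZE, &red_bits);
SDL_GL_GetAttribute(SDL_GL_DEPTH_SIZE, &depth_bits);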
If memory serves, SDL_GL_DEPTH_SIZE has to be the sum of all the color channel sizes.
Using four color channels:
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 32);
If you were using three color channels, it would be:
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);
I've had problems with this before, so it might be the issue here.
Sorry for my English.
I'm starting to develop with OpenGL 3 (I'm used to 1, so it's quite a change), and I'm using SDL as my windowing/image/sound/event framework. I have the following code (taken from opengl.org and slightly modified):
#include <stdio.h>
#include <stdlib.h>

/* If using gl3.h */
/* Ensure we are using opengl's core profile only */
#define GL3_PROTOTYPES 1
#include <OpenGL/gl3.h>

#include <SDL2/SDL.h>

#define PROGRAM_NAME "Tutorial1"

/* A simple function that prints a message, the error code returned by SDL,
 * and quits the application */
void sdldie(const char *msg)
{
    printf("%s: %s\n", msg, SDL_GetError());
    SDL_Quit();
    exit(1);
}

void checkSDLError(int line = -1)
{
#ifndef NDEBUG
    const char *error = SDL_GetError();
    if (*error != '\0')
    {
        printf("SDL Error: %s\n", error);
        if (line != -1)
            printf(" + line: %i\n", line);
        SDL_ClearError();
    }
#endif
}

void render(SDL_Window* win)
{
    glClearColor(1.0, 0.0, 0.0, 1.0);
    glClear(GL_COLOR_BUFFER_BIT);
    SDL_GL_SwapWindow(win);
}

/* Our program's entry point */
int main(int argc, char *argv[])
{
    SDL_Window *mainwindow;    /* Our window handle */
    SDL_GLContext maincontext; /* Our opengl context handle */

    if (SDL_Init(SDL_INIT_VIDEO) < 0) /* Initialize SDL's Video subsystem */
        sdldie("Unable to initialize SDL"); /* Or die on error */

    /* Request opengl 3.2 context.
     * SDL doesn't have the ability to choose which profile at this time of writing,
     * but it should default to the core profile */
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);

    /* Turn on double buffering with a 24bit Z buffer.
     * You may need to change this to 16 or 32 for your system */
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);

    /* Create our window centered at 512x512 resolution */
    mainwindow = SDL_CreateWindow(PROGRAM_NAME, SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                  512, 512, SDL_WINDOW_OPENGL | SDL_WINDOW_SHOWN);
    if (!mainwindow) /* Die if creation failed */
        sdldie("Unable to create window");

    checkSDLError(__LINE__);

    /* Create our opengl context and attach it to our window */
    maincontext = SDL_GL_CreateContext(mainwindow);
    checkSDLError(__LINE__);

    /* This makes our buffer swap synchronized with the monitor's vertical refresh */
    SDL_GL_SetSwapInterval(1);

    render(mainwindow);
    SDL_Delay(2000);

    /* Delete our opengl context, destroy our window, and shutdown SDL */
    SDL_GL_DeleteContext(maincontext);
    SDL_DestroyWindow(mainwindow);
    SDL_Quit();
    return 0;
}
And this works well. But if I change this line:
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
To this:
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
so that it uses OpenGL 3.3 instead of 3.2, I get an EXC_BAD_ACCESS error on every single OpenGL call. I do want to use OpenGL 3.3. My computer:
MacBook Pro retina late 2012
Mac OS X Mountain Lion
Intel i7 2.7 GHz
Intel HD 4000
NVidia GeForce G
Does anyone know whether this is an OS X issue, an SDL issue, or something wrong with my code? (I know the SDL code may not be the best, but it was originally SDL 1.2 code.)
Because OS X only has OpenGL capabilities up to 3.2.
If you call SDL_GetError, it will tell you what is wrong.
Edit: OS X Mavericks (10.9) now supports 4.1.
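A small sketch of that check, using the same variables and helper as in the question's code: SDL_GL_CreateContext returns NULL on failure, and sdldie then prints SDL_GetError, which explains why (for example, that the requested GL version is not supported).
maincontext = SDL_GL_CreateContext(mainwindow);
if (!maincontext)
    sdldie("Unable to create OpenGL context"); /* prints SDL_GetError() */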
OpenGL 3.3 and above is not available on OS X for now.
However, OpenGL 4.1 will be available on the latest OS X 10.9 Mavericks... so stay tuned :)