I'm trying to initialize GLEW 1.10 along with the latest version of GLFW3. I've managed to get GLFW3 to work without a problem, but GLEW is being quite difficult. I followed the initialization example from the GLEW website, but for some reason glewInit() is undefined and the identifier glewGetContext is reported as not found / undefined (errors at the bottom):
#include <GLew110\glew.h>
#define GLFW_INCLUDE_GLU
#include <GLFW/glfw3.h>
#include <stdlib.h>
#include <stdio.h>

static void error_callback(int error, const char* description)
{
    fputs(description, stderr);
}

static void key_callback(GLFWwindow* window, int key, int scancode, int action, int mods)
{
    if (key == GLFW_KEY_ESCAPE && action == GLFW_PRESS)
        glfwSetWindowShouldClose(window, GL_TRUE);
}

int main(void)
{
    GLFWwindow* window;
    glfwSetErrorCallback(error_callback);
    if (!glfwInit())
        exit(EXIT_FAILURE);
    window = glfwCreateWindow(800, 600, "Simple example", NULL, NULL);
    if (!window)
    {
        glfwTerminate();
        exit(EXIT_FAILURE);
    }
    GLenum err = glewInit(); // <---- "glewGetContext" is undefined (line 29)
    if (GLEW_OK != err)
    {
        /* Problem: glewInit failed, something is seriously wrong. */
        fprintf(stderr, "Error: %s\n", glewGetErrorString(err));
    }
    glfwMakeContextCurrent(window);
    glfwSetKeyCallback(window, key_callback);
    while (!glfwWindowShouldClose(window))
    {
        float ratio;
        int width, height;
        glfwGetFramebufferSize(window, &width, &height);
        ratio = width / (float) height;
        glViewport(0, 0, width, height);
        glClear(GL_COLOR_BUFFER_BIT);
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glOrtho(-ratio, ratio, -1.f, 1.f, 1.f, -1.f);
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glRotatef((float) glfwGetTime() * 50.f, 0.f, 0.f, 1.f);
        glBegin(GL_TRIANGLES);
        glColor3f(1.f, 0.f, 0.f);
        glVertex3f(-0.6f, -0.4f, 0.f);
        glColor3f(0.f, 1.f, 0.f);
        glVertex3f(0.6f, -0.4f, 0.f);
        glColor3f(0.f, 0.f, 1.f);
        glVertex3f(0.f, 0.6f, 0.f);
        glEnd();
        glfwSwapBuffers(window);
        glfwPollEvents();
    }
    glfwDestroyWindow(window);
    glfwTerminate();
    exit(EXIT_SUCCESS);
}
TWO ERRORS LISTED:
error C3861: 'glewGetContext': identifier not found (line 29 pointing to glewInit())
IntelliSense: identifier "glewGetContext" is undefined (line 29)
Something is defining the macro GLEW_MX, which is used to configure the GLEW library for multiple rendering contexts. Building with such a configuration requires some special preparation. The following is from http://glew.sourceforge.net/advanced.html (emphasis added):
Multiple Rendering Contexts (GLEW MX)
Starting with release 1.2.0, thread-safe support for multiple rendering contexts, possibly with different capabilities, is available. Since this is not required by most users, it is not added to the binary releases to maintain compatibility between different versions. To include multi-context support, you have to do the following:

1. Compile and use GLEW with the GLEW_MX preprocessor token defined.
2. For each rendering context, create a GLEWContext object that will be available as long as the rendering context exists.
3. Define a macro or function called glewGetContext() that returns a pointer to the GLEWContext object associated with the rendering context from which OpenGL/WGL/GLX calls are issued. This dispatch mechanism is primitive, but generic.
4. Make sure that you call glewInit() after creating the GLEWContext object in each rendering context. Note, that the GLEWContext pointer returned by glewGetContext() has to reside in global or thread-local memory.
If you don't need multiple rendering contexts, the best thing might be to find out where GLEW_MX is being set and fix it.
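(For completeness, if you actually did want multi-context support, the setup the quote describes boils down to roughly the sketch below. It assumes GLEW itself was also built with GLEW_MX defined; the single global GLEWContext is only a placeholder for brevity, whereas a real program would keep one per rendering context, e.g. in thread-local storage.)

#define GLEW_MX                      /* must also be defined when building GLEW itself */
#include <GL/glew.h>

/* One GLEWContext per rendering context; a single global one here for brevity. */
static GLEWContext g_glewContext;

/* GLEW_MX builds look up function pointers through glewGetContext(), so this
   definition must be visible wherever OpenGL calls are issued. */
GLEWContext* glewGetContext(void)
{
    return &g_glewContext;
}

/* ...then, after making the corresponding rendering context current: */
/*     GLenum err = glewInit(); */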
Just in case someone has a similar issue:
Please try calling glewInit() after calling glfwMakeContextCurrent() :)
/* Make the window's context current */
glfwMakeContextCurrent(window);
GLenum glewErr = glewInit();
if (glewErr != GLEW_OK)
    fprintf(stderr, "Error: %s\n", glewGetErrorString(glewErr));
OpenGL draws only the background; the yellow point does not appear. I want to draw it with glBegin and glEnd. The coordinates are variables because I want to move the point later. Most of the code is just GLFW initialization; the function that worries me is draw_player, since it contains the draw call. The fix I stumbled upon, using GL_POINTS instead of GL_POINT as the argument to glBegin, does not help (I continue to use it, though).
#include <GLFW/glfw3.h>
//#include <stdio.h>

//coordinates
int px, py;

//My not working function
void draw_player()
{
    glColor3f(1.0f, 1.0f, 0);
    glPointSize(64.0f);
    glBegin(GL_POINTS);
    glVertex2i(px, py);
    glEnd();
}

int main(int argc, char* argv[])
{
    GLFWwindow* window;
    if (!glfwInit())
        return -1;
    window = glfwCreateWindow(910, 512, "Raycast", NULL, NULL);
    if (!window)
    {
        glfwTerminate();
        return -1;
    }
    glfwMakeContextCurrent(window);
    glClearColor(0.1f, 0.1f, 0.5f, 1.0f);

    //setting the coordinates
    px = 100;
    py = 10;

    while (!glfwWindowShouldClose(window))
    {
        /* Render here */
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        draw_player();

        /* Swap front and back buffers */
        glfwSwapBuffers(window);

        /* Poll for and process events */
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}
The coordinate (100, 10) is not in the window. You have not specified a projection matrix. Therefore, you must specify the coordinates of the point in the normalized device space (in the range [-1.0, 1.0]).
If you want to specify the coordinates in pixel units, you must define a suitable orthographic projection with glOrtho:
int main(int argc, char* argv[])
{
    GLFWwindow* window;
    if (!glfwInit())
        return -1;
    window = glfwCreateWindow(910, 512, "Raycast", NULL, NULL);
    if (!window)
    {
        glfwTerminate();
        return -1;
    }
    glfwMakeContextCurrent(window);

    glMatrixMode(GL_PROJECTION);
    glOrtho(0, 910.0, 512.0, 0.0, -1.0, 1.0);
    glMatrixMode(GL_MODELVIEW);

    // [...]
}
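Alternatively, if you keep the default (identity) projection, the point has to be placed in normalized device coordinates directly. A minimal sketch of that variant (switching the coordinates to floats is my change for illustration, not part of the original code):

// Keep the identity projection and give the point NDC coordinates instead
float px = 0.0f, py = 0.0f;        // center of the window, range [-1.0, 1.0]

void draw_player()
{
    glColor3f(1.0f, 1.0f, 0.0f);
    glPointSize(64.0f);
    glBegin(GL_POINTS);
    glVertex2f(px, py);            // glVertex2f rather than glVertex2i for fractional values
    glEnd();
}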
I'm going to cut straight to the chase: I'm writing in C++ and I'm using GLEW (OpenGL). A recurring problem, however, is that when I first open the window and clear it, glClear doesn't fill the window with white as expected. The cleared area starts with an offset from the top-left corner, and this offset happens to be exactly the distance between my window and the top-left corner of my screen, so I suspect something funky about that. However, I haven't found anything about it online in 3 hours of searching, so I decided to ask the question myself. Attached are the code and a screenshot of the problem. (In the screen capture of the window, the red lines are the same length; they were added after the screenshot.)
#define GLEW_STATIC
#include <iostream>
#include <GL/glew.h>
#include <GLFW/glfw3.h>

GLFWwindow* window;

int main(int argc, char **argv){
    if(!glfwInit()) std::cout << "glfw failed to init" << std::endl;
    window = glfwCreateWindow(400, 400, "firstWindow", NULL, NULL);
    if(!window) std::cout << "window creation failed" << std::endl;

    while (!glfwWindowShouldClose(window)){
        if (glfwGetKey(window, GLFW_KEY_ESCAPE) == GLFW_PRESS)
            glfwSetWindowShouldClose(window, GL_TRUE);

        glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        glfwSwapBuffers(window);
        glfwPollEvents();
    }
    glfwMakeContextCurrent(window);
}
UPDATE:
I'm really new to the whole GL thing. I was told that glfwMakeContextCurrent should be called straight after glfwCreateWindow. Once I did this, the whole thing became less wrong: I now get a grey (NOT a white) background, at least partly. However, the problem persists. I removed GLEW from the code since it wasn't used.
#include <iostream>
#include <GLFW/glfw3.h>

GLFWwindow* window;

int main(int argc, char **argv){
    if(!glfwInit()) std::cout << "glfw failed to init" << std::endl;
    window = glfwCreateWindow(400, 400, "firstWindow", NULL, NULL);
    if(!window) std::cout << "window creation failed" << std::endl;
    glfwMakeContextCurrent(window);

    while (!glfwWindowShouldClose(window)){
        if (glfwGetKey(window, GLFW_KEY_ESCAPE) == GLFW_PRESS)
            glfwSetWindowShouldClose(window, GL_TRUE);

        glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        glfwSwapBuffers(window);
        glfwPollEvents();
    }
}
Unlike some other windowing libraries, GLFW doesn't leave the GL context it creates for you current after glfwCreateWindow() succeeds.
GL commands don't work too well without a current GL context so you should call glfwMakeContextCurrent() right after glfwCreateWindow(), not after you exit the event loop.
Right now I suspect you're just seeing leftover garbage pixels that the OS/driver didn't bother to clear out.
Also, seeing as you're using GLEW, you're going to want to call glewInit() after glfwMakeContextCurrent() so all the GL function pointers it declares aren't pointing off into nowhere.
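Putting both points together, the start of main() would look roughly like this (a sketch of the ordering only; it assumes GLEW is included again as in the first listing, and the rest of the loop stays as in the update):

window = glfwCreateWindow(400, 400, "firstWindow", NULL, NULL);
if (!window) std::cout << "window creation failed" << std::endl;

glfwMakeContextCurrent(window);        // make the context current first...

if (glewInit() != GLEW_OK)             // ...then let GLEW resolve the GL function pointers
    std::cout << "glew failed to init" << std::endl;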
As a quick hack, it resolves itself if you resize the window. Calling glfwSetWindowSize(window, width, height) fixes the problem if the width or height differs from the window's current dimensions. I have no idea why. Again, this is a hack and IS NOT a solution.
#include <iostream>
#include <GLFW/glfw3.h>

GLFWwindow* window;
const int width = 400;
const int height = 400;

int main(int argc, char **argv){
    if(!glfwInit()) std::cout << "glfw failed to init" << std::endl;
    window = glfwCreateWindow(width, height-1, "firstWindow", NULL, NULL);
    if(!window) std::cout << "window creation failed" << std::endl;
    glfwMakeContextCurrent(window);
    glfwSetWindowSize(window, width, height);

    while (!glfwWindowShouldClose(window)){
        if (glfwGetKey(window, GLFW_KEY_ESCAPE) == GLFW_PRESS)
            glfwSetWindowShouldClose(window, GL_TRUE);

        glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        glfwSwapBuffers(window);
        glfwPollEvents();
    }
}
I've tried to follow this tutorial for OpenGL in C, but when it comes to the second tutorial, the one that is supposed to draw a triangle on the window, I couldn't see anything. So this is what I did: I took the code that creates the OpenGL context, window and so on, and I tried to make it simpler: instead of using a VAO I tried glBegin/glEnd.
I get this error: 1282 "invalid operation". I'm just using the same statements taken directly from my LWJGL project. The main loop is so simple that I can't understand how it does not work, and the 1282 error is not giving me any information. Why do I still get an error?
#include <stdio.h>
#include <stdlib.h>

#pragma comment(lib, "glfw3.lib")
#pragma comment(lib, "glew32s.lib")
#pragma comment(lib, "opengl32.lib")
#pragma comment(lib, "glu32.lib")

// Include GLEW. Always include it before gl.h and glfw.h, since it's a bit magic.
#define GLEW_STATIC
#include <GL/glew.h>

// Include GLFW
#include <GLFW/glfw3.h>

// Include GLM
#include <glm/glm.hpp>
using namespace glm;

void checkErrors() {
    int error = glGetError();
    if (error != GL_NO_ERROR) {
        printf("%s (%d)\n", gluErrorString(error), error);
    }
}

int main(void)
{
    // Initialise GLFW
    if (!glfwInit())
    {
        fprintf(stderr, "Failed to initialize GLFW\n");
        return -1;
    }

    glfwWindowHint(GLFW_SAMPLES, 4); // 4x antialiasing
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3); // We want OpenGL 3.3
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE); // To make MacOS happy; should not be needed
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE); // We don't want the old OpenGL

    // Open a window and create its OpenGL context
    GLFWwindow* window; // (In the accompanying source code, this variable is global)
    window = glfwCreateWindow(1024, 768, "Tutorial 01", NULL, NULL);
    if (window == NULL){
        fprintf(stderr, "Failed to open GLFW window. If you have an Intel GPU, they are not 3.3 compatible. Try the 2.1 version of the tutorials.\n");
        glfwTerminate();
        return -1;
    }
    glfwMakeContextCurrent(window); // Initialize GLEW
    glewExperimental = true; // Needed in core profile
    if (glewInit() != GLEW_OK) {
        fprintf(stderr, "Failed to initialize GLEW\n");
        return -1;
    }

    // Ensure we can capture the escape key being pressed below
    glfwSetInputMode(window, GLFW_STICKY_KEYS, GL_TRUE);

    do{
        glClear(GL_COLOR_BUFFER_BIT);

        glBegin(GL_TRIANGLES);
        glVertex3f(-1.0f, -1.0f, 0.0f);
        glVertex3f( 1.0f, -1.0f, 0.0f);
        glVertex3f( 0.0f, 1.0f, 0.0f);
        glEnd();

        // Swap buffers
        glfwSwapBuffers(window);
        glfwPollEvents();

        // Check for errors
        checkErrors();
    } // Check if the ESC key was pressed or the window was closed
    while (glfwGetKey(window, GLFW_KEY_ESCAPE) != GLFW_PRESS &&
           glfwWindowShouldClose(window) == 0);

    return 0;
}
glBegin/glEnd, along with all the other immediate-mode drawing functions and some more, have been deprecated and cannot be used with OpenGL 3.1 (and up) core or forward-compatible contexts.
You can try requesting a 3.0 compatibility context (which includes all of the deprecated functionality). To do this, remove the glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE); and glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE); lines, and change the minor version hint to 0. Indeed, according to the OpenGL wiki, you should not explicitly request a forward-compatible context with 3.1 and newer anyway. However, your best bet is to figure out what's wrong with the VAO code instead of mucking around with deprecated functionality.
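A sketch of just the hint changes described above; everything else in the program stays the same:

glfwWindowHint(GLFW_SAMPLES, 4);
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);   // request OpenGL 3.0 instead of 3.3
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 0);
// No GLFW_OPENGL_FORWARD_COMPAT or GLFW_OPENGL_PROFILE hints, so the driver may
// return a context that still exposes the deprecated immediate-mode functions.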
I'm using Visual Studio 2013, and as I am learning OpenGL 3.3 I thought it best to use
#define GLFW_INCLUDE_GLCOREARB
#include <GLFW/glfw3.h>
to force IntelliSense to not even show old deprecated functions such as glVertex2f, etc.
However, the inclusion of said #define prevents any gl* functions from showing up. Even glViewport is undefined. When attempting to compile a simple application, I get, among many errors:
error C3861: 'glViewport': identifier not found
glcorearb.h is in my include files path, though, downloaded from http://www.opengl.org/registry/api/GL/glcorearb.h only yesterday.
I might be doing something completely wrong here. But here is my full source code...
// Include standard headers
#include <stdio.h>
#include <stdlib.h>

#define GLFW_INCLUDE_GLCOREARB
// Include GLFW3
#include <GLFW/glfw3.h>

//Error Callback - Outputs to STDERR
static void error_callback(int error, const char* description)
{
    fputs(description, stderr);
}

//Key Press Callback
static void key_callback(GLFWwindow* window, int key, int scancode, int action, int mods)
{
    if (key == GLFW_KEY_ESCAPE && action == GLFW_PRESS)
        glfwSetWindowShouldClose(window, GL_TRUE);
}

int main(){
    GLFWwindow* window;
    glfwSetErrorCallback(error_callback);

    // Initialise GLFW
    if (!glfwInit())
    {
        fputs("Failed to initialize GLFW\n", stderr);
        exit(EXIT_FAILURE);
    }

    glfwWindowHint(GLFW_SAMPLES, 2); // 2x antialiasing
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3); // We want OpenGL 3.3
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE); //We don't want the old OpenGL

    // Open a window and create its OpenGL context
    window = glfwCreateWindow(640, 480, "Test", NULL, NULL);
    if (!window)
    {
        glfwTerminate();
        exit(EXIT_FAILURE);
    }
    glfwMakeContextCurrent(window);
    glfwSetKeyCallback(window, key_callback);

    while (!glfwWindowShouldClose(window))
    {
        float ratio;
        int width, height;
        glfwGetFramebufferSize(window, &width, &height);
        ratio = width / (float)height;

        glViewport(0, 0, width, height);
        glClearColor(0.5f, 0.7f, 1.0f, 0.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        glfwSwapBuffers(window);
        glfwPollEvents();
    }
    glfwDestroyWindow(window);
    glfwTerminate();
    exit(EXIT_SUCCESS);
}
I compiled the following code:
// Triangle.cpp
// Our first OpenGL program that will just draw a triangle on the screen.

#include <GLTools.h>            // OpenGL toolkit
#include <GLShaderManager.h>    // Shader Manager Class

#ifdef __APPLE__
#include <glut/glut.h>          // OS X version of GLUT
#else
#define FREEGLUT_STATIC
#include <GL/glut.h>            // Windows FreeGlut equivalent
#endif

GLBatch triangleBatch;
GLShaderManager shaderManager;

///////////////////////////////////////////////////////////////////////////////
// Window has changed size, or has just been created. In either case, we need
// to use the window dimensions to set the viewport and the projection matrix.
void ChangeSize(int w, int h)
{
    glViewport(0, 0, w, h);
}

///////////////////////////////////////////////////////////////////////////////
// This function does any needed initialization on the rendering context.
// This is the first opportunity to do any OpenGL related tasks.
void SetupRC()
{
    // Blue background
    glClearColor(0.0f, 0.0f, 1.0f, 1.0f);

    shaderManager.InitializeStockShaders();

    // Load up a triangle
    GLfloat vVerts[] = { -0.5f, 0.0f, 0.0f,
                          0.5f, 0.0f, 0.0f,
                          0.0f, 0.5f, 0.0f };

    triangleBatch.Begin(GL_TRIANGLES, 3);
    triangleBatch.CopyVertexData3f(vVerts);
    triangleBatch.End();
}

///////////////////////////////////////////////////////////////////////////////
// Called to draw scene
void RenderScene(void)
{
    // Clear the window with current clearing color
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);

    GLfloat vRed[] = { 1.0f, 0.0f, 0.0f, 1.0f };
    shaderManager.UseStockShader(GLT_SHADER_IDENTITY, vRed);
    triangleBatch.Draw();

    // Perform the buffer swap to display back buffer
    glutSwapBuffers();
}

///////////////////////////////////////////////////////////////////////////////
// Main entry point for GLUT based programs
int main(int argc, char* argv[])
{
    gltSetWorkingDirectory(argv[0]);

    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH | GLUT_STENCIL);
    glutInitWindowSize(800, 600);
    glutCreateWindow("Triangle");
    glutReshapeFunc(ChangeSize);
    glutDisplayFunc(RenderScene);

    GLenum err = glewInit();
    if (GLEW_OK != err) {
        fprintf(stderr, "GLEW Error: %s\n", glewGetErrorString(err));
        return 1;
    }

    SetupRC();

    glutMainLoop();
    return 0;
}
But when I try to execute it, the program crashes, and the debugger gives me the following error:
Unhandled exception at 0x00000000 in Triangle.exe: 0xC0000005: Access violation reading location 0x00000000.
To use glCreateShader(GL_VERTEX_SHADER), you must be running OpenGL 2.0 or higher. One way to tell is to check the value of GLEW_VERSION_2_0 (after your call to glewInit()). If the value is true, then OpenGL 2.0 is supported. Otherwise, you may need to update your graphics driver or use a newer graphics card.
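A minimal sketch of that check, placed right after the existing glewInit() call in main():

GLenum err = glewInit();
if (GLEW_OK != err) {
    fprintf(stderr, "GLEW Error: %s\n", glewGetErrorString(err));
    return 1;
}

// GLEW_VERSION_2_0 is true after glewInit() only if the current context
// actually provides OpenGL 2.0, which glCreateShader requires.
if (!GLEW_VERSION_2_0) {
    fprintf(stderr, "OpenGL 2.0 is not supported; update the graphics driver or hardware.\n");
    return 1;
}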
The only way that this:
hVertexShader = glCreateShader(GL_VERTEX_SHADER)
can get a NULL pointer exception (are you sure it's this line and not the one before it?) is if glCreateShader is NULL. GLTools is part of the OpenGL SuperBible volume 5's distribution; it's not a "standard" OpenGL tool, so I can't say much about it.
But you seem to be initializing GLEW. And since you don't directly include the GLEW header, I can only guess that GLTools is including it for you. So GLEW's initialization ought to be carrying over to GLTools.
Check the value of "glCreateShader". Follow GLEW's #define for this function all the way back to the actual variable that GLEW defines, and then check this variable's value. If it is NULL, then you've got problems. Perhaps GLEW's initialization failed.
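For instance, a quick diagnostic along those lines (just a sketch; glCreateShader is a GLEW macro that expands to a function pointer GLEW fills in during glewInit()):

// If GLEW failed to resolve the entry point for the current context,
// the pointer behind the glCreateShader macro is still NULL.
if (glCreateShader == NULL) {
    fprintf(stderr, "glCreateShader was not loaded by GLEW for this context.\n");
}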