I am creating a small renderer using OpenGL and am getting an access violation only when calling glGenFramebuffers. Other GL functions work perfectly fine.
What I have tried so far
setting glewExperimental = GL_TRUE before glewInit
using glGenFramebuffersEXT instead of glGenFramebuffers - Same error
Here is the code I am using to initialize SFML and GLEW:
sf::ContextSettings settings;
settings.depthBits = 24;
settings.stencilBits = 8;
settings.majorVersion = 3;
settings.minorVersion = 3;
settings.attributeFlags = sf::ContextSettings::Core;
window = new sf::Window( sf::VideoMode( screenWidth, screenHeight, 32 ), "OpenGL SFML", sf::Style::Titlebar | sf::Style::Close, settings );
glewExperimental = GL_TRUE;
if ( GLEW_OK != glewInit( ) )
{
std::cout << "Failed to initialize GLEW" << std::endl;
return false;
}
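As an extra sanity check (assuming the access violation means GLEW never loaded the entry point), I could verify the pointer after glewInit; with GLEW, glGenFramebuffers is a macro that expands to a function pointer, so it can be compared against nullptr:
// Hypothetical check: confirm the framebuffer entry points were resolved.
if ( !GLEW_ARB_framebuffer_object || glGenFramebuffers == nullptr )
{
    std::cout << "glGenFramebuffers was not loaded by GLEW" << std::endl;
    return false;
}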
System information:
OS: Windows 10
GPU: GTX 1060
GLEW version: 2.1.0
Related
I've got all the libraries properly installed as far as I can tell, but for some reason glfwCreateWindow winds up returning NULL. I'm on a Dell XPS 15 at the moment, so I'm wondering if this has to do with the fact that I'm probably running on the integrated graphics, since this isn't demanding enough to spin up the 1050 Ti. I'm brand new to OpenGL in general, so I'm not certain that my code is properly written; I'll post it here as well:
glewExperimental = true;
if (!glewInit())
{
fprintf(stderr, "Failed to initialize GLEW!\n");
return -1;
}
glfwWindowHint(GLFW_SAMPLES, 4);
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 6);
glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
GLFWwindow* window;
window = glfwCreateWindow(1920, 1080, "Test Window", NULL, NULL);
if (window == NULL)
{
fprintf(stderr, "Failed to initialize the window.");
std::cin.ignore();
glfwTerminate();
return -1;
}
glfwMakeContextCurrent(window);
glewExperimental = true;
if (glewInit() != GLEW_OK)
{
fprintf(stderr, "Failed to initialize GLEW!");
return -1;
}
std::cin.ignore();
std::cin.ignore();
I've just updated my NVIDIA drivers to the latest version, so I hope it's (probably) not that. Unfortunately, I just can't seem to get it to open a window.
You forgot to initialize the GLFW library. GLFW has to be initialized with glfwInit before it is used.
The GLEW library has to be initialized after a valid OpenGL context has been created and made current. See Initializing GLEW.
Change your code like this to solve your issue:
if ( glfwInit() != GLFW_TRUE ) // initialize GLFW
{
// error handling
}
glfwWindowHint(GLFW_SAMPLES, 4);
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 6);
glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
GLFWwindow* window;
window = glfwCreateWindow(1920, 1080, "Test Window", NULL, NULL);
if (window == NULL)
{
// error handling
}
glfwMakeContextCurrent(window);
// now the OpenGL context is valid and current
glewExperimental = true;
if (glewInit() != GLEW_OK) // initialize GLEW
{
// error handling
}
On Windows, an Optimus-enabled driver looks for an exported variable; that is, the application has to export it so that it is accessible to other modules. E.g.:
extern "C" {
__declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}
A value of 1 means the high-performance GPU is used; 0, or the lack of any export, means the low-power GPU is used.
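AMD's switchable graphics respond to a similar export; to my knowledge the documented name is AmdPowerXpressRequestHighPerformance, but treat the exact name and type as an assumption to verify against AMD's documentation:
extern "C" {
    // Assumed AMD counterpart of NvOptimusEnablement: a nonzero value
    // requests the high-performance GPU on switchable-graphics systems.
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}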
If you're on macOS or Linux, the problem might be somewhere else. macOS is picky about core profiles (it only exposes modern OpenGL through a forward-compatible core profile), and on Linux you might have forgotten to disable kernel mode setting and the default open-source driver.
Maybe it's just me being dumb, but I'm sure this should work.
#include <SDL2/SDL.h>
#include <GL/glew.h>
struct Display
{
SDL_Window* window;
SDL_GLContext context;
};
Display* init()
{
SDL_Init( SDL_INIT_EVERYTHING );
SDL_GL_SetAttribute( SDL_GL_CONTEXT_MAJOR_VERSION, 3 );
SDL_GL_SetAttribute( SDL_GL_CONTEXT_MINOR_VERSION, 3 );
SDL_GL_SetAttribute( SDL_GL_CONTEXT_PROFILE_MASK , SDL_GL_CONTEXT_PROFILE_CORE );
SDL_Window* window = SDL_CreateWindow( "Ice Engine",
800, 600,
SDL_WINDOWPOS_CENTERED,
SDL_WINDOWPOS_CENTERED,
SDL_WINDOW_OPENGL );
if ( !window )
{
printf( "%s\n", SDL_GetError() );
return nullptr;
}
SDL_GLContext context = SDL_GL_CreateContext( window );
if ( !context )
{
printf( "%s\n", SDL_GetError() );
return nullptr;
}
glewExperimental = GL_TRUE;
if ( glewInit() != GLEW_OK )
return nullptr;
return new Display{ window, context };
}
int main( int argc, char** argv )
{
Display* display = init();
bool running = true;
SDL_Event e;
while( running )
{
while( SDL_PollEvent( &e ) )
if ( e.type == SDL_QUIT )
running = false;
SDL_GL_SwapWindow( display->window );
}
delete display;
SDL_Quit();
}
I probably shouldn't be using new and delete and such, but this was just a quick setup to get my project going. The problem is that it compiles just fine, but when I run it I get this error:
X Error of failed request: BadValue (integer parameter out of range for operation)
Major opcode of failed request: 1 (X_CreateWindow)
Value in failed request: 0x0
Serial number of failed request: 155
Current serial number in output stream: 168
I've tried without setting the OpenGL context versions, but I just get the same error.
I tried switching to GLFW3 and it all works just fine: it creates a window and an OpenGL 3.3 core profile context. So it seems to be a problem with SDL2. I'm running Ubuntu 15.10 and installed SDL2 from the command line with: sudo apt install libsdl2-dev.
You are calling SDL_CreateWindow incorrectly. You have mixed up the x, y and width, height parameters. The right way would be:
SDL_Window* window = SDL_CreateWindow("Ice Engine",
SDL_WINDOWPOS_CENTERED,
SDL_WINDOWPOS_CENTERED,
800,
600,
SDL_WINDOW_OPENGL );
See SDL_CreateWindow reference. Other than that, your code looks fine.
I have a problem with the SDL library. I'm using VS2012 Ultimate and was following this tutorial to set everything up: http://lazyfoo.net/tutorials/SDL/01_hello_SDL/index2.php. I went through it step by step a few times, but I still have problems. This is my code, very simple:
#include <iostream>
#include <SDL.h>
SDL_Surface * ekran = NULL;
int main (int argc, char *args [] )
{
SDL_Init( SDL_INIT_EVERYTHING );
ekran = SDL_SetVideoMode( 640, 480, 32, SDL_SWSURFACE );
SDL_Flip( ekran );
SDL_Delay( 2000 );
SDL_Quit();
return 0;
}
and I'm getting these errors:
error C3861: 'SDL_SetVideoMode': identifier not found
error C3861: 'SDL_Flip': identifier not found
Below is an example of how to replace SDL_SetVideoMode() in SDL2. The old way of initializing SDL is commented out and left alongside the new way for comparison. Basically, SDL2 creates a window with a title and then a surface attached to it, while SDL1 creates a surface alone and then calls the window manager to give it a name.
if (SDL_Init(SDL_INIT_VIDEO) < 0) {
fprintf(stderr, "SDL video init failed: %s\n", SDL_GetError());
return 1;
}
// SDL_Surface *screenSurface = SDL_SetVideoMode(SCREEN_WIDTH, SCREEN_HEIGHT, 32, SDL_SWSURFACE);
SDL_Window* window = NULL;
SDL_Surface* screenSurface = NULL;
window = SDL_CreateWindow("Sphere Rendering",
SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
SCREEN_WIDTH, SCREEN_HEIGHT, SDL_WINDOW_SHOWN);
if (window == NULL) {
fprintf(stderr, "Window could not be created: %s\n", SDL_GetError());
return 1;
}
screenSurface = SDL_GetWindowSurface(window);
if (!screenSurface) {
fprintf(stderr, "Screen surface could not be created: %s\n", SDL_GetError());
SDL_Quit();
return 1;
}
// SDL_WM_SetCaption("Sphere Rendering", NULL);
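SDL_Flip() from the question has a direct SDL2 counterpart too; a minimal sketch continuing from the code above:
// Draw into screenSurface here, then push it to the window.
// SDL_UpdateWindowSurface() replaces SDL 1.2's SDL_Flip().
SDL_UpdateWindowSurface(window);
SDL_Delay(2000);
SDL_DestroyWindow(window);
SDL_Quit();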
Take a look at that tutorial page again. Your code does not match it (e.g. SDL_SetVideoMode() no longer exists). Your code uses SDL 1.2 and the (updated) tutorial uses SDL 2.0. Are you using an old cached version of that page?
I ran into a strange problem recently. I wrote an application class which uses a very simple renderer to draw some models on screen. The camera is movable.
I ran the program on my laptop. Initially I noticed that nothing was being drawn on screen (the screen was being cleared to the correct color, however). Then I noticed that the screen would update itself if I clicked on the decoration frame and moved the window: this way, the models became visible, but would not move unless I clicked and moved the decoration frame again.
I tested my program on a desktop computer, and everything worked fine; the camera moved smoothly.
Eventually, I got the program to work on my laptop, but I have to set SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 0 ); and disable buffer swapping.
Below is the main application class. In the execution loop, I call the application state stack to loop & render (the application state actually owns the renderer).
In case it is of any consequence, my laptop has Intel HD 4000 graphics, and the desktop has a GTX 670.
App::App() : _running( false ),
_deltaTime( 0u ),
_elapsedTime( 0u ),
_mainWindow( nullptr ),
_glContext(),
_stack() {
//ctor
}
App::~App() {
SDL_GL_DeleteContext( _glContext );
SDL_DestroyWindow( _mainWindow );
SDL_Quit();
}
void App::execute() {
_initialize();
static const float millisecondsPerFrame = 17;
while ( _running ) {
//get the delta time & update elapsed time
uint32_t oldTime = _elapsedTime;
_elapsedTime = SDL_GetTicks();
_deltaTime = _elapsedTime - oldTime;
_processEvents();
_loop( _deltaTime / 1000.0f );
_render();
//apply possible state changes made to the stack
_stack.applyPendingChanges();
int usedTime = SDL_GetTicks() - int ( _elapsedTime );
//sleep the remainder of the cycle if we didn't use the entire update cycle
if ( millisecondsPerFrame - usedTime > 0 ) {
SDL_Delay( uint32_t ( millisecondsPerFrame - usedTime ) );
}
}
}
void App::_initialize() {
//initialize random number generator
nge::srand();
_running = true;
_initializeSDL();
_initializeOpenGL();
SDL_GL_MakeCurrent( _mainWindow, _glContext );
//attempt to set late swap tearing
int res = SDL_GL_SetSwapInterval( -1 );
//returns 0 on success
//returns -1 if swap interval is not supported
if ( res == -1 ) {
std::cout << "App::_initializeSDL> " << SDL_GetError() << "\n\n";
SDL_GL_SetSwapInterval( 1 );
}
_stack.registerState<GameState>( AppStateID::Game );
_stack.pushState( AppStateID::Game );
_stack.applyPendingChanges();
}
void App::_initializeSDL() {
SDL_Init( SDL_INIT_VIDEO );
SDL_Init( SDL_INIT_TIMER );
SDL_GL_SetAttribute( SDL_GL_CONTEXT_MAJOR_VERSION, 3 );
SDL_GL_SetAttribute( SDL_GL_CONTEXT_MINOR_VERSION, 3 );
SDL_GL_SetAttribute( SDL_GL_CONTEXT_PROFILE_MASK,
SDL_GL_CONTEXT_PROFILE_CORE );
SDL_GL_SetAttribute( SDL_GL_ACCELERATED_VISUAL, 1 );
/**
For some reason, on my Samsung Series 9, double buffering does not
work.
*/
SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 0 );
SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 16 );
SDL_GL_SetAttribute( SDL_GL_STENCIL_SIZE, 8 );
//anti-aliasing
SDL_GL_SetAttribute( SDL_GL_MULTISAMPLEBUFFERS, 1 );
SDL_GL_SetAttribute( SDL_GL_MULTISAMPLESAMPLES, 4 );
_mainWindow = SDL_CreateWindow( "window",
SDL_WINDOWPOS_UNDEFINED,
SDL_WINDOWPOS_UNDEFINED,
800,
600,
SDL_WINDOW_OPENGL |
SDL_WINDOW_RESIZABLE |
SDL_WINDOW_MAXIMIZED |
SDL_WINDOW_SHOWN );
_glContext = SDL_GL_CreateContext( _mainWindow );
}
void App::_initializeOpenGL() {
//initialize GLEW
glewExperimental = GL_TRUE;
if ( glewInit() != GLEW_OK ) {
std::cerr << "glewInit failed." << std::endl;
std::exit( EXIT_FAILURE );
}
glEnable( GL_DEPTH_TEST );
//enable culling
glEnable( GL_CULL_FACE );
glCullFace( GL_BACK );
glDepthFunc( GL_LEQUAL );
glEnable( GL_TEXTURE_CUBE_MAP_SEAMLESS );
std::cout << "OpenGL version: " << glGetString( GL_VERSION ) << std::endl;
std::cout << "GLSL version: " << glGetString( GL_SHADING_LANGUAGE_VERSION ) << std::endl;
std::cout << "Vendor: " << glGetString( GL_VENDOR ) << std::endl;
std::cout << "Renderer: " << glGetString( GL_RENDERER ) << std::endl << std::endl;
//make sure OpenGL 3.3 is available
ASSERT( GLEW_VERSION_3_3, "OpenGL 3.3 API is not available" );
}
void App::_processEvents() {
SDL_Event event;
while ( SDL_PollEvent( &event ) ) {
if ( event.type == SDL_QUIT ) {
_running = false;
}
}
}
void App::_loop( float delta ) {
_stack.loop( delta );
}
void App::_render() {
_stack.render();
//SDL_GL_SwapWindow( _mainWindow );
}
The first thing that I would check is the GPU drivers on the laptop. Make sure that the driver version matches the driver version on the desktop.
The second thing is to add error printing. From here:
window = SDL_CreateWindow("OpenGL Window", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, SDL_WINDOW_OPENGL);
if (!window) {
fprintf(stderr, "Couldn't create window: %s\n", SDL_GetError());
return;
}
context = SDL_GL_CreateContext(window);
if (!context) {
fprintf(stderr, "Couldn't create context: %s\n", SDL_GetError());
return;
}
The third thing to check is the requested buffers. Maybe the GPU or driver does not support double buffering, a 16-bit depth buffer, or some other parameter that you requested. So play with the parameters in the _initializeSDL() function, and find the combination that works on your laptop.
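A minimal sketch of how to check what was actually granted, using SDL_GL_GetAttribute after the context has been created:
int doubleBuffer = 0, depthSize = 0, msaaSamples = 0;
// SDL_GL_GetAttribute reports what the driver actually granted, which
// can differ from what SDL_GL_SetAttribute requested before creation.
SDL_GL_GetAttribute( SDL_GL_DOUBLEBUFFER, &doubleBuffer );
SDL_GL_GetAttribute( SDL_GL_DEPTH_SIZE, &depthSize );
SDL_GL_GetAttribute( SDL_GL_MULTISAMPLESAMPLES, &msaaSamples );
printf( "double buffer: %d, depth bits: %d, MSAA samples: %d\n",
        doubleBuffer, depthSize, msaaSamples );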
I've been trying to get GLEW 1.10 to play nicely with SDL 2.0.3, but GLEW won't initialize.
The problem I'm having is that GLEW 1.10 requires a function GLEWContext* glewGetContext().
I've tried using the same solution used for GLEW 1.10 with GLFW3, where you use a struct to handle the window and GLEW context, but that method doesn't work with SDL2.
The two errors I'm receiving, both pointing at glewInit(), are:
C3861: 'glewGetContext': identifier not found
IntelliSense: identifier "glewGetContext" is undefined
code:
// Create window
_screen = SDL_CreateWindow("Window", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
800, 600, SDL_WINDOW_OPENGL | SDL_WINDOW_SHOWN);
/* Create Context */
_mainContext = SDL_GL_CreateContext(_screen);
/* swap synchronized */
SDL_GL_SetSwapInterval(0);
// Initialize GLew 1.10
glewExperimental = GL_TRUE;
GLenum glewError = glewInit(); // <------------- error
if (glewError != GLEW_OK)
printf("Error with GLew. SDL Error: %s\n", SDL_GetError());