GLFW program crashing in WSL2 - C++

I'm trying to run the example GLFW program on WSL2 with VcXsrv as the X server. The program below runs without problems as long as the two marked lines stay commented out. When I uncomment only glClear(GL_COLOR_BUFFER_BIT);, it lags a bit when I drag the window around. When I uncomment only glfwSwapBuffers(window);, the program hangs as soon as the window pops up: dragging the window around becomes very laggy, and I am unable to close the window without terminating the app through the terminal. Why is this happening? Is there any way to improve the speed of these programs on WSL2?
Program:
#include <iostream>
#include <GLFW/glfw3.h>

static void glfwError(int /*id*/, const char* description)
{
    std::cerr << description << '\n';
}

int main(void)
{
    GLFWwindow* window;
    glfwSetErrorCallback(&glfwError);

    /* Initialize the library */
    if (!glfwInit())
        return -1;

    /* Create a windowed mode window and its OpenGL context */
    window = glfwCreateWindow(640, 480, "Hello World", NULL, NULL);
    if (!window)
    {
        glfwTerminate();
        return -1;
    }

    /* Make the window's context current */
    glfwMakeContextCurrent(window);

    /* Loop until the user closes the window */
    while (!glfwWindowShouldClose(window))
    {
        /* Render here */
        // glClear(GL_COLOR_BUFFER_BIT);

        /* Swap front and back buffers */
        // glfwSwapBuffers(window);

        /* Poll for and process events */
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
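One low-risk experiment (an assumption on my part, not a confirmed fix for the WSL2/VcXsrv combination) is to disable vsync right after making the context current: a buffer swap that blocks on the X server's vertical-blank synchronization can produce exactly this kind of freeze, and some drivers enable vsync by default.

    glfwMakeContextCurrent(window);
    glfwSwapInterval(0); /* request immediate swaps instead of waiting for vblank */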

Related

GLFW WSL2: DISPLAY environment variable missing

I've installed WSL2 on Windows 10 along with GLFW. When I try to run the default GLFW example program, I run into the following error: X11: The DISPLAY environment variable is missing. I build the program with this command line:
g++ main.cpp -lglfw -lGL -o main
Program:
#include <stdio.h>
#include <cstdlib>   /* for exit() */
#include <iostream>
#include <GLFW/glfw3.h>

static void glfwError(int /*id*/, const char* description)
{
    std::cerr << description << '\n';
    exit(0);
}

int main(void)
{
    GLFWwindow* window;
    glfwSetErrorCallback(&glfwError);

    /* Initialize the library */
    if (!glfwInit())
        return -1;

    /* Create a windowed mode window and its OpenGL context */
    window = glfwCreateWindow(640, 480, "Hello World", NULL, NULL);
    if (!window)
    {
        glfwTerminate();
        return -1;
    }

    /* Make the window's context current */
    glfwMakeContextCurrent(window);

    /* Loop until the user closes the window */
    while (!glfwWindowShouldClose(window))
    {
        /* Render here */
        glClear(GL_COLOR_BUFFER_BIT);

        /* Swap front and back buffers */
        glfwSwapBuffers(window);

        /* Poll for and process events */
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
How can I fix this issue?
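For reference, the usual cause is that DISPLAY is simply unset: WSL2 runs in a lightweight VM, so an X server on the Windows side has to be reached over the virtual network rather than on localhost. A common workaround (assuming VcXsrv is running on display :0 with access control disabled; adjust to your setup) is to point DISPLAY at the Windows host, whose address WSL2 writes into /etc/resolv.conf:

    export DISPLAY=$(awk '/nameserver/ {print $2; exit}' /etc/resolv.conf):0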

Why do I need to use #define in Code::Blocks but not in Visual Studio? (GLFW)

I'm starting to learn OpenGL. I prefer Code::Blocks, but the GLFW3 setup there is odd. I copied the example code from the GLFW documentation, which was written for Visual Studio, but in Code::Blocks I had to add an extra line, #define GLFW_DLL, to make it work.
#define GLFW_DLL
#include <GLFW/glfw3.h>
#include <iostream>

int main(void)
{
    GLFWwindow* window;

    /* Initialize the library */
    if (!glfwInit())
        return -1;

    /* Create a windowed mode window and its OpenGL context */
    window = glfwCreateWindow(640, 480, "Hello World", NULL, NULL);
    if (!window)
    {
        glfwTerminate();
        return -1;
    }

    /* Make the window's context current */
    glfwMakeContextCurrent(window);

    /* Loop until the user closes the window */
    while (!glfwWindowShouldClose(window))
    {
        /* Render here */
        glClear(GL_COLOR_BUFFER_BIT);

        /* Swap front and back buffers */
        glfwSwapBuffers(window);

        /* Poll for and process events */
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
So why do I need to add an extra #define line in Code::Blocks, but not in Visual Studio, to make it work? I followed the example code from the GLFW documentation.
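For context, what the define does (per the GLFW documentation; the exact file names below are assumptions that depend on how GLFW was installed): GLFW_DLL makes glfw3.h declare the GLFW functions as __declspec(dllimport), which is required when linking against the DLL version of the library and wrong when linking statically. Typical Visual Studio setups link the static glfw3.lib, so they never need it.

    // Linking against the DLL (e.g. glfw3dll.lib or libglfw3dll.a plus glfw3.dll),
    // as in a typical MinGW/Code::Blocks setup: the define is required.
    #define GLFW_DLL
    #include <GLFW/glfw3.h>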

OpenGL initialization problem with GLEW and GLFW

I'm starting to learn OpenGL, but unfortunately I can't get the initialization right.
I've added the GLFW and GLEW libraries, and these functions give me a strange error.
How can I make it work?
The code:
#include <iostream>
#include <GL/glew.h>
#include <GLFW/glfw3.h>

int main(void)
{
    GLFWwindow* window;

    /* Initialize the library */
    if (!glfwInit())
    {
        std::cout << "GLFW initialization failed.\n";
        return -1;
    }

    if (glewInit() != GLEW_OK)
    {
        std::cout << "GLEW initialization failed.\n";
        return -1;
    }

    window = glfwCreateWindow(640, 480, "Hello World", NULL, NULL);
    if (!window)
    {
        glfwTerminate();
        std::cout << "Window failed.\n";
        return -1;
    }

    /* Make the window's context current */
    glfwMakeContextCurrent(window);

    /* Loop until the user closes the window */
    while (!glfwWindowShouldClose(window))
    {
        /* Render here */
        glClear(GL_COLOR_BUFFER_BIT);

        /* Swap front and back buffers */
        glfwSwapBuffers(window);

        /* Poll for and process events */
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
The errors:

To link the GLEW library correctly, the proper preprocessor definitions have to be set. See GLEW - Installation:

[...] On Windows, you also need to define the GLEW_STATIC preprocessor token when building a static library or executable, and the GLEW_BUILD preprocessor token when building a dll [...]
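In practice that means defining the token before glew.h is included, or (more robustly) adding it to the project's preprocessor definitions. A minimal sketch for the static-linking case (assuming you link the static GLEW library):

    #define GLEW_STATIC          // must come before the glew.h include when linking statically
    #include <GL/glew.h>
    #include <GLFW/glfw3.h>

Separately, note that glewInit() requires a current OpenGL context, so in the code above the glewInit() call should be moved after glfwMakeContextCurrent(window).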

Visual Studio 2017 OpenGL starts very slowly

I'm new to OpenGL; in fact, I just started following along with a great resource called LearnOpenGL, which teaches how to set up GLFW and GLAD to use OpenGL in Visual Studio 2017. However, I've noticed that even an extremely basic program that only builds a blank, resizable window (following along with the learning material) consistently takes Visual Studio 40 or so seconds to actually load. It runs fine once it's up; it simply takes an annoyingly long time to open.
I know this isn't supposed to take so long, because from searching around and watching videos of people using these same libs/includes, their sample programs load up instantly.
I was hoping to find out why this is the case. For reference, the sample code I'm running is shown below.
#include <glad/glad.h>
#include <GLFW/glfw3.h>
#include <iostream>

void framebuffer_size_callback(GLFWwindow* window, int width, int height);

int main() {
    // glfw: initialize and configure
    // ------------------------------
    glfwInit();
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

    // glfw window creation
    // --------------------
    GLFWwindow* window = glfwCreateWindow(800, 600, "LearnOpenGL", NULL, NULL);
    if (window == NULL) {
        std::cout << "Failed to create GLFW window" << std::endl;
        glfwTerminate();
        return -1;
    }
    glfwMakeContextCurrent(window);
    glfwSetFramebufferSizeCallback(window, framebuffer_size_callback);

    // glad: load all OpenGL function pointers
    // ---------------------------------------
    if (!gladLoadGLLoader((GLADloadproc)glfwGetProcAddress)) {
        std::cout << "Failed to initialize GLAD" << std::endl;
        return -1;
    }

    // render loop
    // -----------
    while (!glfwWindowShouldClose(window)) {
        // glfw: swap buffers and poll IO events (keys pressed/released, mouse moved etc.)
        // --------------------------------------------------------------------------------
        glfwSwapBuffers(window);
        glfwPollEvents();
    }

    // glfw: terminate, clearing all previously allocated GLFW resources
    // ------------------------------------------------------------------
    glfwTerminate();
    return 0;
}

// glfw: whenever the window size is changed (by OS or user resize) this callback function executes
// -------------------------------------------------------------------------------------------------
void framebuffer_size_callback(GLFWwindow* window, int width, int height) {
    // make sure the viewport matches the new window dimensions; note that width and
    // height will be significantly larger than specified on retina displays.
    glViewport(0, 0, width, height);
}
Just a final clarification: I do not mean that any kind of frame rate is slow. I mean that from the time I tell Visual Studio to run the project (Ctrl+F5, or running the debugger) to the time I can actually see the OpenGL window takes approximately 40 seconds or more.
Also, when I run the code, a cmd prompt opens behind it. I was wondering why that happens and whether there is a way to prevent it, or if it is caused by a setting.
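On the console window specifically: with MSVC's default /SUBSYSTEM:CONSOLE, every run opens a console. A hedged sketch of one common workaround, which keeps the ordinary main() entry point while switching to the GUI subsystem (setting Linker > System > SubSystem to Windows in the project properties achieves the same thing):

    // MSVC-specific: build as a GUI application so no console window appears,
    // but keep main() as the entry point via the CRT startup routine.
    #pragma comment(linker, "/SUBSYSTEM:WINDOWS /ENTRY:mainCRTStartup")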

crazy flashing window OpenGL GLFW

I have completed the following video tutorial:
https://www.youtube.com/watch?v=shpdt6hCsT4
However, my Hello World window looked like this:
http://s1303.photobucket.com/user/eskimo___/media/screenshot_124_zps890ae561.jpg.html
Any ideas where I've gone wrong? I'm using OS X Yosemite with the latest GLFW.
Cheers.
As requested: my project folder comprises three files, which were created as part of the process using the terminal:
main.cpp (C++ source code)
Makefile (txt)
test (Unix executable file)
I've set up the glfw3 library on my Mac using Homebrew.
main.cpp, which is what is run to produce the undesired effect in the window pictured, contains the example code from the documentation section of GLFW's website:
#include <GLFW/glfw3.h>

int main(void)
{
    GLFWwindow* window;

    /* Initialize the library */
    if (!glfwInit())
        return -1;

    /* Create a windowed mode window and its OpenGL context */
    window = glfwCreateWindow(640, 480, "Hello World", NULL, NULL);
    if (!window)
    {
        glfwTerminate();
        return -1;
    }

    /* Make the window's context current */
    glfwMakeContextCurrent(window);

    /* Loop until the user closes the window */
    while (!glfwWindowShouldClose(window))
    {
        /* Render here */

        /* Swap front and back buffers */
        glfwSwapBuffers(window);

        /* Poll for and process events */
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
Insert glClear(GL_COLOR_BUFFER_BIT) prior to glfwSwapBuffers - without it, the window is basically updated from uninitialized 'framebuffer' memory, which the GL implementation has probably used for other purposes (backing store, textures, etc.) in the Quartz compositor.
Think of it like a call to malloc: the memory isn't required to be initialized or zeroed.
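Applied to the example above, the render loop becomes (a minimal sketch of the fix described in this answer):

    /* Loop until the user closes the window */
    while (!glfwWindowShouldClose(window))
    {
        /* Render here: clear the back buffer so uninitialized memory is never presented */
        glClear(GL_COLOR_BUFFER_BIT);

        /* Swap front and back buffers */
        glfwSwapBuffers(window);

        /* Poll for and process events */
        glfwPollEvents();
    }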