GL/glx isn't linking correctly - c++

System Specs and task
I am using Code::Blocks on Ubuntu 10.10 and playing around with OpenGL and glx. I'm in the process of learning C++ (coming from a background in C and Java), so my code style may not follow any particular standard (but I'm open to suggestions on how to improve, even if you don't have an answer to the question).
Edit:
Huge Realization: The default OpenGL Project Code::Blocks creates is C, not C++. I'm looking into this now.
I'm trying to modify the default OpenGL project in Code::Blocks into a simple 3D engine. I am currently getting the error:
expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before 'Draw'
The error disappears as soon as I comment out the #include for <GL/glx.h>.
I read on a forum somewhere that Code::Blocks doesn't look in /usr/include/ by default, but I added that to the search directories for the compiler in the project build options and it didn't seem to fix anything.
Code:
main.cpp:
#include <time.h>
#include "Draw.h"
#include <stdio.h>
#include <stdlib.h>
int main(int argc, char **argv)
{
/*draw init here*/
Draw::Draw renderer = Draw::Draw.getDraw();
printf( "Press left mouse button to rotate around X axis\n" );
printf( "Press middle mouse button to rotate around Y axis\n" );
printf( "Press right mouse button to rotate around Z axis\n" );
printf( "Press ESC to quit the application\n" );
/* timing variable*/
/* Set it to delay half a second before rendering the first frame*/
clock_t flip_time = clock() + 0.5f * CLOCKS_PER_SEC;
while (1)
{
/* Update models */
/* Draw scene */
/* wait until it's been 1/60th of a second*/
while(clock() < flip_time){}
flip_time = clock() + (1.0f/60.0f) * CLOCKS_PER_SEC;
/* Actually flip the frame */
}
}
Draw.h:
#ifndef DRAW_H
#define DRAW_H
#include <GL/glx.h> /* This is the problem line */
#include <GL/gl.h>
#include <X11/X.h> /* X11 constant (e.g. TrueColor) */
#include <X11/keysym.h>
class Draw
{
public:
static Draw getDraw();
virtual ~Draw();
void update();
void render();
protected:
private:
Draw();
bool init();
/* The singleton*/
static Draw *instance;
static bool exists;
/* X Window values */
Display *dpy;
Window win;
GLboolean doubleBuffer;
/* X Parameters*/
XVisualInfo *vi;
Colormap cmap;
XSetWindowAttributes swa;
GLXContext cx;
XEvent event;
int dummy;
};
#endif // DRAW_H
Last, but not least Draw.cpp:
#include "Draw.h"
/* Set up the singleton*/
bool Draw::exists = false;
Draw* Draw::instance = NULL;
Draw::Draw()
{
/*TODO: make this constructor */
}
Draw::~Draw()
{
//dtor
}
Draw Draw::getDraw()
{
if(!exists)
{
instance = new Draw();
instance->init();
exists = true; //Thanks mat, This line was accidentally removed with extraneous comments
}
return *instance;
}
bool Draw::init()
{
/* Get the buffers ready */
static int snglBuf[] = {GLX_RGBA, GLX_DEPTH_SIZE, 16, None};
static int dblBuf[] = {GLX_RGBA, GLX_DEPTH_SIZE, 16, GLX_DOUBLEBUFFER, None};
/* Double Buffered is best*/
doubleBuffer = GL_TRUE;
/*TODO: add constructor if it hasn't been constructed already*/
dpy = XOpenDisplay(NULL);
if (dpy == NULL)
{
return false;
}
/* make sure OpenGL's GLX extension supported */
if(!glXQueryExtension(dpy, &dummy, &dummy))
{
return false;
}
/* find an appropriate visual */
/* find an OpenGL-capable RGB visual with depth buffer */
vi = glXChooseVisual(dpy, DefaultScreen(dpy), dblBuf);
if (vi == NULL)
{
vi = glXChooseVisual(dpy, DefaultScreen(dpy), snglBuf);
if (vi == NULL)
{
return false;
}
doubleBuffer = GL_FALSE;
}
/*
TODO: Fix or remove this
if(vi->class != TrueColor)
{
return false;
}
*/
/* create an OpenGL rendering context */
cx = glXCreateContext(dpy, vi, /* no shared dlists */ None,
/* direct rendering if possible */ GL_TRUE);
if (cx == NULL)
{
return false;
}
/* create an X window with the selected visual */
/* create an X colormap since probably not using default visual */
cmap = XCreateColormap(dpy, RootWindow(dpy, vi->screen), vi->visual, AllocNone);
swa.colormap = cmap;
swa.border_pixel = 0;
swa.event_mask = KeyPressMask | ExposureMask
| ButtonPressMask | StructureNotifyMask;
win = XCreateWindow(dpy, RootWindow(dpy, vi->screen), 0, 0,
300, 300, 0, vi->depth, InputOutput, vi->visual,
CWBorderPixel | CWColormap | CWEventMask, &swa);
XSetStandardProperties(dpy, win, "main", "main", None,
NULL, NULL, NULL);
/* bind the rendering context to the window */
glXMakeCurrent(dpy, win, cx);
/* request the X window to be displayed on the screen */
XMapWindow(dpy, win);
/* configure the OpenGL context for rendering */
glEnable(GL_DEPTH_TEST); /* enable depth buffering */
glDepthFunc(GL_LESS); /* pedantic, GL_LESS is the default */
glClearDepth(1.0); /* pedantic, 1.0 is the default */
/* frame buffer clears should be to black */
glClearColor(0.0, 0.0, 0.0, 0.0);
/* set up projection transform */
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glFrustum(-1.0, 1.0, -1.0, 1.0, 1.0, 10.0);
/* establish initial viewport */
/* pedantic, full window size is default viewport */
glViewport(0, 0, 300, 300);
return true;
}
void Draw::update()
{
/*TODO: Add things to draw here*/
}
void Draw::render()
{
/* actually flip buffers here */
}
I removed a ton of comments before posting this here, but that shouldn't affect whether or not it compiles.
Thanks!

This line in your main file is wrong:
Draw::Draw renderer = Draw::Draw.getDraw();
Draw::Draw is not a type. To get this to compile, you just need:
Draw renderer = Draw::getDraw();
It looks like you're trying to build a singleton. Your code does not do that at all; you'll get a copy each time. (Note that you're not setting exists anywhere, but that's just an extra bug.) You should be returning a pointer or a reference to the shared instance. See for instance this article to get the syntax right: C++ Singleton design pattern.
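For reference, here is a minimal sketch of the reference-returning approach (assuming you keep the class name Draw; this is an illustration, not your actual code):
// Sketch only: a reference-returning singleton in the spirit of the linked article.
class Draw
{
public:
    static Draw& getDraw()          // every caller shares the same instance
    {
        static Draw instance;       // constructed on first call, destroyed at exit
        return instance;
    }
private:
    Draw() { /* the init() work could go here */ }
    Draw(const Draw&);              // not copyable (declared, never defined)
    Draw& operator=(const Draw&);
};
// Usage: Draw& renderer = Draw::getDraw();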

I found the issue with the linking.
The default project for OpenGL in Code::Blocks is NOT C++, it's C. I configured it to use g++ and that fixed the issue with glx not linking in correctly. I also revised my singleton to look more like the pattern suggested above, and it works correctly now too. I have an issue now with the window not appearing, but I should be able to figure that out.
Thanks!

Related

OpenGL GLFW not correctly recognising window resize

I have a graph that I need to resize along with the window. As of now, it changes the axis sizes perfectly while resizing, but when I let go of the resize (stop clicking on the corner of the window) the application's graph limits revert back to something entirely incorrect.
Here is a MRE displaying the issue:
#include <GL/glew.h>
#ifndef GLFW_INCLUDE_NONE
#define GLFW_INCLUDE_NONE // GLFW including OpenGL headers causes ambiguity or multiple definition errors.
#endif // GLFW_INCLUDE_NONE
#include <GLFW/glfw3.h>
#include "ImGui/imgui.h"
#include "ImGui/implot.h"
#include "ImGui/implot_internal.h"
#include "ImGui/imgui_impl_glfw.h"
#include "ImGui/imgui_impl_opengl3.h"
#include <iostream>
static bool changedWinSize = false;
static int winWidth = 1280;
static int winHeight = 720;
static void MainLoop(GLFWwindow* window)
{
// Start the Dear ImGui frame
ImGui_ImplOpenGL3_NewFrame();
ImGui_ImplGlfw_NewFrame();
ImGui::NewFrame();
ImGui::SetNextWindowSize({ (float)winWidth, (float)winHeight });
ImGui::SetNextWindowPos({ 0, 0 });
ImGui::Begin("Window", (bool*)0, ImGuiWindowFlags_NoBringToFrontOnFocus | ImGuiWindowFlags_NoCollapse | ImGuiWindowFlags_NoTitleBar | ImGuiWindowFlags_NoMove | ImGuiWindowFlags_NoResize);
static bool fitPlot = false;
if (fitPlot) {
fitPlot = false;
ImPlot::SetNextPlotLimits(-10, 80, -3, 20, ImGuiCond_Always);
}
ImPlotStyle& style = ImPlot::GetStyle();
style.PlotDefaultSize.y = ImGui::GetWindowSize().y / 2 - 45 + (style.PlotDefaultSize.y = ImGui::GetWindowSize().y / 3 - 30) / 2 - 60;
style.PlotDefaultSize.x = ImGui::GetWindowSize().x - 150;
if (ImPlot::BeginPlot("Sim", "Range (m)", "Height (m)", { 0, 0 }, ImPlotFlags_NoLegend | ImPlotFlags_Equal)) {
if (ImPlot::BeginItem("Sim"))
{
if (ImPlot::FitThisFrame() || changedWinSize) {
fitPlot = true;
}
}
ImPlot::EndPlot();
}
ImGui::End();
changedWinSize = false;
// Rendering
ImGui::Render();
glClearColor(0.45f, 0.55f, 0.60f, 1.00f);
glClear(GL_COLOR_BUFFER_BIT);
ImGui_ImplOpenGL3_RenderDrawData(ImGui::GetDrawData());
glfwSwapBuffers(window);
}
static void window_size_callback(GLFWwindow* window, int width, int height)
{
changedWinSize = true;
winWidth = width;
winHeight = height;
MainLoop(window);
}
int main(int argc, char* argv[])
{
const char* glslVersion = "#version 460";
glfwInit();
// Create window with graphics context
GLFWwindow* window = glfwCreateWindow(1280, 720, "Program", NULL, NULL);
glfwMakeContextCurrent(window);
glfwSwapInterval(0); // vsync disabled
//Initialize GLEW + ImGui
glewInit();
// Setup Dear ImGui context
IMGUI_CHECKVERSION();
ImGui::CreateContext();
ImPlot::CreateContext();
// Setup Dear ImGui style
ImGui::StyleColorsLight();
// Setup Platform/Renderer backends
ImGui_ImplGlfw_InitForOpenGL(window, true);
ImGui_ImplOpenGL3_Init(glslVersion);
glfwSetWindowSizeCallback(window, window_size_callback);
while (!glfwWindowShouldClose(window))
{
//Events
glfwPollEvents();
MainLoop(window);
}
ImGui_ImplOpenGL3_Shutdown();
ImGui_ImplGlfw_Shutdown();
ImPlot::DestroyContext();
ImGui::DestroyContext();
glfwDestroyWindow(window);
glfwTerminate();
return 0;
}
From this example, the plot x-axis labels begin at -5 and go to 75. When you resize the window (shorten the width), after a pixel or so of shrinking the plot labels display -10 and 80, which is exactly what I want. This correct size only occurs while resizing the window and not letting go. It is only when you let go of the window that it reverts back to the incorrect size. I can't figure out where in my code it goes wrong.
ImPlot::FitThisFrame() returns true when double-clicking on the plot. What is weird is that the plot fits correctly if changedWinSize is true, entering that if statement, but when I double-click to enter it, it doesn't fit the plot correctly.

ImGui with the glad openGL loader throws segmentation fault (core dumped)

I am new to the ImGui library and recently I've been trying out the included examples. Everything worked like a charm until I changed the include (and functions) from gl3w to glad (the loader I would like to use). The moment I swapped between the two loaders I got a segmentation fault inside the imgui_impl_glfw_gl3.cpp file. I found a post which suggested that this may happen because of some functions failing to "bind" and producing null pointers.
I have located the error at line 216 of imgui_impl_glfw_gl3.cpp.
This is the code on line 216:
glGetIntegerv(GL_TEXTURE_BINDING_2D, &last_texture);
I have also changed the include file in imgui_impl_glfw_gl3.cpp from gl3w to glad with no results.
This is the main function I am executing (it's the basic OpenGL 3 example of ImGui, using glad):
#include "gui/imgui.h"
#include "gui/imgui_impl_glfw_gl3.h"
#include <stdio.h>
#include <glad/glad.h> // This example is using gl3w to access OpenGL functions (because it is small). You may use glew/glad/glLoadGen/etc. whatever already works for you.
#include <GLFW/glfw3.h>
static void error_callback(int error, const char* description)
{
fprintf(stderr, "Error %d: %s\n", error, description);
}
int main(int, char**)
{
// Setup window
glfwSetErrorCallback(error_callback);
if (!glfwInit())
return 1;
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
GLFWwindow* window = glfwCreateWindow(1280, 720, "ImGui OpenGL3 example", NULL, NULL);
glfwMakeContextCurrent(window);
glfwSwapInterval(1); // Enable vsync
glfwInit();
// Setup ImGui binding
ImGui_ImplGlfwGL3_Init(window, true);
// Setup style
//ImGui::StyleColorsDark();
ImGui::StyleColorsClassic();
bool show_demo_window = true;
bool show_another_window = false;
bool algo = true;
ImVec4 clear_color = ImVec4(0.45f, 0.55f, 0.60f, 1.00f);
// Main loop
while (!glfwWindowShouldClose(window))
{
glfwPollEvents();
ImGui_ImplGlfwGL3_NewFrame();
// 1. Show a simple window.
// Tip: if we don't call ImGui::Begin()/ImGui::End() the widgets automatically appears in a window called "Debug".
{
static float f = 0.0f;
static int counter = 0;
ImGui::Text("Hello, world!"); // Display some text (you can use a format string too)
ImGui::SliderFloat("float", &f, 0.0f, 1.0f); // Edit 1 float using a slider from 0.0f to 1.0f
ImGui::ColorEdit3("COLORINES", (float*)&clear_color); // Edit 3 floats representing a color
ImGui::Checkbox("Demo Window", &show_demo_window); // Edit bools storing our windows open/close state
ImGui::Checkbox("Booleanooooo", &algo);
ImGui::Checkbox("Another Window", &show_another_window);
if (ImGui::Button("Button")) // Buttons return true when clicked (NB: most widgets return true when edited/activated)
counter++;
ImGui::SameLine();
ImGui::Text("counter = %d", counter);
ImGui::Text("pues se ve que hay texto: %d", algo);
ImGui::Text("Application average %.3f ms/frame (%.1f FPS)", 1000.0f / ImGui::GetIO().Framerate, ImGui::GetIO().Framerate);
}
{
ImGui::Begin("VENTANA WAPA");
ImGui::Text("POS SA QUEDAO BUENA VENTANA");
static float yee = 0.0f;
ImGui::SliderFloat("lel", &yee,1.0f,0.5f);
ImGui::End();
}
// 2. Show another simple window. In most cases you will use an explicit Begin/End pair to name your windows.
if (show_another_window)
{
ImGui::Begin("Another Window", &show_another_window);
ImGui::Text("Hello from another window!");
if (ImGui::Button("Close Me"))
show_another_window = false;
ImGui::End();
}
// 3. Show the ImGui demo window. Most of the sample code is in ImGui::ShowDemoWindow(). Read its code to learn more about Dear ImGui!
if (show_demo_window)
{
ImGui::SetNextWindowPos(ImVec2(650, 20), ImGuiCond_FirstUseEver); // Normally user code doesn't need/want to call this because positions are saved in .ini file anyway. Here we just want to make the demo initial state a bit more friendly!
ImGui::ShowDemoWindow(&show_demo_window);
}
// Rendering
int display_w, display_h;
glfwGetFramebufferSize(window, &display_w, &display_h);
glViewport(0, 0, display_w, display_h);
glClearColor(clear_color.x, clear_color.y, clear_color.z, clear_color.w);
glClear(GL_COLOR_BUFFER_BIT);
ImGui::Render();
glfwSwapBuffers(window);
}
// Cleanup
//ImGui_ImplGlfwGL3_Shutdown();
glfwTerminate();
return 0;
}
I have no clue why this is happening, and I'm pretty new to OpenGL and ImGui, so any ideas? :(
Glad & gl3w are both extension loader libraries. They generally need to be initialized on a current GL context before use.
The original code called gl3wInit(). Yours is missing any sort of glad init.
Make sure you initialize glad (gladLoadGLLoader((GLADloadproc) glfwGetProcAddress)) after glfwMakeContextCurrent() and before you call any OpenGL functions.
Otherwise all the OpenGL function pointers glad declares will remain NULL. Trying to call NULL function pointers generally doesn't go well for a process.
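For illustration, a minimal sketch of the initialization order, built from the calls already shown in the question and answer (the error handling is just an example):
glfwMakeContextCurrent(window);   // the GL context must be current before loading
if (!gladLoadGLLoader((GLADloadproc) glfwGetProcAddress))
{
    fprintf(stderr, "Failed to initialize glad\n");
    return 1;
}
// Only after this point is it safe to call OpenGL functions
// (including the GL calls made by ImGui's GL3 backend).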

Gtkglext and gtkglarea gtk3 windows does not work

I need to get gtk3 working with vtk6.
I've just built (after some effort) gtkglext for gtk3.
Later I built vtkmm1.2 with the win32 patches, and I do not see the vtk widget.
Because of this, I discovered I can't see any drawing in the gtkglext examples.
On the other hand, I've tried the excellent gtkglarea example (the new, built-in GTK one) with plain OpenGL from https://www.bassi.io/articles/2015/02/17/using-opengl-with-gtk/
and I get no drawing because of "fb setup not supported".
It seems the framebuffer is complete, checking gtkglarea.c.
How can I solve this?
Has anyone been able to draw something in GTK with OpenGL on Windows?
This is the simplest code I've made, and I'm getting the mentioned error, after adapting the signals to the new built-in GtkGLArea ("render" instead of "draw"):
#include <math.h>
#include <gtk/gtk.h>
//#include <gtkgl/gtkglarea.h> //LUCIANO, gtkglarea is built in now
#include <gtk/gtkglarea.h>
#include <GL/gl.h>
gint init(GtkWidget *widget)
{
/* OpenGL functions can be called only if make_current returns true */
gtk_gl_area_make_current(GTK_GL_AREA(widget));
{
GtkAllocation allocation;
gtk_widget_get_allocation (widget, &allocation);
glViewport(0, 0, allocation.width, allocation.height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0,100, 100,0, -1,1);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
}
return TRUE;
}
/* When widget is exposed it's contents are redrawn. */
gboolean draw (GtkWidget *widget, cairo_t *cr, gpointer data)
{
/* OpenGL functions can be called only if make_current returns true */
gtk_gl_area_make_current(GTK_GL_AREA(widget));
{
/* Draw simple triangle */
glClearColor(0,0,0,1);
glClear(GL_COLOR_BUFFER_BIT);
glColor3f(1,1,1);
glBegin(GL_TRIANGLES);
glVertex2f(10,10);
glVertex2f(10,90);
glVertex2f(90,90);
glEnd();
/* Swap backbuffer to front */
//ggla_area_swap_buffers(GGLA_AREA(widget));
gtk_gl_area_attach_buffers(GTK_GL_AREA(widget));
}
return TRUE;
}
/* When glarea widget size changes, viewport size is set to match the new size */
gint reshape(GtkWidget *widget, GdkEventConfigure *event)
{
/* OpenGL functions can be called only if make_current returns true */
//if (ggla_area_make_current(GTK_GL_AREA(widget)))
gtk_gl_area_make_current(GTK_GL_AREA(widget));
{
GtkAllocation allocation;
gtk_widget_get_allocation (widget, &allocation);
glViewport(0, 0, allocation.width, allocation.height);
}
return TRUE;
}
int main(int argc, char **argv)
{
GtkWidget *window,*glarea;
gchar *info_str;
/* Attribute list for gtkglarea widget. Specifies a
list of Boolean attributes and enum/integer
attribute/value pairs. The last attribute must be
GGLA_NONE. See glXChooseVisual manpage for further
explanation.
*/
int attrlist[] = {
// GGLA_RGBA,
// GGLA_RED_SIZE,1,
// GGLA_GREEN_SIZE,1,
// GGLA_BLUE_SIZE,1,
// GGLA_DOUBLEBUFFER,
// GGLA_NONE
};
/* initialize gtk */
gtk_init(&argc, &argv);
/* Check if OpenGL is supported. */
// if (ggla_query() == FALSE) {
// g_print("OpenGL not supported\n");
// return 0;
// }
/* Create new top level window. */
window = gtk_window_new( GTK_WINDOW_TOPLEVEL);
gtk_window_set_title(GTK_WINDOW(window), "Simple");
gtk_container_set_border_width(GTK_CONTAINER(window), 10);
/* Quit form main if got delete event */
g_signal_connect(G_OBJECT(window), "delete-event",
G_CALLBACK(gtk_main_quit), NULL);
/* Create new OpenGL widget. */
//glarea = GTK_WIDGET(gtk_gl_area_new(attrlist));
glarea = gtk_gl_area_new();
/* Events for widget must be set before X Window is created */
gtk_widget_set_events(GTK_WIDGET(glarea),
GDK_EXPOSURE_MASK|
GDK_BUTTON_PRESS_MASK);
init(glarea);
/* Connect signal handlers */
/* Redraw image when exposed. */
// g_signal_connect(G_OBJECT(glarea), "draw",
// G_CALLBACK(draw), NULL);
// /* When window is resized viewport needs to be resized also. */
// g_signal_connect(G_OBJECT(glarea), "configure-event",
// G_CALLBACK(reshape), NULL);
// /* Do initialization when widget has been realized. */
// g_signal_connect(G_OBJECT(glarea), "realize",
// G_CALLBACK(init), NULL);
g_signal_connect(G_OBJECT(glarea), "render",
G_CALLBACK(draw), NULL);
/* When window is resized viewport needs to be resized also. */
// g_signal_connect(G_OBJECT(glarea), "resize",
// G_CALLBACK(reshape), NULL);
/* Do initialization when widget has been realized. */
// g_signal_connect(G_OBJECT(glarea), "create-context",
// G_CALLBACK(init), NULL);
/* set minimum size */
gtk_widget_set_size_request(GTK_WIDGET(glarea), 100,100);
/* put glarea into window and show it all */
gtk_container_add(GTK_CONTAINER(window),GTK_WIDGET(glarea));
gtk_widget_show(GTK_WIDGET(glarea));
gtk_widget_show(GTK_WIDGET(window));
/* vendor dependent version info string */
// info_str = ggla_get_info();
// g_print(info_str);
// g_free(info_str);
gtk_main();
return 0;
}

Make transparency not show what is behind the window in opengl with c++

I am making a two-dimensional image in OpenGL with C++, and am running into an interesting issue. Whenever I try to draw a partially transparent polygon on my image, it makes the window itself partially transparent where the polygon is. For example, I can see whatever is behind my window (e.g. my code) when I am running the program (which I don't want). I can also see the image behind the polygon (which I do want). Is there any way I can turn the "transparent window" behavior off? I have included what I feel to be the relevant portions of the code below:
glClearColor(0.0f, 0.0f, 0.0f, 0.0f); // I have tried 1.0f for the alpha value too
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glHint(GL_LINE_SMOOTH_HINT, GL_NICEST);
glHint(GL_POLYGON_SMOOTH_HINT, GL_NICEST);
glEnable(GL_BLEND);
glEnable(GL_LINE_SMOOTH);
glEnable(GL_POLYGON_SMOOTH);
glPolygonMode (GL_FRONT_AND_BACK, GL_FILL);
glHint(GL_POINT_SMOOTH_HINT, GL_FASTEST);
glDisable(GL_POINT_SMOOTH);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glMatrixMode (GL_MODELVIEW);
glLoadIdentity ();
// other code to draw my opaque "background" object
// Draw my partially transparent quad (note: this is where the window itself becomes partially transparent)
glBegin(GL_QUADS); // Begin drawing quads
glColor4f(1.0,1.0,1.0,0.5); // Make a white quad with .5 alpha
glVertex2f(-0.5, 0.5);
glVertex2f(0.5, 0.5);
glVertex2f(0.5, -0.5);
glVertex2f(-0.5, -0.5);
glEnd();
Other relevant information:
I am running CentOS 6
I am fairly new to OpenGL and am working on code inherited from a prior developer, so I could be missing something trivial
It is using the X Window System
Here is the X Window creation code for further debugging; the problem is likely here rather than in the OpenGL code above.
/* The simplest possible Linux OpenGL program? Maybe...
Modification for creating a RGBA window (transparency with compositors)
by Wolfgang 'datenwolf' Draxinger
(c) 2002 by FTB. See me in comp.graphics.api.opengl
(c) 2011 Wolfgang Draxinger. See me in comp.graphics.api.opengl and on StackOverflow
License agreement: This source code is provided "as is". You
can use this source code however you want for your own personal
use. If you give this source code to anybody else then you must
leave this message in it.
--
<\___/>
/ O O \
\_____/ FTB.
--
datenwolf
------------------------------------------------------------------------*/
static void createTheWindow() {
XEvent event;
int x, y, attr_mask;
XSizeHints hints;
XWMHints *StartupState;
XTextProperty textprop;
XSetWindowAttributes attr;
static char *title = "Fix me";
/* Connect to the X server */
Xdisplay = XOpenDisplay(NULL);
if (!Xdisplay)
{
fatalError("Couldn't connect to X server\n");
}
Xscreen = DefaultScreen(Xdisplay);
Xroot = RootWindow(Xdisplay, Xscreen) ;
fbconfigs = glXChooseFBConfig(Xdisplay, Xscreen, VisData, &numfbconfigs);
for (int i = 0; i < numfbconfigs; i++)
{
visual = (XVisualInfo_CPP*) glXGetVisualFromFBConfig(Xdisplay,
fbconfigs[i]);
if (!visual)
continue;
pictFormat = XRenderFindVisualFormat(Xdisplay, visual->visual);
if (!pictFormat)
continue;
if (pictFormat->direct.alphaMask > 0)
{
fbconfig = fbconfigs[i];
break;
}
}
/* Create a colormap - only needed on some X clients, eg. IRIX */
cmap = XCreateColormap(Xdisplay, Xroot, visual->visual, AllocNone);
/* Prepare the attributes for our window */
attr.colormap = cmap;
attr.border_pixel = 0;
attr.event_mask = StructureNotifyMask | EnterWindowMask | LeaveWindowMask
| ExposureMask | ButtonPressMask | ButtonReleaseMask
| OwnerGrabButtonMask | KeyPressMask | KeyReleaseMask;
attr.background_pixmap = None;
attr_mask = CWBackPixmap | CWColormap | CWBorderPixel | CWEventMask; /* What's in the attr data */
width = DisplayWidth(Xdisplay, DefaultScreen(Xdisplay)) ;
height = DisplayHeight(Xdisplay, DefaultScreen(Xdisplay)) ;
x = width / 2, y = height / 2;
// x=0, y=10;
/* Create the window */
attr.do_not_propagate_mask = NoEventMask;
WindowHandle = XCreateWindow(Xdisplay, /* Screen */
Xroot, /* Parent */
x, y, width, height,/* Position */
1,/* Border */
visual->depth,/* Color depth*/
InputOutput,/* klass */
visual->visual,/* Visual */
attr_mask, &attr);/* Attributes*/
if (!WindowHandle)
{
fatalError("Couldn't create the window\n");
}
/* Configure it... (ok, ok, this next bit isn't "minimal") */
textprop.value = (unsigned char*) title;
textprop.encoding = XA_STRING;
textprop.format = 8;
textprop.nitems = strlen(title);
hints.x = x;
hints.y = y;
hints.width = width;
hints.height = height;
hints.flags = USPosition | USSize;
StartupState = XAllocWMHints();
StartupState->initial_state = NormalState;
StartupState->flags = StateHint;
XSetWMProperties(Xdisplay, WindowHandle, &textprop, &textprop,/* Window title/icon title*/
NULL, 0,/* Argv[], argc for program*/
&hints, /* Start position/size*/
StartupState,/* Iconised/not flag */
NULL);
XFree(StartupState);
/* Open it, wait for it to appear */
int event_base, error_base = 0;
XMapWindow(Xdisplay, WindowHandle);
// }
XIfEvent(Xdisplay, &event, WaitForMapNotify, (char*) &WindowHandle);
/* Set the kill atom so we get a message when the user tries to close the window */
if ((del_atom = XInternAtom(Xdisplay, "WM_DELETE_WINDOW", 0)) != None)
{
XSetWMProtocols(Xdisplay, WindowHandle, &del_atom, 1);
}
}
Here are the settings for VisData:
static int VisData[] = { GLX_RENDER_TYPE, GLX_RGBA_BIT, GLX_DRAWABLE_TYPE,
GLX_WINDOW_BIT, GLX_DOUBLEBUFFER, True, GLX_RED_SIZE, 1, GLX_GREEN_SIZE,
1, GLX_BLUE_SIZE, 1, GLX_ALPHA_SIZE, 1, GLX_DEPTH_SIZE, 1,
None
};
Here is where the rendering context is created:
static void createTheRenderContext() {
/* See if we can do OpenGL on this visual */
int dummy;
if (!glXQueryExtension(Xdisplay, &dummy, &dummy))
{
fatalError("OpenGL not supported by X server\n");
}
/* Create the OpenGL rendering context */
RenderContext = glXCreateNewContext(Xdisplay, fbconfig, GLX_RGBA_TYPE, 0,
True);
if (!RenderContext)
{
fatalError("Failed to create a GL context\n");
}
GLXWindowHandle = glXCreateWindow(Xdisplay, fbconfig, WindowHandle, NULL);
/* Make it current */
if (!glXMakeContextCurrent(Xdisplay, GLXWindowHandle, GLXWindowHandle,
RenderContext))
{
fatalError("glXMakeCurrent failed for window\n");
}
}
What ratchet freak suggested (the Aero Glass effect in Windows) does not happen by accident; one has to manually enable DWM transparency for it to happen.
However, in X11/GLX it is perfectly possible to end up with a visual mode that has an alpha channel by default. If you want to reliably get a window that does or does not have an alpha channel, the code gets a bit more complex than what most toolkits do.
The code you're using looks strikingly familiar. To be specific, it seems to originate from a code sample I wrote about how to create a transparent window (you see where this is going), namely this code:
https://github.com/datenwolf/codesamples/blob/master/samples/OpenGL/x11argb_opengl/x11argb_opengl.c
The key sequence is this:
fbconfigs = glXChooseFBConfig(Xdisplay, Xscreen, VisData, &numfbconfigs);
fbconfig = 0;
for(int i = 0; i<numfbconfigs; i++) {
visual = (XVisualInfo*) glXGetVisualFromFBConfig(Xdisplay, fbconfigs[i]);
if(!visual)
continue;
pict_format = XRenderFindVisualFormat(Xdisplay, visual->visual);
if(!pict_format)
continue;
fbconfig = fbconfigs[i];
if(pict_format->direct.alphaMask > 0) {
break;
}
}
What this does is, it selects an X11 Visual that matches one of the previously selected FBConfigs that also contains an alpha mask.
If I had to make a bet, I suspect that the VisData array you passed to glXChooseFBConfig does not specify an alpha channel. So what happens is that you may end up with a window that has an X11 alpha mask, but not an alpha channel accessible to OpenGL.
Since I never intended that code to be used for windows that don't have an alpha channel, it only does what was originally intended if VisData does select for an alpha channel.
You have now two options:
implement a complementary test if(pict_format->direct.alphaMask == 0 && no_alpha_in(VisData)) break;
select for an alpha channel in VisData and clear the alpha channel to 1.0 with OpenGL glClearColor(…,…,…,1.0f); (a minimal sketch of this option follows below)
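Here is a rough sketch of what the second option could look like in the render loop, using only calls that already appear in the question (this is an illustration, not the original program):
/* Option 2, sketched: keep an alpha-capable FBConfig, but clear the
   destination alpha to fully opaque before drawing each frame. */
glClearColor(0.0f, 0.0f, 0.0f, 1.0f);   /* alpha = 1.0 -> opaque window background */
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
/* ...then draw the opaque background and the blended quad as before. */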
This is not an OpenGL problem, but rather a matter of the kind of window you are creating. I suspect you are running a window manager that supports transparency effects. Either way, what is probably happening is that when you render the transparent poly, the window canvas ends up with some alpha, and your window manager assumes that you want the background transparent. Turn off all advanced effects of your window manager to check.
I am not familiar with window creation code using xlib, but it probably has to do with the kind of window you are creating.

Use OpenGL Bitmap Fonts to put text onto the screen

I am now learning from the OpenGL NeHe productions. When I came to Lesson 13, Bitmap Fonts, I encountered a problem. I wrote my code using glut, and my PC runs Windows 7. The code builds on Microsoft Visual Studio 2008 without any errors, but nothing appears in the window. I don't know what is wrong. What may cause this problem in general? Did I miss some settings?
Here is my code:
#pragma comment(lib,"GLAUX.LIB")
#include <GL/glut.h>
#include <windows.h>
#include <GL/glaux.h>
#include <stdio.h>
#include <stdarg.h>
#include <math.h>
HDC hDC = NULL;
GLuint base;//the first display list we create
GLfloat cnt1,cnt2;//move on the screen or set color
GLvoid buildFont() // Build Our Bitmap Font
{
HFONT font; // Windows Font ID
HFONT oldfont; // Used For Good House Keeping
base = glGenLists(96); // Storage For 96 Characters
font = CreateFont(
-24, // Height Of Font
0, // Width Of Font
0, // Angle Of Escapement
0, // Orientation Angle
FW_BOLD, // Font Weight
FALSE, // Italic
FALSE, // Underline
FALSE, // Strikeout
ANSI_CHARSET, // Character Set Identifier
OUT_TT_PRECIS, // Output Precision
CLIP_DEFAULT_PRECIS, // Clipping Precision
ANTIALIASED_QUALITY, // Output Quality
FF_DONTCARE|DEFAULT_PITCH, // Family And Pitch
"Times New Roman"); // Font Name
oldfont = (HFONT)SelectObject(hDC, font); // Selects The Font We Want
wglUseFontBitmaps(hDC, 32, 96, base); // Builds 96 Characters Starting At Character 32
SelectObject(hDC, oldfont); // Selects The Font We Want
DeleteObject(font); // Delete The Font
}
void killFont()
{
glDeleteLists(base,96);
}
void glPrint(const char *fmt, ...) // Custom GL "Print" Routine
{
char text[256]; // Holds Our String
va_list ap; // Pointer To List Of Arguments
if (fmt == NULL) // If There's No Text
{
printf("the string to print is NULL!\n");
return; // Do Nothing
}
va_start(ap, fmt); // Parses The String For Variables
vsprintf(text, fmt, ap); // And Converts Symbols To Actual Numbers
va_end(ap); // Results Are Stored In Text
glPushAttrib(GL_LIST_BIT); // Pushes The Display List Bits
glListBase(base - 32); // Sets The Base Character to 32
glCallLists(strlen(text), GL_UNSIGNED_BYTE, text); // Draws The Display List Text
glPopAttrib(); // Pops The Display List Bits
}
int init(GLvoid) // All Setup For OpenGL Goes Here
{
glShadeModel(GL_SMOOTH); // Enable Smooth Shading
glClearColor(0.0f, 0.0f, 0.0f, 0.5f); // Black Background
glClearDepth(1.0f); // Depth Buffer Setup
glEnable(GL_DEPTH_TEST); // Enables Depth Testing
glDepthFunc(GL_LEQUAL); // The Type Of Depth Testing To Do
glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST); // Really Nice Perspective Calculations
buildFont(); // Build The Font
return TRUE; // Initialization Went OK
}
void display() // Here's Where We Do All The Drawing
{
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // Clear Screen And Depth Buffer
glLoadIdentity(); // Reset The Current Modelview Matrix
glTranslatef(0.0f,0.0f,-1.0f); // Move One Unit Into The Screen
// Pulsing Colors Based On Text Position
glColor3f(1.0f*float(cos(cnt1)),1.0f*float(sin(cnt2)),1.0f-0.5f*float(cos(cnt1+cnt2)));
// Position The Text On The Screen
glRasterPos2f(-0.45f+0.05f*float(cos(cnt1)), 0.32f*float(sin(cnt2)));
glPrint("Active OpenGL Text With NeHe - %7.2f", cnt1); // Print GL Text To The Screen
glutSwapBuffers();// Everything Went OK
}
void spinDisplay()
{
cnt1 += 0.051f;
cnt2 += 0.005f;
printf("cnt1: %f\n",cnt1);
printf("cnt2: %f\n",cnt2);
}
void reshape(int w,int h)
{
if (0 == h)
h = 1;
glViewport(0,0,(GLsizei)w,(GLsizei)h);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(60.0f,(GLfloat)w / (GLfloat)h,1,100);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
}
int main(int argc,char** argv)
{
glutInit(&argc,argv);
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
glutInitWindowSize(600,600);
glutInitWindowPosition(100,100);
glutCreateWindow("Bitmap Fonts");
init();
glutDisplayFunc(display);
glutReshapeFunc(reshape);
glutIdleFunc(spinDisplay);
//glutKeyboardFunc(keyboard);
glutMainLoop();
killFont();
return 0;
}
(Screenshot of the result on Visual Studio 2008 omitted.)
You can use glRasterPos and glutBitmapCharacter; both are available in glut:
glRasterPos3f( 30.0f , 25.0f ,0.0f );
glutBitmapCharacter( GLUT_BITMAP_HELVETICA_18 , 'A');
Or use glutBitmapString (supported in current freeglut):
glRasterPos3f(30.0f , 20.0f ,0.0f);
glutBitmapString( GLUT_BITMAP_HELVETICA_18 , "Hello World!" );
If you can't use glutBitmapString to print a string, you can use a loop:
char *a="Hello World!";
glRasterPos3f( 30.0f , 25.0f ,0.0f );
for (int i = 0; a[i] != '\0'; i++)
glutBitmapCharacter( GLUT_BITMAP_HELVETICA_18 , a[i]);
There isn't anything wrong with your code.
I'm taking a class in OpenGL using glut. We have encountered a problem in class where the lab computers correctly display the characters in the correct colors, but a few of the students' laptops will only display the characters in black. All the machines are running Windows 7, so we suspect it has to do with which version of OpenGL is on the machine.
Anyway, change your background color to white (or something that will easily show black text). You should see your text if your positioning is correct.
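As a quick sanity check, something along these lines in the question's own init() and display() would do it (only the changed calls are shown):
// In init(): use a white background instead of black.
glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
// In display(): force the text color to black before positioning/printing.
glColor3f(0.0f, 0.0f, 0.0f);
glRasterPos2f(-0.45f, 0.0f);
glPrint("Active OpenGL Text With NeHe - %7.2f", cnt1);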
GLUT is outdated and no longer maintained; maybe this is the reason for the problem on Windows 7. The last GLUT version (3.7) dates back to August 1998.
You can try freeglut, a fully compatible alternative to GLUT, to get a 100% replacement without changing anything in your source.
I've just tried the NeHe Lesson 13 project (based on GLUT) on:
Vista x64 SP2 with MS Visual Studio 2005 SP2
Windows 7 (64bit) with MS Visual Studio 2010 SP1 (32bit debug app)
Both of them work fine! But under Win7 with MS VS2010 the 64-bit debug version cannot be built because of some unresolved externals.
Did you build a 32-bit or a 64-bit version?
Have you already tried other NeHe downloads without GLUT? (http://nehe.gamedev.net/data/lessons/vc/lesson13.zip)
You can try updating your graphics driver and switching the Windows Aero theme on/off; it often helps, because of a different pixel format descriptor.
I hope that helps.