Why does this C++ / OpenGL program run twice?

After most of the curriculum, they spring C++ on us for the senior year. Sigh. So I'm underwater trying to learn it and OpenGL, the latter being the actual subject of the class.
Please, why does this thing run twice? This assignment has already been turned in and graded, but I just can't find any good online guide to OpenGL. Thanks for any thoughts.
#ifdef __APPLE__
#include <GLUT/glut.h>
#include <OpenGL/gl.h>
#else
#include <GL/glut.h>
#endif
#include <stdlib.h>
#ifdef _WIN32
#include <windows.h> // needed for Sleep()
#endif
int width = 800, height = 600;
float xmin = -(width / 2), ymin = -(height / 2), xmax = width / 2, ymax = height / 2;
GLubyte bitmap[72] = { 0x00, 0x00, 0x00,
0x40, 0x00, 0x02,
0x20, 0x00, 0x04,
0x10, 0x38, 0x08,
0x09, 0x63, 0x10,
0x06, 0x00, 0xA0,
0x08, 0x00, 0x20,
0x10, 0x00, 0x10,
0x10, 0x00, 0x10,
0x10, 0x00, 0x08,
0x20, 0x00, 0x08,
0x20, 0x10, 0x08,
0x20, 0x18, 0x08,
0x10, 0x14, 0x08,
0x10, 0x12, 0x10,
0x10, 0x11, 0x10,
0x08, 0x10, 0x20,
0x04, 0x10, 0x40,
0x01, 0x87, 0x00,
0x00, 0x78, 0x00,
0x00, 0x00, 0x00,
0x00, 0x00, 0x00,
0x00, 0x00, 0x00,
0x00, 0x00, 0x00
};
void init(void) {
// Set display-window color to blue.
glClearColor(0.0, 0.0, 1.0, 0.0);
// Set projection parameters.
glMatrixMode(GL_PROJECTION);
gluOrtho2D(xmin,xmax,ymin,ymax);
// Clear display window.
glClear(GL_COLOR_BUFFER_BIT);
glutSwapBuffers();
}
// Window reshape function
void winReshapeFcn(GLint newWidth, GLint newHeight) {
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluOrtho2D(-(GLdouble)width / 2, (GLdouble)width / 2, -(GLdouble)height / 2, (GLdouble)height / 2);
glClear(GL_COLOR_BUFFER_BIT);
}
void drawText() {
int x = (int) xmin + 20, y = (int) ymax - 20, count = 0;
char what [] = { 'R', 'e', 'c', 't', 'a', 'n', 'g', 'l', 'e', 's' };
float color = 1.0;
glRasterPos2i(x, y);
do {
glColor3f(color,color,color);
color = color - 0.1;
glutBitmapCharacter(GLUT_BITMAP_9_BY_15, what[count]);
y = y - 20;
glRasterPos2i(x, y);
count = count + 1;
} while (count <= 9);
}
void drawRectangles() {
int h = (int) ymax, x1 = -h, y1 = h, x2 = h, y2 = -h, count = 0, delta, factor = 5;
do {
glBegin(GL_LINES);
glVertex2i(x1,h);
glVertex2i(h,y1);
glVertex2i(h,y1);
glVertex2i(x2,-h);
glVertex2i(x2,-h);
glVertex2i(-h,y2);
glVertex2i(-h,y2);
glVertex2i(x1,h);
glEnd();
h = h - factor; delta = factor * count;
x1 = -h + delta; y1 = h - delta; x2 = h - delta; y2 = -h + delta;
count = count + 1;
} while (x1 < h);
}
void drawBitmaps() {
int count = 0;
GLfloat xorigin = xmin + (xmax - ymax) / 2.0;
// Needed for reading from memory. 1 indicates byte alignment
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
// Center the bitmap image
glRasterPos2i(0, 0);
do {
glBitmap(24.0, 24.0, xorigin, ymax, 0.0, 24.0, bitmap);
count = count + 24;
Sleep(150);
glutSwapBuffers();
} while ((count < width) && (count < height));
}
void displayFunction(void) {
// Clear display window.
glClear(GL_COLOR_BUFFER_BIT);
drawText();
// Set the drawing color to yellow (or change it to your choice)
glColor3f(1.0, 1.0, 0.0);
drawRectangles();
drawBitmaps();
// Execute OpenGL functions
glFlush();
}
int main(int argc, char** argv) {
// Initialize GLUT.
glutInit(&argc, argv);
// Set display mode.
glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
// Set top-left display-window position.
glutInitWindowPosition((glutGet(GLUT_SCREEN_WIDTH) - width) / 2, (glutGet(GLUT_SCREEN_HEIGHT) - height) / 2);
// Set display-window width and height.
glutInitWindowSize(width, height);
// Create display window.
glutCreateWindow("Michael Powers - Homework 2");
// Execute initialization procedure.
init();
// Send graphics to display window.
glutDisplayFunc(displayFunction);
// Window reshape call
glutReshapeFunc(winReshapeFcn);
// Display everything and wait.
glutMainLoop();
return 0;
}

The GLUT display function displayFunction gets called every time the graphics need to be rendered again. In a real OpenGL app it would be called continuously, driven by a timer. Here it gets called once when the window is opened. But depending on the OS, it may be called multiple times, for example when the window needs to be refreshed because it got activated.
In this code the animation is driven by the Sleep(150) and glutSwapBuffers() calls inside displayFunction(). So the application blocks for the duration of the animation, but the intermediate frames are still shown because of the glutSwapBuffers() calls.
Normally a display function should execute quickly (and never block/wait), and call glFlush() and glutSwapBuffers() only once at the end.
A better implementation would be: the state of the animation (i.e. the number of clock icons) is stored in a global variable int state = 0. displayFunction() always draws that number of clocks without waiting, and then exits. Before starting the main loop, a timer is registered with glutTimerFunc, with a callback that increments state and then calls glutPostRedisplay(), which schedules GLUT to call the display function again. That way the app also remains responsive during the animation and can be quit by closing the window.
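A minimal sketch of that pattern (drawClocks and MAX_STATE are placeholder names, not from the original code):
int state = 0;                          // animation state: number of icons drawn so far
void displayFunction(void) {
    glClear(GL_COLOR_BUFFER_BIT);
    drawClocks(state);                  // hypothetical helper: draws `state` icons, returns immediately
    glutSwapBuffers();
}
void timerFunc(int value) {
    if (state < MAX_STATE) {
        state = state + 1;
        glutPostRedisplay();            // ask GLUT to call displayFunction again
        glutTimerFunc(150, timerFunc, 0);  // re-arm the timer for the next step
    }
}
// in main(), before glutMainLoop():
//     glutTimerFunc(150, timerFunc, 0);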

Related

SDL_RenderDrawPoint not working

If I use this code:
SDL_SetRenderDrawColor( renderer.get(), 0x00, 0x00, 0xFF, 0xFF );
SDL_RenderDrawLine( renderer.get(), 0, 480 / 2, 640, 480 / 2 );
//Draw vertical line of yellow dots
SDL_SetRenderDrawColor( renderer.get(), 0xFF, 0xFF, 0x00, 0xFF );
for( int i = 0; i < 480; i += 4 )
{
SDL_RenderDrawPoint( renderer.get(), 640 / 2, i );
}
It will draw the yellow points and the blue line, but if I comment out SDL_RenderDrawLine, it draws nothing. What is wrong?
The same thing happens if I use this code and then, somewhere later in the code (after a while), want to use SDL_RenderDrawPoint or SDL_RenderDrawPoints.
I am using the latest SDL library on Linux Mint.

GL_TEXTURE_RECTANGLE_ARB not working with shader and OS X

I've got an OSX app that uses OpenGL. I'm drawing most of my stuff with textures of the type GL_TEXTURE_2D, and everything works fine as long as I stick to GL_TEXTURE_2D. But I need to have a couple of textures of type GL_TEXTURE_RECTANGLE_ARB.
To create a texture of type GL_TEXTURE_RECTANGLE_ARB I do the following:
// 4x4 RGBA texture data
GLubyte TexData[4*4*4] =
{
0x00, 0x00, 0xff, 0xff, 0x00, 0xff, 0x00, 0xff,
0xff, 0x00, 0x00, 0xff, 0x00, 0x00, 0xff, 0xff,
0x00, 0xff, 0x00, 0xff, 0xff, 0x00, 0x00, 0xff,
0x00, 0x00, 0xff, 0xff, 0x00, 0xff, 0x00, 0xff,
0xff, 0x00, 0x00, 0xff, 0x00, 0x00, 0xff, 0xff,
0x00, 0xff, 0x00, 0xff, 0xff, 0x00, 0x00, 0xff,
0x00, 0x00, 0xff, 0xff, 0x00, 0xff, 0x00, 0xff,
0xff, 0x00, 0x00, 0xff, 0x00, 0x00, 0xff, 0xff,
};
GLuint myArbTexture;
glEnable(GL_TEXTURE_RECTANGLE_ARB);
glGenTextures(1, &myArbTexture);
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, myArbTexture);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameterf( GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MAX_LEVEL, 0 );
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGBA, 4, 4, 0, GL_RGBA, GL_UNSIGNED_BYTE, TexData);
To draw the texture I do the following:
SetupMyShader();
SetupMyMatrices();
glEnable(GL_TEXTURE_RECTANGLE_ARB);
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, myArbTexture);
DrawMyQuads();
My shader is very simple:
void main (void)
{
gl_FragColor = texture2D(Tex, v_texcoords)*u_color;
}
Using the above code, my shader always references the last texture used in:
glBindTexture(GL_TEXTURE_2D, lastTexture)
instead of referencing the texture specified in:
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, myArbTexture);
Some things to make note of:
After every GL call I'm checking for errors with glGetError(), and I don't get any errors.
If I replace GL_TEXTURE_RECTANGLE_ARB with GL_TEXTURE_2D everything works fine
This is not an issue with uv (st) coordinates. When drawing with GL_TEXTURE_2D my uv are 0->1 and with GL_TEXTURE_RECTANGLE_ARB my uv are 0->texWidth or texHeight
I'm running a pretty new Mac that reports GL version 2.1 NVIDIA-10.2.1 310.41.15f01, so it certainly should support GL_TEXTURE_RECTANGLE_ARB. (I would think, anyway.)
I've narrowed down the issue enough to be pretty darn sure that, when rendering, the shader always refers to the previous texture that was bound with GL_TEXTURE_2D. My quad always draws in the right place and with sensible uv coords; it's just that it is referencing the wrong texture.
So, anyone got any guesses what I'm missing? Is there some call that I should be making other than glBindTexture that my shader would need when using GL_TEXTURE_RECTANGLE_ARB?
You will need to update your shader code to use rectangle textures. The uniform needs to be declared as:
uniform sampler2DRect Tex;
and accessed with:
gl_FragColor = texture2DRect(Tex, v_texcoords)*u_color;
Another aspect to keep in mind is that texture coordinates are defined differently for rectangle textures. Instead of the normalized texture coordinates in the range 0.0 to 1.0 used by all other texture types, rectangle textures use non-normalized texture coordinates in the range 0.0 to width and 0.0 to height.
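Putting both changes together, the fragment shader from the question would look roughly like this (Tex, v_texcoords and u_color are the names used in the question; their declarations, and the extension directive, are assumed here):
#extension GL_ARB_texture_rectangle : enable  // may be required, depending on the GLSL version
uniform sampler2DRect Tex;
uniform vec4 u_color;
varying vec2 v_texcoords;  // in the range 0..width and 0..height, not 0..1
void main (void)
{
    gl_FragColor = texture2DRect(Tex, v_texcoords)*u_color;
}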

binary translation for a glPolygonStipple argument

I am trying to learn OpenGL from http://www.glprogramming.com/red/chapter02.html. At that site there is an example of how to use glPolygonStipple. My understanding is that the hexadecimal numbers in the GLubyte arrays translate to binary numbers, which form a bitmap. I was just wondering how exactly the elements in these arrays make these patterns.
Here is the example from the website on this:
#include <Windows.h>
#include <GL/gl.h>
#include <GL/glut.h>
void display(void)
{
GLubyte fly[] = {
0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
0x03, 0x80, 0x01, 0xC0, 0x06, 0xC0, 0x03, 0x60,
0x04, 0x60, 0x06, 0x20, 0x04, 0x30, 0x0C, 0x20,
0x04, 0x18, 0x18, 0x20, 0x04, 0x0C, 0x30, 0x20,
0x04, 0x06, 0x60, 0x20, 0x44, 0x03, 0xC0, 0x22,
0x44, 0x01, 0x80, 0x22, 0x44, 0x01, 0x80, 0x22,
0x44, 0x01, 0x80, 0x22, 0x44, 0x01, 0x80, 0x22,
0x44, 0x01, 0x80, 0x22, 0x44, 0x01, 0x80, 0x22,
0x66, 0x01, 0x80, 0x66, 0x33, 0x01, 0x80, 0xCC,
0x19, 0x81, 0x81, 0x98, 0x0C, 0xC1, 0x83, 0x30,
0x07, 0xe1, 0x87, 0xe0, 0x03, 0x3f, 0xfc, 0xc0,
0x03, 0x31, 0x8c, 0xc0, 0x03, 0x33, 0xcc, 0xc0,
0x06, 0x64, 0x26, 0x60, 0x0c, 0xcc, 0x33, 0x30,
0x18, 0xcc, 0x33, 0x18, 0x10, 0xc4, 0x23, 0x08,
0x10, 0x63, 0xC6, 0x08, 0x10, 0x30, 0x0c, 0x08,
0x10, 0x18, 0x18, 0x08, 0x10, 0x00, 0x00, 0x08};
GLubyte halftone[] = {
0xAA, 0xAA, 0xAA, 0xAA, 0x55, 0x55, 0x55, 0x55,
0xAA, 0xAA, 0xAA, 0xAA, 0x55, 0x55, 0x55, 0x55,
0xAA, 0xAA, 0xAA, 0xAA, 0x55, 0x55, 0x55, 0x55,
0xAA, 0xAA, 0xAA, 0xAA, 0x55, 0x55, 0x55, 0x55,
0xAA, 0xAA, 0xAA, 0xAA, 0x55, 0x55, 0x55, 0x55,
0xAA, 0xAA, 0xAA, 0xAA, 0x55, 0x55, 0x55, 0x55,
0xAA, 0xAA, 0xAA, 0xAA, 0x55, 0x55, 0x55, 0x55,
0xAA, 0xAA, 0xAA, 0xAA, 0x55, 0x55, 0x55, 0x55,
0xAA, 0xAA, 0xAA, 0xAA, 0x55, 0x55, 0x55, 0x55,
0xAA, 0xAA, 0xAA, 0xAA, 0x55, 0x55, 0x55, 0x55,
0xAA, 0xAA, 0xAA, 0xAA, 0x55, 0x55, 0x55, 0x55,
0xAA, 0xAA, 0xAA, 0xAA, 0x55, 0x55, 0x55, 0x55,
0xAA, 0xAA, 0xAA, 0xAA, 0x55, 0x55, 0x55, 0x55,
0xAA, 0xAA, 0xAA, 0xAA, 0x55, 0x55, 0x55, 0x55,
0xAA, 0xAA, 0xAA, 0xAA, 0x55, 0x55, 0x55, 0x55,
0xAA, 0xAA, 0xAA, 0xAA, 0x55, 0x55, 0x55, 0x55};
glClear (GL_COLOR_BUFFER_BIT);
glColor3f (1.0, 1.0, 1.0);
/* draw one solid, unstippled rectangle, */
/* then two stippled rectangles */
glRectf (25.0, 25.0, 125.0, 125.0);
glEnable (GL_POLYGON_STIPPLE);
glPolygonStipple (fly);
glRectf (125.0, 25.0, 225.0, 125.0);
glPolygonStipple (halftone);
glRectf (225.0, 25.0, 325.0, 125.0);
glDisable (GL_POLYGON_STIPPLE);
glFlush ();
}
void init (void)
{
glClearColor (0.0, 0.0, 0.0, 0.0);
glShadeModel (GL_FLAT);
}
void reshape (int w, int h)
{
glViewport (0, 0, (GLsizei) w, (GLsizei) h);
glMatrixMode (GL_PROJECTION);
glLoadIdentity ();
gluOrtho2D (0.0, (GLdouble) w, 0.0, (GLdouble) h);
}
int main(int argc, char** argv)
{
glutInit(&argc, argv);
glutInitDisplayMode (GLUT_SINGLE | GLUT_RGB);
glutInitWindowSize (350, 150);
glutCreateWindow (argv[0]);
init ();
glutDisplayFunc(display);
glutReshapeFunc(reshape);
glutMainLoop();
return 0;
}
Binary is a base 2 number system, which means each digit is a 0 or a 1. This lends itself very well to stipple patterns, because a 0 means "don't draw this pixel", and a 1 means "draw this pixel". The stipple pattern used across a polygon is 2 dimensional, so you have several rows of these 0's and 1's, building up a pattern of pixels.
To be specific, you have 32 rows of 32 binary digits (bits) each.
Unfortunately you can't enter binary numbers into the source code of languages such as C and C++. Hexadecimal is commonly used instead. It's a base 16 number system, so each digit can be 0-9 or A-F (the letters A-F represents decimal values 10-15).
The nice thing about it is that each digit neatly corresponds to a pattern of 4 binary digits (or bits). That makes it very easy to convert. Here's how they correspond:
Hex Binary
0 0000
1 0001
2 0010
3 0011
4 0100
5 0101
6 0110
7 0111
8 1000
9 1001
A 1010
B 1011
C 1100
D 1101
E 1110
F 1111
(If you're not familiar with how numbers are represented in binary, then that might look strange. There should be plenty of tutorials and explanations online though if you want to learn more about the details.)
When you see a hex number such as 0x31, you can firstly ignore the "0x" prefix -- that just indicates that the number is in hexadecimal. To figure out the binary equivalent, just look up the other digits in the table, one at a time, to get the binary equivalent. In this case, it's a 3 followed by a 1, which means the binary pattern is 0011 0001 (without the space).
In a stipple pattern, that means it will leave 2 pixels blank, draw 2 pixels, leave 3 pixels blank, and finally draw 1 pixel.
In the example code you posted, you can see several pairs of hex digits. Each hex pair gives you 8 binary bits (or 1 byte). That means 4 consecutive pairs of hex digits is 32 bits, which is one complete row of the stipple pattern. There are 32 rows in total.
It's worth noting that the example code has slightly confusing formatting. It's showing 8 hex pairs per line of source code. OpenGL doesn't care about that though. It just sees a contiguous array of numbers, which it splits into 32 bits per row.
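To see the conversion in action, here is a small standalone C sketch (not part of the original example) that prints one 32-bit row of the fly pattern as dots and hashes. With the default pixel-store settings (GL_UNPACK_LSB_FIRST is GL_FALSE), the most significant bit of each byte maps to the leftmost pixel:
#include <stdio.h>
int main(void)
{
    /* four consecutive bytes from the fly array above, i.e. one 32-bit row */
    unsigned char row[4] = { 0x03, 0x80, 0x01, 0xC0 };
    int byte, bit;
    for (byte = 0; byte < 4; byte++)
        for (bit = 7; bit >= 0; bit--)
            putchar(((row[byte] >> bit) & 1) ? '#' : '.');
    putchar('\n');   /* prints: ......###..............###...... */
    return 0;
}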

Drawing bitmap fonts with OpenGL, what does glRasterPos2i() do?

This is another one of those "I have a blank screen, please help me fix it" moments.
This example is from The OpenGL Programming Guide, Version 2.1, Page 311-312.
The example is supposed to draw 2 lines of text on the screen.
Part of the problem, I think, is that I don't understand how glRasterPos2i() works. Does it:
A:) Set the position of bitmaps to be drawn in the 3D world in homogeneous / "OpenGL coordinates"
B:) Set the position of bitmaps to be drawn on the screen in pixel coordinates
Here is the code I have so far. You can pretty much ignore the first big lump, which defines the bitmaps.
#include <GL/glut.h>
#include <cstdlib>
#include <iostream>
#include <cstring>
// This first bit is kind of irrelevant: it sets up some fonts in memory as bitmaps
GLubyte space[] = {0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 };
GLubyte letters[][13] = {
{ 0x00, 0x00, 0xc3, 0xc3, 0xc3, 0xc3, 0xff, 0xc3, 0xc3, 0xc3, 0x66, 0xc3, 0x18 },
{ 0x00, 0x00, 0xfe, 0xc7, 0xc3, 0xc3, 0xc7, 0xfe, 0xc7, 0xc3, 0xc3, 0xc7, 0xfe },
{ 0x00, 0x00, 0x7e, 0xe7, 0xc0, 0xc0, 0xc0, 0xc0, 0xc0, 0xc0, 0xc0, 0xe7, 0x7e },
{ 0x00, 0x00, 0xfc, 0xce, 0xc7, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3, 0xc7, 0xce, 0xfc },
{ 0x00, 0x00, 0xff, 0xc0, 0xc0, 0xc0, 0xc0, 0xfc, 0xc0, 0xc0, 0xc0, 0xc0, 0xff },
{ 0x00, 0x00, 0xc0, 0xc0, 0xc0, 0xc0, 0xc0, 0xc0, 0xfc, 0xc0, 0xc0, 0xc0, 0xff },
{ 0x00, 0x00, 0x7e, 0xe7, 0xc3, 0xc3, 0xcf, 0xc0, 0xc0, 0xc0, 0xc0, 0xe7, 0x7e },
{ 0x00, 0x00, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3, 0xff, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3 },
{ 0x00, 0x00, 0x7e, 0x18, 0x18, 0x18, 0x18, 0x18, 0x18, 0x18, 0x18, 0x18, 0x7e },
{ 0x00, 0x00, 0x7c, 0xee, 0xc6, 0x06, 0x06, 0x06, 0x06, 0x06, 0x06, 0x06, 0x06 },
{ 0x00, 0x00, 0xc3, 0xc6, 0xcc, 0xd8, 0xf0, 0xe0, 0xf0, 0xd8, 0xcc, 0xc6, 0xc3 },
{ 0x00, 0x00, 0xff, 0xc0, 0xc0, 0xc0, 0xc0, 0xc0, 0xc0, 0xc0, 0xc0, 0xc0, 0xc0 },
{ 0x00, 0x00, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3, 0xdb, 0xff, 0xff, 0xe7, 0xc3 },
{ 0x00, 0x00, 0xc7, 0xc7, 0xcf, 0xcf, 0xdf, 0xdb, 0xfb, 0xf3, 0xf3, 0xe3, 0xe3 },
{ 0x7e, 0xe7, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3, 0xe7, 0x7e },
{ 0xc0, 0xc0, 0xc0, 0xc0, 0xc0, 0xfe, 0xc0, 0xf3, 0xc7, 0xc3, 0xc3, 0xc7, 0xfe },
{ 0x00, 0x00, 0x3f, 0x6e, 0xdf, 0xdb, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3, 0x66, 0x3c },
{ 0x00, 0x00, 0xc3, 0xc6, 0xcc, 0xd8, 0xf0, 0xfe, 0xc7, 0xc3, 0xc3, 0xc7, 0xfe },
{ 0x00, 0x00, 0x7e, 0xe7, 0x03, 0x03, 0x07, 0x7e, 0xe0, 0xc0, 0xc0, 0xe7, 0x7e },
{ 0x00, 0x00, 0x18, 0x18, 0x18, 0x18, 0x18, 0x18, 0x18, 0x18, 0x18, 0x18, 0xff },
{ 0x00, 0x00, 0x7e, 0xe7, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3 },
{ 0x00, 0x00, 0x18, 0x3c, 0x3c, 0x66, 0x66, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3 },
{ 0x00, 0x00, 0xc3, 0xe7, 0xff, 0xff, 0xdb, 0xdb, 0xc3, 0xc3, 0xc3, 0xc3, 0xc3 },
{ 0x00, 0x00, 0xc3, 0x66, 0x66, 0xc3, 0xc3, 0x18, 0xc3, 0xc3, 0x66, 0x66, 0xc3 },
{ 0x00, 0x00, 0x18, 0x18, 0x18, 0x18, 0x18, 0x18, 0xc3, 0xc3, 0x66, 0x66, 0xc3 },
{ 0x00, 0x00, 0xff, 0xc0, 0xc0, 0x60, 0x30, 0x7e, 0x0c, 0x06, 0x03, 0x03, 0xff }
};
// This is just copying from the book
GLuint fontOffset;
void makeRasterFont()
{
GLuint i, j;
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
fontOffset = glGenLists(128);
for(i = 0, j = 'A'; i < 26; i ++, j ++)
{
glNewList(fontOffset + ' ', GL_COMPILE);
glBitmap(8, 13, 0.0, 2.0, 10.0, 0.0, letters[i]);
glEndList();
}
glNewList(fontOffset + ' ', GL_COMPILE);
glBitmap(8, 13, 0.0, 2.0, 10.0, 0.0, space);
glEndList();
}
void init()
{
glShadeModel(GL_FLAT);
makeRasterFont();
}
void printString(char* s)
{
glPushAttrib(GL_LIST_BIT);
glListBase(fontOffset);
glCallLists(std::strlen(s), GL_UNSIGNED_BYTE, (GLubyte*)s);
glPopAttrib();
}
void display()
{
GLfloat white[3] = {1.0, 1.0, 1.0 };
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glColor3fv(white);
// Print some text on the screen at (20,60) and (20,40)
glRasterPos2i(20, 60);
printString("THE QUICK BROWN FOX JUMPS");
glRasterPos2i(20, 40);
printString("OVER A LAZY DOG");
glFlush();
}
void reshape(int w, int h)
{
// Set the viewport
glViewport(0, 0, (GLsizei)w, (GLsizei)h);
// Set viewing mode
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(45.0, (GLfloat)w / (GLfloat)h, 0.01, 100.0);
glMatrixMode(GL_MODELVIEW);
}
int main(int argc, char** argv)
{
/* Init glut with a single buffer display mode,
* window size, position and title */
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB | GLUT_DEPTH);
glutInitWindowSize(500, 500);
glutInitWindowPosition(100, 100);
glutCreateWindow(argv[0]);
// Call init routine to set OpenGL specific initialization values
init();
// Set callback function
glutDisplayFunc(display);
glutReshapeFunc(reshape);
// Enter main loop
glutMainLoop();
return EXIT_SUCCESS;
}
Sorry for the type of question - I hate just asking "please fix my code", because really I should be able to fix it myself. On this occasion I find myself stuck, basically. Thanks for your time and help.
Solution:
For those interested, to "get it to work", the changes made were:
1: Change gluPerspective to gluOrtho2D(0, width, 0, height).
2: Change glNewList(fontOffset + ' ', GL_COMPILE) to glNewList(fontOffset + j, GL_COMPILE) - not BOTH, just the FIRST ONE IN THE LOOP.
3: Set the glRasterPos2i to be anywhere within the region specified by gluOrtho2D. My width and height are both 500, so I used coordinates (20, 60) and then (20, 40).
You could also have just left it with gluPerspective and used coordinates around (0,0) without specifying any transformations. However, since a bitmap is 2D, I think that is less intuitive.
As to your rendering problem, here's a hint: you don't use j...
In the for loop:
glNewList(fontOffset + ' ', GL_COMPILE);
replace your space with the letter you want.
The glRasterPos function specifies the raster position in object coordinates. Those are passed through the current modelview and projection matrices (at the time of the glRasterPos call) to get the actual raster position in window (viewport) coordinates, to be used for things like glDrawPixels and glBitmap (thus option A). So given your current perspective projection and identity modelview, those (20, 40) coordinates (which are probably meant as pixels) are quite off the screen. If you want to specify the position in pixels (which is usually the case), you need to set up your transformation pipeline accordingly.
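For example, a reshape callback along the lines of the "Solution" above makes one unit equal one pixel, so that glRasterPos2i(20, 40) means 20 pixels from the left and 40 pixels up from the bottom (a sketch, assuming you want the origin in the lower-left corner):
void reshape(int w, int h)
{
    glViewport(0, 0, (GLsizei)w, (GLsizei)h);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluOrtho2D(0.0, (GLdouble)w, 0.0, (GLdouble)h);  /* one unit == one pixel */
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}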
But I wouldn't recommend using those old and deprecated (and likely slow) pixel drawing functions at all (nor learning from the unfortunately outdated Red Book). Just draw a textured quad with a custom shader that takes window coordinates.

OpenGL and monochrome texture

Is it possible to feed a monochrome texture (graphical data with a 1-bit image depth) into OpenGL?
I'm currently using this:
glTexImage2D( GL_TEXTURE_2D, 0, 1, game->width, game->height, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, game->culture[game->phase] );
I'm feeding it a square array of 8-bit unsigned integers in GL_LUMINANCE mode (one 8-bit channel represents the brightness of all 3 channels, with full alpha), but that is IMO vastly inefficient, because the only values in the array are 0x00 and 0xFF.
Can I (and how can I) use a simple one-bit-per-pixel array of booleans instead? The excessive array size slows down any other operations on the array :(
After some research, I was able to render the 1-bit per pixel image as a texture with the following code:
static GLubyte smiley[] = /* 16x16 smiley face */
{
0x03, 0xc0, /* **** */
0x0f, 0xf0, /* ******** */
0x1e, 0x78, /* **** **** */
0x39, 0x9c, /* *** ** *** */
0x77, 0xee, /* *** ****** *** */
0x6f, 0xf6, /* ** ******** ** */
0xff, 0xff, /* **************** */
0xff, 0xff, /* **************** */
0xff, 0xff, /* **************** */
0xff, 0xff, /* **************** */
0x73, 0xce, /* *** **** *** */
0x73, 0xce, /* *** **** *** */
0x3f, 0xfc, /* ************ */
0x1f, 0xf8, /* ********** */
0x0f, 0xf0, /* ******** */
0x03, 0xc0 /* **** */
};
float index[] = {0.0, 1.0};
glPixelStorei(GL_UNPACK_ALIGNMENT,1);
glPixelMapfv(GL_PIXEL_MAP_I_TO_R, 2, index);
glPixelMapfv(GL_PIXEL_MAP_I_TO_G, 2, index);
glPixelMapfv(GL_PIXEL_MAP_I_TO_B, 2, index);
glPixelMapfv(GL_PIXEL_MAP_I_TO_A, 2, index);
glTexImage2D(GL_TEXTURE_2D,0,GL_RGBA,16,16,0,GL_COLOR_INDEX,GL_BITMAP,smiley);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
and this renders the smiley correctly as a texture.
The smallest uncompressed texture format for luminance images uses 8 bits per pixel.
However, 1-bit-per-pixel images can be compressed without loss to the S3TC or DXT format. That will still not be 1 bit per pixel, but somewhere between 2 and 3 bits.
If you really need 1 bit per pixel you can do it with a little trick. Load eight 1-bit-per-pixel textures as one 8-bit alpha-only texture (image 1 gets loaded into bit 1, image 2 into bit 2, and so on). Once you've done that you can "address" each of the sub-textures using the alpha-test feature and a bit of texture environment programming to turn alpha into a color.
This will only work if you have eight 1-bit-per-pixel textures, and it is tricky to get right, though.
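A rough sketch of the packing step described above, assuming the eight 1-bpp source images are already unpacked to one 0/1 byte per pixel (the names bits, packed, WIDTH and HEIGHT are made up for illustration):
#define WIDTH  256
#define HEIGHT 256
/* bits[i] is the i-th 1-bpp source image, one 0/1 byte per pixel (hypothetical input) */
extern unsigned char bits[8][WIDTH * HEIGHT];
unsigned char packed[WIDTH * HEIGHT];   /* the combined 8-bit alpha-only texture */
void packTextures(void)
{
    int x, y, i;
    for (y = 0; y < HEIGHT; y++)
        for (x = 0; x < WIDTH; x++) {
            unsigned char b = 0;
            for (i = 0; i < 8; i++)
                b |= (bits[i][y * WIDTH + x] & 1) << i;   /* image i goes into bit i */
            packed[y * WIDTH + x] = b;
        }
    glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA8, WIDTH, HEIGHT, 0,
                 GL_ALPHA, GL_UNSIGNED_BYTE, packed);
}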