Unable to change OpenGL version on OSX - c++

I'm using FreeGLUT and I would like to use OpenGL 3.2. I'm on a MacBook Pro (15", Early 2015) running Mojave 10.14.3, which according to Apple's documentation supports OpenGL up to 4.1. I installed freeglut with brew. However, no matter what I pass to glutInitContextVersion, the reported OpenGL version stays at 2.1:
Intel Iris Pro OpenGL Engine, 2.1 INTEL-12.4.7
The code I'm using looks like this:
#include <cstdio>
#include <iostream>
#include <GL/freeglut.h>

using namespace std;

void displayFunc() {
    glClearColor(1.0, 1.0, 0.5, 0.5);
    glClear(GL_COLOR_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);
    glFlush();
}

int main(int argc, char * argv[]) {
    glutInit(&argc, argv);
    glutInitContextVersion(3, 2);
    glutInitContextFlags(GLUT_FORWARD_COMPATIBLE);
    glutInitContextProfile(GLUT_CORE_PROFILE);
    glutInitWindowPosition(10, 10);
    glutInitWindowSize(480, 272);
    glutInitDisplayMode(GLUT_RGBA | GLUT_SINGLE);
    glutCreateWindow("Window");
    glutDisplayFunc(displayFunc);
    // Print the OpenGL renderer and version
    std::printf("%s\n%s\n", glGetString(GL_RENDERER), glGetString(GL_VERSION));
    glutMainLoop();
    return 0;
}
My CMakeLists.txt looks like this:
cmake_minimum_required(VERSION 3.3)
project(AVT7)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11")
set(CMAKE_EXE_LINKER_FLAGS "${CMAKE_EXE_LINKER_FLAGS} -framework GLUT -framework OpenGL -framework Cocoa")
include_directories(/usr/local/Cellar/freeglut/3.0.0/include)
link_directories(/usr/local/Cellar/freeglut/3.0.0/lib)
add_executable(AVT7 hello.cpp)
target_link_libraries(AVT7 glut)
How can I get this working? I am aware that there might be a problem with Apple no longer supporting OpenGL, but it should still work (especially with FreeGLUT).
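For comparison, here is a minimal sketch of how a 3.2 core profile is normally requested through Apple's bundled GLUT.framework rather than through freeglut (this is an assumption about an alternative setup, not a fix for the brew/freeglut build above; GLUT_3_2_CORE_PROFILE exists only in Apple's <GLUT/glut.h>, and glutInitContextVersion is a freeglut extension that Apple's GLUT does not provide):

// Hedged sketch: OpenGL 3.2 core profile via Apple's GLUT.framework on macOS.
// Assumed build command: clang++ -std=c++11 main.cpp -framework GLUT -framework OpenGL
#define GL_SILENCE_DEPRECATION
#include <GLUT/glut.h>
#include <cstdio>

static void displayFunc() {
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char* argv[]) {
    glutInit(&argc, argv);
    // With Apple's GLUT the profile is selected via the display mode flags.
    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_3_2_CORE_PROFILE);
    glutInitWindowSize(480, 272);
    glutCreateWindow("Window");
    glutDisplayFunc(displayFunc);
    // Should report a 4.1 core context on this hardware instead of 2.1.
    std::printf("%s\n%s\n", glGetString(GL_RENDERER), glGetString(GL_VERSION));
    glutMainLoop();
    return 0;
}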

Related

Why won't this OpenGL program open a window?

I am currently trying to learn both C++ and OpenGL, and I am beginning with the following code example:
simple.cpp
#define GL_SILENCE_DEPRECATION
#include <GLUT/glut.h>
void init() {
    // code to be inserted here
}

void mydisplay() {
    glClear(GL_COLOR_BUFFER_BIT);
    // need to fill in this part
    // and add in shaders
}

int main(int argc, char** argv) {
    glutCreateWindow("simple");
    init();
    glutDisplayFunc(mydisplay);
    glutMainLoop();
}
It says here that this code can be compiled on MacOS using the following command:
gcc -Wno-deprecated-declarations -o hello hello.c -framework GLUT -framework OpenGL -framework Carbon
So I adapt the command in the following way:
gcc -Wno-deprecated-declarations -o simple simple.cpp -framework GLUT -framework OpenGL -framework Carbon
This seems to work and creates an executable called simple.
I am told that this code should generate a white square on a black background.
However, when I try to run this file from the terminal using ./simple, the program runs continuously but nothing happens (no window is generated at all), so I have to terminate it from the terminal.
Did I do something wrong here, or is this expected for the given code on MacOS?
EDIT1
To see what would happen, I tried to use the code presented in the aforementioned guide:
hello.c
#include <OpenGL/gl.h>
#include <OpenGL/glu.h>
#include <GLUT/glut.h>

void display()
{
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutDisplayFunc(display);
    glutMainLoop();
}
As the guide says, this is "a simple OpenGL program that does nothing".
I compile it as follows:
gcc -Wno-deprecated-declarations -o hello hello.c -framework GLUT -framework OpenGL -framework Carbon
This compiles fine. However, when I try to run the executable, I get the following error:
GLUT Fatal API Usage: main loop entered with no windows created.
zsh: abort ./hello
According to this error message, the program was terminated because no window was created. However, as I said, my simple.cpp program does not terminate (I have to terminate it forcefully), and it is written to create a window. So I'm guessing a window is created in simple.cpp, but for some reason it just doesn't show up on macOS? Can anyone else confirm this? Is the issue perhaps that blank windows don't show up on macOS, and you need to include other graphical elements to make the window appear?
EDIT2
My understanding is that glutCreateWindow("simple"); in simple.cpp is supposed to create a window with "simple" as its title, so I don't understand why it isn't working.
EDIT3
Thanks to derhass's answer, I was able to make it work:
#define GL_SILENCE_DEPRECATION
#include <GLUT/glut.h>
void init() {
    // code to be inserted here
}

void mydisplay() {
    glClear(GL_COLOR_BUFFER_BIT);
    // need to fill in this part
    // and add in shaders
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutCreateWindow("simple");
    init();
    glutDisplayFunc(mydisplay);
    glutMainLoop();
}
You did not include the call to glutInit() in your simple.cpp example. You can't call any other GLUT functions before this initialization; doing so results in undefined behavior.

OpenGL program compiles but doesn't start correctly

I'm trying out OpenGL and C++, and I followed this video tutorial on writing my program (my code is exactly the same as his). I also followed the instructions on the freeglut website here to set up freeglut, compile, and link my program. The source code compiles with no problem, but when I try running the exe I get an error. The only reason I can think of is that I'm not using an IDE, so I'm probably missing some compilation steps or command-line arguments that an IDE would have handled automatically. Can someone tell me what I need to do to run my program correctly?
Here's my code:
#include <GL/glut.h>
void init() {
    glClearColor(1.0, 1.0, 0.0, 1.0);
}

void display() {
    glClear(GL_COLOR_BUFFER_BIT);
    glLoadIdentity();
    glFlush();
}

int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB);
    glutInitWindowPosition(200, 100);
    glutInitWindowSize(500, 500);
    glutCreateWindow("Window 1");
    glutDisplayFunc(display);
    init();
    glutMainLoop();
}
To compile, I run:
gcc -c -o hello.o hello.cpp -I"C:\MinGW\include"
gcc -o hello.exe hello.o -L"C:\MinGW\lib" -lfreeglut -lopengl32 -Wl,--subsystem,windows
Then I try to run hello.exe but I only get an error message "The application was unable to start correctly (0xc000007b)".
BTW, I saw this duplicate question and tried putting the DLL in the same directory (it was there from the start), but that didn't change anything.
Using the 32-bit freeglut DLL (instead of the 64-bit DLL) in my project fixed the problem.
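For what it's worth, the 0xc000007b status usually means a 32/64-bit mismatch between the EXE and one of the DLLs it loads. Assuming the file utility is available (e.g. from MSYS2 or Git Bash), you can check which architecture a DLL was built for with:
file freeglut.dll
It reports "PE32" for a 32-bit DLL and "PE32+" for a 64-bit one; the DLL has to match the bitness of the compiler that produced hello.exe.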

Freeglut error: fgInitGL2: fghGenBuffers is NULL

I'm transferring a program from OSX to Windows, but one error is still nagging me. The error occurs at run time in gdb; compiling and linking go fine.
freeglut (C:\path\to\file.exe): fgInitGL2: fghGenBuffers is NULL
Outside the GDB environment it gives an APPCRASH (Windows shell) or a segmentation fault (mingw64 shell).
My linker flags are:
-std=c++11 -lstdc++ -lz -lm -lmysqlclient -lpthread -lboost_thread-mgw49-mt-d-1_57 -lboost_system-mgw49-mt-d-1_57 -lboost_regex-mgw49-mt-d-1_57 -lcurl -lfreeglut -lglu32 -lopengl32 -lws2_32 -lwsock32 -U__CYGWIN__
I'm working in msys2 mingw-w64. At runtime the program tries to open a new window (at least an icon appears in the Windows taskbar), but the construction of the window doesn't succeed. The program runs fine on OSX, where I use GLUT instead of freeglut.
Header (amongst others):
#include <direct.h>
#include <GL/glut.h>
#include <GL/freeglut.h>
CPP (amongst others):
void interface::startInterface(int &argc, char **argv) {
    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);
    glutInitWindowSize(width, height);
    glutInitWindowPosition(1920, 0);
    glutInit(&argc, argv);
    glutCreateWindow("TIFAR 2.0");

    LoadGLTextures();                      // Load The Texture(s) ( NEW )
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);  // This Will Clear The Background Color To Black
    glClearDepth(1.0);                     // Enables Clearing Of The Depth Buffer
    glDepthFunc(GL_LESS);                  // The Type Of Depth Test To Do
    glEnable(GL_DEPTH_TEST);               // Enables Depth Testing
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glShadeModel(GL_SMOOTH);               // Enables Smooth Color Shading

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();                      // Reset The Projection Matrix
    // Calculate The Aspect Ratio Of The Window
    gluPerspective(45.0f, (GLfloat) width / (GLfloat) height, 0.1f, 100.0f);
    glMatrixMode(GL_MODELVIEW);

    glutDisplayFunc(interface::display);
    glutReshapeFunc(interface::reshape);
    glutIdleFunc(interface::idle);
    glutKeyboardFunc(interface::processNormalKeys);
    glutMainLoop();
}
There are some other parts of the program, such as where the images are loaded, but I think it would be too much information to include everything here.
It took me some time, but the cause of the problem was the hardware. I was running on a virtual machine (VMware) and, although the specifications said it supported OpenGL up to 2.1, I found out that it doesn't support OpenGL at all.
My solution was to take an old machine, install Windows on it, and copy all the files over. It compiles and runs as smooth as can be.
If anyone else runs into the same problem, I'd advise getting it working on a native installation before virtualising. It can save you a lot of time.

OpenGL project is requiring DirectX lib files

I have GLUT version 3.7 installed, and I'm running Windows 7 and using VS 2010.
I seem unable to build any C++ programs without it saying that DirectX libraries and includes are required in the VC++ Directories properties tab.
Under Properties > Input > Additional Dependencies it shows dxerr.lib and a few other DirectX libraries under inherited values, which I believe is the cause of this error. How can I remove these values? Unless anyone believes the error originates elsewhere...
#include <stdlib.h>
#include <GL\glut.h>
int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowPosition(100, 100);
    glutInitWindowSize(720, 480);
    glutCreateWindow("First OpenGL Project");
    return 1;
}
Error 1 error LNK1104: cannot open file 'dxerr.lib' c:\Users\mallaboro\documents\visual studio 2010\Projects\First OpenGL Project2\First OpenGL Project2\LINK First OpenGL Project2
Try a maintained GLUT implementation instead of nearly twenty-year-old software.

What SDL and OpenGL version and implementation am I using?

I downloaded SDL 1.2.14 on Windows 7, and I have the Mobility Radeon X1800 driver installed.
I'm using Microsoft Visual C++ 2010 Express.
I added the SDL include and library directories under "VC++ Directories".
I added the following Additional Dependencies:
opengl32.lib;
glu32.lib;
SDL.lib;
SDLmain.lib;
I added SDL.dll to my program folder.
I didn't add any OpenGL directories!
#include "SDL.h"
#include "SDL_opengl.h"
bool running = true;

int main(int argc, char* args[]) {
    SDL_Init(SDL_INIT_EVERYTHING);
    SDL_Surface* screen = SDL_SetVideoMode(640, 480, 32, SDL_OPENGL);

    glViewport(0, 0, 640, 480);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(45.0, 640/480, 1.0, 200.0);

    while (running) {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glMatrixMode(GL_MODELVIEW); // Switch to the drawing perspective
        glLoadIdentity();
        glTranslatef(0.0, 0.0, -5.0);

        glBegin(GL_TRIANGLES);
        glVertex3f(-0.5f, 0.5f, 0.0f);
        glVertex3f(-1.0f, 1.5f, 0.0f);
        glVertex3f(-1.5f, 0.5f, 0.0f);
        glEnd();

        SDL_GL_SwapBuffers();
    }

    SDL_Quit();
    return 0;
}
This program draws a simple triangle.
I include the two header files above and my OpenGL code just works!
I don't know whether my triangle is rendered on the GPU or the CPU, or what OpenGL version I'm using.
I mean, I've heard that Microsoft doesn't update their OpenGL files any longer and that they ship a CPU implementation of OpenGL 1.1 or something.
How do I know what version of OpenGL I'm using? And can I check at run time?
How do I know if I'm using a CPU or GPU implementation? And can I check at run time?
Thanks for looking at my problem.
Call glGetString.
Here is Microsoft's documentation for glGetString. It just repeats the SGI doc and tells you the function is found in gl.h and opengl32.lib.
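For instance, a minimal sketch (assuming the SDL_OPENGL context from the code above has already been created, since glGetString needs a current context):

// Call this after SDL_SetVideoMode(640, 480, 32, SDL_OPENGL).
#include <cstdio>
#include "SDL_opengl.h"

void printGLInfo() {
    std::printf("GL_VENDOR   : %s\n", glGetString(GL_VENDOR));
    std::printf("GL_RENDERER : %s\n", glGetString(GL_RENDERER)); // "GDI Generic" means Microsoft's CPU fallback
    std::printf("GL_VERSION  : %s\n", glGetString(GL_VERSION));  // e.g. "2.0.x" or similar from the Radeon driver
}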
Actually, when you install your video card driver it "replaces" the OpenGL implementation on your machine, so you will be using that version.
Multiple versions of OpenGL are present at the same time, and which one is used depends on the HDC used to initialize OpenGL. For example, applications running in the local login session can get hardware-accelerated GL while those running in a remote desktop session get the CPU-based implementation. (Ben Voigt)
The header and lib that currently come with Visual Studio only cover OpenGL 1.1, so to access more modern functionality you need to call wglGetProcAddress to get pointers to the newer functions.
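As a sketch of what that looks like (assuming <GL/glext.h> from the Khronos registry is available for the function-pointer typedefs, and that a GL context is already current, e.g. after SDL_SetVideoMode with SDL_OPENGL):

#include <windows.h>   // wglGetProcAddress
#include <GL/gl.h>
#include <GL/glext.h>  // PFNGLGENBUFFERSPROC and friends (Khronos header)

// Pointer for an entry point newer than the GL 1.1 exported by opengl32.lib.
static PFNGLGENBUFFERSPROC pglGenBuffers = NULL;

bool loadGenBuffers() {
    pglGenBuffers = (PFNGLGENBUFFERSPROC)wglGetProcAddress("glGenBuffers");
    return pglGenBuffers != NULL;  // NULL here means the driver only exposes GL < 1.5
}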
Here you can find more information: http://www.opengl.org/wiki/Getting_started