OK, before anyone marks this question as a duplicate: I have looked at What is an undefined reference/unresolved external symbol error and how do I fix it? and many other online posts. I've tried every solution I've come across, but I still can't fix these errors:
1>SOIL.lib(stb_image_aug.o) : error LNK2019: unresolved external symbol __alloca referenced in function _stbi_zlib_decode_noheader_buffer
1>SOIL.lib(image_helper.o) : error LNK2019: unresolved external symbol _sqrtf referenced in function _RGBE_to_RGBdivA2
I'm using Visual Studio 2012 on Windows 8. I've tried rebuilding the library and I have quintuple-checked all my includes and directories. Here are the SOIL includes/directories I have:
>Configuration Properties->VC++ Directories->Include Directories: C:\SOIL\Simple OpenGL Image Library\src
>Configuration Properties->VC++ Directories->Library Directories: C:\SOIL\Simple OpenGL Image Library\lib
>Configuration Properties->Linker->Input->Additional Dependencies: SOIL.lib
And here's my code. The SOIL_load_image function is what's causing the errors:
#include <Windows.h>
#include <GL/glut.h>
#include "glext.h"
#include <SOIL.h>
void glEnable2D( void );
void display();
void glDisable2D( void );
GLuint tex;
/* Main function: GLUT runs as a console application starting at main() */
int main(int argc, char** argv)
{
glutInit(&argc, argv); // Initialize GLUT
glutInitWindowSize(320, 320); // Set the window's initial width & height (must come before creation)
glutInitWindowPosition(50, 50); // Position the window's initial top-left corner
glutCreateWindow("OpenGL Setup Test"); // Create a window (and GL context) with the given title
//Texture
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_BORDER);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_BORDER);
float color[] = { 1.0f, 0.0f, 0.0f, 1.0f };
glTexParameterfv(GL_TEXTURE_2D, GL_TEXTURE_BORDER_COLOR, color);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
int width, height;
unsigned char* image = SOIL_load_image("Resources/Sprites/playerCharacter.png", &width, &height, 0, SOIL_LOAD_RGB);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, image);
SOIL_free_image_data(image);
//Texture
glutDisplayFunc(display); // Register display callback handler for window re-paint
glutMainLoop(); // Enter the infinitely event-processing loop
return 0;
}
I've been working on this all day and I haven't been able to figure out what I'm doing wrong, and it's been extremely frustrating. Note: I know the original name of SOIL.lib is "libSOIL.a"; I tried using the library under that name and got the same errors.
You can build the VC8 or VC9 solution files yourself in VS2012/13. Just make sure you are using the Win32 configuration if you are building a 32-bit application, and change the configuration to x64 if you are building a 64-bit application. Once the projects are built, link your OpenGL project against the generated SOIL.lib files and you will be good to go.
Related
I have a C++ project using OpenGL. I have main.cpp working, with OpenGL initialization, window creation, and things like that.
I would like to create a class into which I can move the code that loads a texture into a Shader.
But when I try to include shader.h or glad.h in any other class headers, I get this error:
fatal error C1189: #error: OpenGL header already included, remove
this include, glad already provides it
If I do all the logic inside main.cpp everything is fine; the problem arises only when I try to use OpenGL functions anywhere except main.cpp.
main.cpp:
#include <glad/glad.h> // generated from https://glad.dav1d.de
#include "shader.h"
int main()
{
... // do OpenGL stuff
}
shader.h:
#ifndef SHADER_H
#define SHADER_H
#include <glad/glad.h>
#include <glm/glm.hpp>
class Shader
{
public:
unsigned int ID;
... // some Shader definition stuff
};
#endif
and now I want an external class, Maze.h, to load its map into an OpenGL texture, something like that:
class Maze
{
public:
... // some maze-related stuff
void LoadMazeToGL(Shader* shader)
{
// load and create a texture
// -------------------------
glGenTextures(1, &screenTex1);
glBindTexture(GL_TEXTURE_1D, screenTex1);
// set the texture wrapping parameters
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_WRAP_S, GL_REPEAT); // set texture wrapping to GL_REPEAT (default wrapping method)
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_WRAP_T, GL_REPEAT);
// set texture filtering parameters
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
GLint curMapId = glGetUniformLocation(shader->ID, "shaderInternalVarName");
glUniform1i(curMapId, 2); // Texture unit 2 is for current map.
... //define and fill tex1data using Maze private information
glActiveTexture(GL_TEXTURE0 + 2);
glBindTexture(GL_TEXTURE_1D, screenTex1);
glTexImage1D(GL_TEXTURE_1D, 0, GL_RGBA, MazeIntegerSize, 0, GL_RGBA, GL_UNSIGNED_BYTE, tex1data);
}
};
To check for problems like this, use the -E flag of GCC/g++ (or the equivalent for your compiler). This outputs the result of the preprocessor, i.e., with all includes expanded. You can then have a look at the files produced and look for the double include. You may have just forgotten an include guard somewhere :)
I have been having a very odd problem when trying to use OpenGL's C++ API. I am trying to load in a texture using ImageMagick, and then display it as a simple 2D textured square. I have a decent amount of experience with using OpenGL in Java, so I understand how to render a texture and bind it to a primitive. However, each time I attempt to draw it, the program either fails to render, or it renders it as a (properly sized) white square. I'm not entirely sure what is going on, but I believe it has to do with ImageMagick.
I have been using Ubuntu's terminal for compiling, and I've learned just how painful it can be to have to install libraries manually. ImageMagick first refused to compile when used in my program, and when I finally got the program to compile, it would seg-fault each time it ran. I've finally got it "working", but now, whenever I attempt to load in the image, the program will run without rendering. I haven't found anything like this on Google.
http://imgur.com/C7yKwDK
The odd thing is, very rarely, it will work correctly and render the square as expected. However, when I then try to rerun the program, it fails as shown above. I've determined that the line that causes it to fail to render is the same line the image is loaded, so that led me to believe that the image was just being loaded incorrectly, causing the program to fail. However, if I move the texture loading code before the creation of the GL window, the program will consistently render successfully, but the textured square appears only as white (though the size of the square is correct, so I know the image loading is working).
Anyway, sorry for the long post. I've just given up solving this one on my own, and was hoping one of you could help me out.
OpenGL Initialization Code:
Texture* tx;
void GraphicsOGL::initialize3D(int argc, char* argv[]) {
Magick::InitializeMagick(*argv);
glutInit(&argc, argv);
//Loading Here ALWAYS Causes White Square
/*glEnable(GL_TEXTURE_2D);
tx = new Texture("Resources/Images/test.png");
tx->load();*/
glutInitDisplayMode(GLUT_DOUBLE|GLUT_RGBA);
glutInitWindowSize(SCREEN_WIDTH, SCREEN_HEIGHT);
glutInitWindowPosition(100, 100);
glutCreateWindow("OpenGL Game");
glViewport(0,0,SCREEN_WIDTH,SCREEN_HEIGHT);
glOrtho(0,SCREEN_WIDTH,SCREEN_HEIGHT,0, -3,1000);
glEnable(GL_DEPTH_TEST);
glEnable(GL_ALPHA_TEST);
glEnable(GL_TEXTURE_2D);
//Loading Here SOMETIMES Works, But Typically Fails
tx = new Texture("Resources/Images/test.png");
tx->load();
glutDisplayFunc(displayCallback);
glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
glutMainLoop();
}
Texture Loading Code:
bool Texture::load() {
try {
m_image.read(m_fileName); //This Line Causes it to Fail to Render
m_image.write(&m_blob, "RGBA");
}
catch (Magick::Error& Error) {
std::cout << "Error loading texture '" << m_fileName << "': " << Error.what() << std::endl;
return false;
}
width = m_image.columns();
height = m_image.rows();
glGenTextures(1, &m_textureObj);
glBindTexture(m_textureTarget, m_textureObj);
//glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 0);
//glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 0);
glTexParameterf(m_textureTarget, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameterf(m_textureTarget, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);
glTexImage2D(m_textureTarget, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, m_blob.data());
//glBindTexture(m_textureTarget, 0);
return true;
}
Texture Drawing Code:
void GraphicsOGL::drawTexture(float x, float y, Texture* tex) {
glEnable(GL_TEXTURE_2D);
tex->bind();
float depth = 0, w, h;
w = tex->getWidth();
h = tex->getHeight();
glBegin(GL_QUADS);
    // Note: glTexCoord2f must be issued *before* the vertex it applies to
    glTexCoord2f(1,0); glVertex3f(x,   y+h, depth);
    glTexCoord2f(1,1); glVertex3f(x+w, y+h, depth);
    glTexCoord2f(0,1); glVertex3f(x+w, y,   depth);
    glTexCoord2f(0,0); glVertex3f(x,   y,   depth);
glEnd();
}
I have a problem loading a texture using the SDL library.
I usually write programs on Linux, but I am trying to create code that is also compatible with Visual Studio.
On Linux everything is OK, but on Visual Studio it crashes at "GL_UNSIGNED_SHORT_5_6_5" in the glTexImage2D(...) call.
Below is a general idea of what I want to do, inspired by this tutorial:
#include "stdafx.h"
#include <stdlib.h>
#include <stdio.h>
#include <GL/glut.h>
//#include <GL/glext.h>
#include "SDL.h"
int brick;
float c=0.5;
float rx_min=0, ry_min=0;
float rx_max=1, ry_max=1;
unsigned int LoadTexture(const char* filename);
void DrawTexture(int object);
void setupmywindow();
void myDrawing();
void setupmywindow()
{
glClearColor(1.0,1.0,1.0,0);
glColor3f(0.0, 0.0, 0.0);
glPolygonMode(GL_FRONT_AND_BACK,GL_FILL);
gluOrtho2D(rx_min,ry_min, rx_max, ry_max);
brick = LoadTexture("brick.bmp");
}
void DrawTexture(int object)
{
glBindTexture(GL_TEXTURE_2D, object);
glColor3f(c,c,c);
glBegin(GL_QUADS);
glTexCoord2f(0., 1. );
glVertex2f( rx_min , ry_min );
glTexCoord2f(0., 0. );
glVertex2f( rx_min, ry_max );
glTexCoord2f(1., 0. );
glVertex2f( rx_max , ry_max );
glTexCoord2f(1., 1. );
glVertex2f( rx_max , ry_min );
glEnd();
}
unsigned int LoadTexture(const char* filename)
{
SDL_Surface* img = SDL_LoadBMP(filename);
if (img == NULL) return 0; // bail out if the bitmap could not be loaded
unsigned int id;
glGenTextures(1, &id);
glBindTexture(GL_TEXTURE_2D,id);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, img->w, img->h, 0, GL_RGB, GL_UNSIGNED_SHORT_5_6_5, img->pixels);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
SDL_FreeSurface(img);
return id;
}
void myDrawing()
{
glClear(GL_COLOR_BUFFER_BIT);
DrawTexture(brick);
glFlush();
}
int main(int argc, char **argv)
{
printf("AUTH Computational Physics - Computer Graphics\n");
printf("Project >>TestTexture.cpp\n");
printf("--------------------------------------------------------\n");
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_SINGLE|GLUT_RGB);
glutInitWindowPosition(50,50);
glutCreateWindow("Texture Test");
setupmywindow();
glutDisplayFunc(myDrawing);
glutMainLoop();
return 0;
}
The error is:
error C2065: 'GL_UNSIGNED_SHORT_5_6_5' : undeclared identifier
Here is the image that I am trying to load; it is saved as a bitmap (8bit 5 6 5) with GIMP 2.8.
NOTE: When I uncomment #include <GL/glext.h>, which is not needed on Linux, I get this message:
Unhandled exception at 0x00d1193f in testTesxture.exe: 0xC0000005: Access violation reading location 0x00000014.
Generally, if I save a bitmap image (for example with Paint), how can I determine the type I have to pass (GL_UNSIGNED_SHORT_5_6_5, GL_UNSIGNED_BYTE, etc.)?
The problem is likely that Windows ships with an older version of the OpenGL headers than Linux, and this old version does not declare that specific identifier (and others, I'm sure). To get around this and any other possible version problems, I would use GLEW, which does the hard work for you.
On Windows, add these lines after the includes:
#ifndef GL_UNSIGNED_SHORT_5_6_5
#define GL_UNSIGNED_SHORT_5_6_5 0x8363
#endif
#ifndef GL_CLAMP_TO_EDGE
#define GL_CLAMP_TO_EDGE 0x812F
#endif
According to this video.
When I use GL3W, I cannot load textures. They appear blank or messed up.
I wanted an OpenGL 4.2 context in SDL 1.3 and so I decided to use GL3W, as GLEW used deprecated functions. Everything seems to work fine, however, when I try to load a texture, it either gets messed up (randomly colored lines) or simply ends up blank (black without alpha, transparent with). Everything else I've tried so far has worked (shaders, VAO's, VBO's, etc.)
I wrote the most simple example I could come up with to illustrate:
#include <SDL.h>
#include <SDL_image.h>
#include <GL3/gl3w.h>
#include <gl/gl.h>
#include <gl/glu.h>
int main(int argc, char* argv[]) {
SDL_Init(SDL_INIT_EVERYTHING);
SDL_WindowID mainWindow = SDL_CreateWindow("Test", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 800, 600, SDL_WINDOW_OPENGL | SDL_WINDOW_SHOWN);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 4);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
SDL_GLContext mainContext = SDL_GL_CreateContext(mainWindow);
SDL_GL_MakeCurrent(mainWindow, mainContext);
gl3wInit();
GLuint id;
glGenTextures(1, &id);
glBindTexture(GL_TEXTURE_2D, id);
SDL_Surface* test2 = IMG_Load("test.png");
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, test2->w, test2->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, test2->pixels);
// Loop to keep the window open while debugging in gDEBugger
while (1) {
SDL_GL_SwapWindow(mainWindow);
}
return 0;
}
I don't know how relevant it is, but since gl3w is generated by a python script I'll include it (external links because of length):
gl3w.c: http://pastebin.com/T5GNdJL8
gl3w.h: http://pastebin.com/yU2EzBZP
gl3.h: http://pastebin.com/0uCgB8d1
If I remove #include <GL3/gl3w.h> and the gl3wInit() call, the texture is successfully loaded.
Hi everyone, I've got quite an error here: it seems like C++ is not finding glActiveTextureARB(GL_TEXTURE0_ARB);
I'm using Code::Blocks and I've got glext.h, so whenever I right-click glActiveTextureARB and choose "find declaration" it actually finds it. I've got a 64-bit system, and I've tried putting glext.h in the GL folder and also in my project, and I'm getting the same error. Any ideas would help, thanks very much.
Here's my code in case you need it. The identifiers are in Spanish, by the way, but that doesn't matter because I think the error is not in the code.
#include "objetos.h"
#include "glext.h"
#include <cassert>
Objetos::Objetos()
{
m_OBJS = NULL;
}
Objetos::Objetos(OBJETO d,int txt)
{
m_OBJS = NULL;
box = 0;
triangle = 0;
circle = 0;
CTargaImage image;
image.Load("TGAs/caja1.tga");
glGenTextures(1, &m_texturaCaja[0]);
glBindTexture(GL_TEXTURE_2D, m_texturaCaja[0]);
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB,
image.GetWidth(), image.GetHeight(),
GL_RGB, GL_UNSIGNED_BYTE, image.GetImage());
image.Release();
image.Load("TGAs/caja2.tga");
glGenTextures(1, &m_texturaCaja[1]);
glBindTexture(GL_TEXTURE_2D, m_texturaCaja[1]);
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB,
image.GetWidth(), image.GetHeight(),
GL_RGB, GL_UNSIGNED_BYTE, image.GetImage());
image.Release();
switch(d)
{
case TRIANGULO:
// always clear it before drawing it
glActiveTextureARB(GL_TEXTURE0_ARB);
glBindTexture(GL_TEXTURE_2D, m_texturaTriangulo[txt]);
glEnable(GL_TEXTURE_2D);
glTexEnvf (GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_EXT);
glTexEnvf (GL_TEXTURE_ENV, GL_COMBINE_RGB_EXT, GL_REPLACE);
m_OBJS = glmReadOBJ("materiales/triangulo.obj");
m_Posicion.x = 0.0f;
glDisable(GL_TEXTURE_2D);
break;
case CIRCULO:
glActiveTextureARB(GL_TEXTURE1_ARB);
glBindTexture(GL_TEXTURE_2D, m_texturaEsfera[2]);
glEnable(GL_TEXTURE_2D);
glTexEnvf (GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_EXT);
glTexEnvf (GL_TEXTURE_ENV, GL_COMBINE_RGB_EXT, GL_REPLACE);
m_OBJS = glmReadOBJ("materiales/circulo.obj");
m_Posicion.x = -0.43f;
glDisable(GL_TEXTURE_2D);
break;
case CAJA:
glActiveTextureARB(GL_TEXTURE2_ARB);
glBindTexture(GL_TEXTURE_2D, m_texturaCaja[1]);
glEnable(GL_TEXTURE_2D);
glTexEnvf (GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_EXT);
glTexEnvf (GL_TEXTURE_ENV, GL_COMBINE_RGB_EXT, GL_REPLACE);
m_OBJS = glmReadOBJ("materiales/caja.obj");
m_Posicion.x = 0.43f;
glDisable(GL_TEXTURE_2D);
break;
}
}
glActiveTextureARB is an extension function. As such, on Windows it does not suffice to include glext.h to make it usable. You also have to define a function pointer and load it with:
PFNGLACTIVETEXTUREARBPROC __myglextActiveTextureARB;
#define glActiveTextureARB __myglextActiveTextureARB

void initGLextensions() {
    __myglextActiveTextureARB = (PFNGLACTIVETEXTUREARBPROC) wglGetProcAddress("glActiveTextureARB");
}
That macro juggling is necessary to keep the library namespace clean.
Since it would be tedious to do all this extension loading from scratch, there are extension wrapper libraries like GLEW (http://glew.sourceforge.net) or GLee (http://www.opengl.org/sdk/libs/GLee/). They reduce the whole process to including their headers instead of the standard OpenGL includes, adding them to the linked-libraries list, and calling glewInit() for GLEW (or, optionally, GLeeInit() for GLee) after context creation, and you are done.