Problems loading a 32-bit .bmp texture in OpenGL (C++)

I'm having some trouble loading a 32-bit .bmp image in my OpenGL game. So far I can load and display 24-bit images perfectly. I now want to load a bitmap with portions of its texture being transparent or invisible.
This function has no problems with 24-bit, but a 32-bit .bmp with an alpha channel seems to distort the colors and cause transparency in unintended places.
Texture LoadTexture( const char * filename, int width, int height, bool alpha)
{
    GLuint texture;
    GLuint* data;
    FILE* file;
    fopen_s(&file, filename, "rb");
    if(!file)
    {
        std::cout << filename << ": Load Texture Fail!\n";
        exit(0);
    }
    data = new GLuint[width * height];
    fread( data, width * height * sizeof(GLuint), 1, file );
    fclose(file);
    glGenTextures( 1, &texture );
    glBindTexture( GL_TEXTURE_2D, texture);
    glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
    if(alpha) //for .bmp 32 bit
    {
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
        glTexImage2D(GL_TEXTURE_2D, 0, 4, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
    }
    else //for .bmp 24 bit
    {
        glPixelStorei(GL_UNPACK_ALIGNMENT, 4);
        glTexImage2D(GL_TEXTURE_2D, 0, 3, width, height, 0, GL_BGR, GL_UNSIGNED_BYTE, data);
    }
    std::cout << "Texture: " << filename << " loaded" << std::endl;
    delete data;
    return Texture(texture);
}
In-game texture, drawn on a flat plane:
This might look like it's working, but the 0xff00ff color is the one that should be transparent, and if I reverse the alpha channel in Photoshop the result is the same: the inner sphere is always transparent.
I also enabled:
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
There is no problem with transparency; the problem seems to be with loading the bitmap with an alpha channel. Also, all bitmaps that I load seem to be shifted a bit to the right. Just wondering if there is a reason for this?

I'm going to answer this on the assumption that your file "parsing" is in fact correct: that the file data is just the image part of a .BMP without any of the header information.
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexImage2D(GL_TEXTURE_2D, 0, 4, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
glPixelStorei(GL_UNPACK_ALIGNMENT, 4);
glTexImage2D(GL_TEXTURE_2D, 0, 3, width, height, 0, GL_BGR, GL_UNSIGNED_BYTE, data);
I find it curious that your 3-component data is in BGR order, yet your 4-component data is in RGBA order. Especially if this data comes from a .BMP file (though they don't allow alpha channels). Are you sure that your data isn't in a different ordering? For example, perhaps ABGR order?
Also, stop using numbers for image formats. You should use a real, sized internal format for your textures, not "3" or "4".
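For example, a minimal sketch of the same two calls with sized internal formats (using GL_BGRA for the 32-bit case is an assumption about how the file stores its pixels, so verify it against your data):
if(alpha) // 32-bit data, possibly stored as BGRA in the file
{
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_BGRA, GL_UNSIGNED_BYTE, data);
}
else // 24-bit BGR data
{
    glPixelStorei(GL_UNPACK_ALIGNMENT, 4);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0, GL_BGR, GL_UNSIGNED_BYTE, data);
}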

Here is the code that works 100% for me, compiled with Visual C++ 2012:
glBindTexture(GL_TEXTURE_2D, texture_id);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); // This is very important!
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, ptr->image->width, ptr->image->height, 0, GL_RGBA, GL_UNSIGNED_BYTE, imag_ptr);
and then in the render function I use:
glPushMatrix();
glTranslatef(0,0,0);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, 1);
glEnable(GL_BLEND);
glColor4ub(255,255,255,255); // This is very, very important! (try changing it and you will see)
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBegin(GL_QUADS);
glTexCoord3d(1, 1, 0);
glVertex2f(8,8);
glTexCoord3d(0, 1, 0);
glVertex2f(-8,8);
glTexCoord3d(0, 0, 0);
glVertex2f(-8,-8);
glTexCoord3d(1, 0, 0);
glVertex2f(8,-8);
glEnd();
glDisable(GL_TEXTURE_2D);
glPopMatrix();
I use, for example, the Android PNG icon; I tried to post another image, but I don't have enough reputation for that, so if you want I can send it to you.
So all of this works for the PNG format,
but for the BMP and TGA formats you must swap the colors from ARGB to RGBA; without this it does not work:
for (x = 0; x < bmp_in_fheader.width; x++)
{
    t  = pixel[i];
    t1 = pixel[i+1];
    t2 = pixel[i+2];
    t3 = pixel[i+3];
    pixel_temp[j]   = t2;
    pixel_temp[j+1] = t1;
    pixel_temp[j+2] = t;
    pixel_temp[j+3] = t3;
    i += 4;
    j += 4;
}
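A more self-contained version of the same idea (a sketch only; the function and buffer names are hypothetical, and it assumes tightly packed 4-byte pixels with no row padding):
// Swap the first and third byte of every pixel (e.g. BGRA -> RGBA),
// which is exactly what the loop above does, for a whole width*height image.
void SwapChannels(const unsigned char* src, unsigned char* dst, int width, int height)
{
    for (int p = 0; p < width * height; ++p)
    {
        dst[p*4 + 0] = src[p*4 + 2];
        dst[p*4 + 1] = src[p*4 + 1];
        dst[p*4 + 2] = src[p*4 + 0];
        dst[p*4 + 3] = src[p*4 + 3];
    }
}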
==Next==
To create them in Photoshop you must delete your background, draw on a new layer, and then add an alpha channel under Channels. Remember, this is very important: in the alpha channel, black represents transparency, and your image must lie under the white areas only.

Related

Using glGetTexImage() to acquire an OpenCV Mat

I'm trying to make an OpenCV Mat() using output from OpenGL's glGetTexImage(). The texture I am trying to get information from was made using the call:
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA8UI, iWidth, iHeight, 0, GL_RGBA_INTEGER, GL_UNSIGNED_BYTE, pImageData);
and so I've tried to do this using:
unsigned char* texture_bytes = (unsigned char*)malloc(sizeof(unsigned char)*texture_width*texture_height * 3);
glGetTexImage(GL_TEXTURE_2D, 0, GL_BGR, GL_UNSIGNED_BYTE, texture_bytes);
Matrix = Mat(texture_height, texture_width, CV_8UC3, texture_bytes);
What I am wondering is if anyone knows what I should set the format and type of glGetTexImage() to in order for this to work. Also, what should I set the type of the Mat() to?
You can assume that the context is set correctly, and that the texture that is input is correct. I have verified this by displaying the texture on screen using OpenGL. Thanks in advance!
I have been faced with the problem of getting data from OpenGL to OpenCV recently. I didn't use glGetTexImage though.
What I did was an offscreen render in a framebuffer with a texture initialized like this:
GLuint texture;
if (glIsTexture(texture)) {
    glDeleteTextures(1, &texture);
}
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glBindTexture(GL_TEXTURE_2D, 0);
glBindTexture(GL_TEXTURE_2D, texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, width, height, 0, GL_RGBA, GL_FLOAT, 0);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, texture, 0);
glBindTexture(GL_TEXTURE_2D, 0);
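Note that the framebuffer itself has to be generated and bound before the glFramebufferTexture2D call; a minimal sketch of that missing piece (the fbo name is hypothetical):
GLuint fbo;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
// ... create the texture and call glFramebufferTexture2D as above ...
// Always worth checking that the attachment actually worked:
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
    std::cerr << "Framebuffer is not complete!" << std::endl;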
Then after my draw calls, I get the data using glReadPixels:
glPixelStorei(GL_PACK_ALIGNMENT, 1);
glReadBuffer(GL_COLOR_ATTACHMENT0);
cv::Mat texture = cv::Mat::zeros(height, width, CV_32FC3);
glReadPixels(0, 0, width, height, GL_BGR, GL_FLOAT, texture.data);
Hope it helps you.
You have a mismatch in the format parameter used for the glGetTexImage() call and the internal format of the texture:
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA8UI, iWidth, iHeight, 0, GL_RGBA_INTEGER, GL_UNSIGNED_BYTE, pImageData);
...
glGetTexImage(GL_TEXTURE_2D, 0, GL_BGR, GL_UNSIGNED_BYTE, texture_bytes);
For an integer texture, which you have in this case, you need to use a format parameter to glGetTexImage() that works for integer textures. In your example, that would be:
glGetTexImage(GL_TEXTURE_2D, 0, GL_BGR_INTEGER, GL_UNSIGNED_BYTE, texture_bytes);
It is always a good idea to call glGetError() if you have any kind of problem getting the desired OpenGL behavior. In this case, you would have gotten a GL_INVALID_OPERATION error, based on this error condition in the spec:
format is one of the integer formats in table 8.3 and the internal format of the texture image is not integer, or format is not one of the integer formats in table 8.3 and the internal format is integer.
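Putting the two points together, a minimal sketch of the corrected readback might look like this (keeping the 3-channel CV_8UC3 Mat from the question is an assumption; check glGetError after the call):
unsigned char* texture_bytes = (unsigned char*)malloc(sizeof(unsigned char)*texture_width*texture_height * 3);
glGetTexImage(GL_TEXTURE_2D, 0, GL_BGR_INTEGER, GL_UNSIGNED_BYTE, texture_bytes);
GLenum err = glGetError();
if (err != GL_NO_ERROR)
    std::cerr << "glGetTexImage failed: 0x" << std::hex << err << std::endl;
// Wrap (not copy) the returned pixels in an OpenCV Mat of matching type.
Matrix = Mat(texture_height, texture_width, CV_8UC3, texture_bytes);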

C++ Adding a texture to a GL_QUAD and it's coming out black

I have a series of rectangles of different colours and I'm trying to add a texture to one of them. However when I apply the texture to the given rectangle, it just turns black. Below is the function I use to load the texture.
GLuint GLWidget::LoadTexture(const char * pic, int width, int height){
    GLuint Texture;
    BYTE * data;
    FILE * picfile;
    picfile = fopen(pic, "rb");
    if (picfile == NULL)
        return 0;
    data = (BYTE *)malloc(width * height * 3);
    fread(data, width * height, 3, picfile);
    fclose(picfile);
    glGenTextures(1, &Texture);
    glBindTexture(GL_TEXTURE_2D, Texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0, GL_RGB8, GL_UNSIGNED_BYTE, data);
    return Texture;
}
In another function where the GL_QUADS are drawn, I then have...
GLuint myTex = LoadTexture("texture.bmp", 500, 500);
glEnable(GL_TEXTURE_2D);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glBindTexture(GL_TEXTURE_2D, myTex);
glBegin(GL_QUADS);
glTexCoord2f(1, 1); glVertex3f(42, 10, 42);
glTexCoord2f(1, 0); glVertex3f(42, 10, -42);
glTexCoord2f(0, 0); glVertex3f(-42,10,-42);
glTexCoord2f(0, 1); glVertex3f(-42,10, 42);
glEnd();
If anyone could let me know where I am going wrong that would be great, thanks!
This call
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0, GL_RGB8, GL_UNSIGNED_BYTE, data);
is invalid. GL_RGB8 is a valid internalFormat, but it is not a valid enum for format. Use GL_RGB and GL_UNSIGNED_BYTE as format and type if your client-side data is 3 channels of 8-bit unsigned integer data per channel.
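That is, keeping everything else from the question as-is:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, data);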
Another thing is
LoadTexture("texture.bmp", 500, 500);
This suggests that you are dealing with BMP files, but your loader only deals with completely raw image data.
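If the file really is a .bmp, the pixel data does not start at byte 0. A minimal sketch of skipping to it (assuming an uncompressed 24-bit BMP, and ignoring row padding and the bottom-up row order for brevity):
// The BMP file header is 14 bytes; the offset to the pixel data is stored
// as a 32-bit little-endian value at byte offset 10.
unsigned char header[14];
fread(header, 1, 14, picfile);
unsigned int dataOffset = header[10] | (header[11] << 8) | (header[12] << 16) | (header[13] << 24);
fseek(picfile, dataOffset, SEEK_SET);
fread(data, 1, width * height * 3, picfile);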

Creating and reading 1D textures in OpenGL 4.x

I have problems using 1D textures in OpenGL 4.x.
I create my 1D texture this way (BTW: I removed my error checks to make the code clearer and shorter - usually each GL call is followed by a BLUE_ASSERTEx(glGetError() == GL_NO_ERROR, "glGetError failed.");):
glGenTextures(1, &textureId_);
// bind texture
glBindTexture(GL_TEXTURE_1D, textureId_);
// tells OpenGL how the data that is going to be uploaded is aligned
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
BLUE_ASSERTEx(description.data, "Invalid data provided");
glTexImage1D(
    GL_TEXTURE_1D,      // Specifies the target texture. Must be GL_TEXTURE_1D or GL_PROXY_TEXTURE_1D.
    0,                  // Specifies the level-of-detail number. Level 0 is the base image level. Level n is the nth mipmap reduction image.
    GL_RGBA32F,
    description.width,
    0,                  // border: This value must be 0.
    GL_RGBA,
    GL_FLOAT,
    description.data);
BLUE_ASSERTEx(glGetError() == GL_NO_ERROR, "glGetError failed.");
// texture sampling/filtering operation.
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri (GL_TEXTURE_1D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri (GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri (GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glBindTexture(GL_TEXTURE_1D, 0);
After the creation I try to read the pixel data of the created texture this way:
const int width = width_;
const int height = 1;
// Allocate memory for depth buffer screenshot
float* pixels = new float[width*height*sizeof(buw::vector4f)];
// bind texture
glBindTexture(GL_TEXTURE_1D, textureId_);
glPixelStorei(GL_PACK_ALIGNMENT, 1);
glReadPixels(0, 0, width, height, GL_RGBA, GL_FLOAT, pixels);
glBindTexture(GL_TEXTURE_1D, 0);
buw::Image_4f::Ptr img(new buw::Image_4f(width, height, pixels));
buw::storeImageAsFile(filename.toCString(), img.get());
delete pixels;
But the returned pixel data is different from the input pixel data (input: color ramp, output: black image).
Any ideas how to solve the issue? Maybe I am using a wrong API call.
Replacing glReadPixels with glGetTexImage fixes the issue.
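A minimal sketch of the working readback (allocating 4 floats per texel is an assumption here; glReadPixels only reads from the current read framebuffer, which is why it returned nothing for a texture that was never attached to one):
float* pixels = new float[width * 4]; // RGBA, 4 floats per texel
glBindTexture(GL_TEXTURE_1D, textureId_);
glPixelStorei(GL_PACK_ALIGNMENT, 1);
glGetTexImage(GL_TEXTURE_1D, 0, GL_RGBA, GL_FLOAT, pixels);
glBindTexture(GL_TEXTURE_1D, 0);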

CubeMap in OpenGL doesn't show at all

I'm trying to simply load 6 pictures as the textures of a cube in OpenGL. Below is the loading code:
glTexParameteri(GL_TEXTURE_CUBE_MAP_ARB, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_CUBE_MAP_ARB, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameteri(GL_TEXTURE_CUBE_MAP_ARB, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_CUBE_MAP_ARB, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_CUBE_MAP_ARB, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);
for (int i = 0; i < 6; ++i)
{
    int width, height, channel;
    unsigned char* img = SOIL_load_image(skybox[i].c_str(), &width, &height, &channel, SOIL_LOAD_AUTO);
    glTexImage2D(cubeTarget[i], 0, GL_RGB, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, img);
    delete img;
}
glEnable(GL_TEXTURE_CUBE_MAP_ARB);
The rendering code is omitted here. The weird thing is that the rendered cube is white; it seems that the texture isn't loaded at all. I changed the loading code to see whether a 2D texture would work:
glGenTextures(1, texture);
glBindTexture(GL_TEXTURE_2D, texture[0]);
int width, height, channel;
unsigned char* img = SOIL_load_image(skybox[0].c_str(), &width, &height, &channel, SOIL_LOAD_AUTO);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, img);
delete img;
if(texture[0] == 0) return false;
glBindTexture(GL_TEXTURE_2D, texture[0]);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glEnable(GL_TEXTURE_2D);
The result is that I can see the texture after all, however strangely it is distributed because of the texture coordinates. I have gathered the following information:
The library I use for loading images works well;
The cubemap setup is from the OpenGL SuperBible, chapter 9. Almost the same code works well when I compile the book's code.
BTW, does anyone have a suggestion for an image loading library? The one I use seems to have stopped being updated a really long time ago...
Update: What I have found out now is that if I load just one image as the texture for all of the skybox faces, it is shown. As soon as I use a variable in place of a specific value, nothing is displayed.
Finally I figured it out: it's because the resolution of the last picture is different from the others'. It really wasted a lot of time.
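A cheap guard against this (a sketch only; the variable names follow the question's code, and it simply requires every face to have the same dimensions before uploading, freeing the image with SOIL's own function):
int w0 = -1, h0 = -1;
for (int i = 0; i < 6; ++i)
{
    int width, height, channel;
    unsigned char* img = SOIL_load_image(skybox[i].c_str(), &width, &height, &channel, SOIL_LOAD_AUTO);
    if (i == 0) { w0 = width; h0 = height; }
    if (width != w0 || height != h0)
        std::cerr << "Cubemap face " << i << " has a different size!" << std::endl;
    glTexImage2D(cubeTarget[i], 0, GL_RGB, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, img);
    SOIL_free_image_data(img);
}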

OpenGL alpha value makes texture whiter?

I'm trying to load a texture with RGBA values, but the alpha values just seem to make the texture whiter, not adjust the transparency. I've heard about this problem with 3D scenes, but I'm just using OpenGL for 2D. Is there any way I can fix this?
I'm initializing OpenGL with
glViewport(0, 0, winWidth, winHeight);
glDisable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDisable(GL_DEPTH_TEST);
glClearColor(0, 0, 0, 0);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluOrtho2D(0, winWidth, 0, winHeight); // set origin to bottom left corner
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glColor3f(1, 1, 1);
Screenshot:
That washed-out, dotty image should be semitransparent. The black bits are supposed to be completely transparent. As you can see, there's an image behind it that isn't showing through.
The code to generate that texture is rather lengthy, so I'll describe what I did. It's a 40*30*4 array of type unsigned char. Every 4th char is set to 128 (should be 50% transparent, right?).
I then pass it into this function, which loads the data into a texture:
void Texture::Load(unsigned char* data, GLenum format) {
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, _texID);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, _w, _h, format, GL_UNSIGNED_BYTE, data);
    glDisable(GL_TEXTURE_2D);
}
And... I think I just found the problem. I was initializing the full-sized texture with this code:
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, _texID);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, tw, th, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
glDisable(GL_TEXTURE_2D);
But I guess glTexImage2D needs to be GL_RGBA too? I can't use two different internal formats? Or at least not ones of different sizes (3 bytes vs. 4 bytes)? GL_BGR works fine even when it's initialized like this...
In the interest of others, I post my solution here.
The problem was that although my Load function was correct,
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, _w, _h, GL_RGBA, GL_UNSIGNED_BYTE, data);
I was passing GL_RGB to this function
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, tw, th, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);
which also needs to specify the correct number of components (four). From my understanding you can't use a different number of components for a SubImage, although I think you can use a different format if it does have the same number of components (i.e. mixing GL_RGB and GL_BGR is okay, but not GL_RGB and GL_RGBA).
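In other words, a sketch of the fix is simply to allocate the texture with an alpha-capable internal format so the later glTexSubImage2D upload has an alpha channel to write into (tw, th and _texID follow the question's code):
glBindTexture(GL_TEXTURE_2D, _texID);
// Allocate RGBA storage; passing GL_RGB here would silently drop the alpha channel.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, tw, th, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
// Later, upload the actual pixels into the allocated storage.
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, _w, _h, GL_RGBA, GL_UNSIGNED_BYTE, data);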
Are there any overlapping primitives in your scene?
You are aware that you're calling the 3-parameter version of glColor, which sets the alpha to 1.0, right?
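If you do want a non-opaque vertex color, the 4-parameter form sets it explicitly (the 0.5f here is just an example value):
glColor4f(1.0f, 1.0f, 1.0f, 0.5f);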
It could be helpful if you could post a screenshot, or otherwise describe what happens, say, when you draw two primitives with identical colors and differing alphas. In fact, any code demonstrating the problem could help.
Edit:
I'd imagine that using TexImage with GL_RGB (for internalformat, the 3rd parameter) creates a 3-component texture with no alpha or alpha values implicitly initialized to 1, no matter what kind of pixel data you supply.
GL_BGR is not a valid value for this parameter, perhaps it is tricking your implementation into using a full 4-byte internal format? (Or a 2-byte one, as per GL_LUMINANCE_ALPHA) Or do you mean passing GL_BGR to your Texture::Load() function, which should not really be different from passing GL_RGB?
I think this should work, but it assumes the image has an alpha channel. If you try to load an image without an alpha channel you will get an exception or your application might crash. For images without an alpha channel, use GL_RGB instead of GL_RGBA for the format parameter (the one right before GL_UNSIGNED_BYTE).
void Texture::Load(unsigned char* data) {
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, _texID);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, tw, th, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
    glDisable(GL_TEXTURE_2D);
}