I'm making a small "retro-style" 2D platformer game with SDL in C++. I figured that the best way to keep the game at a low resolution, while allowing people with different size monitors to stretch the game window to fit their setup, would be to render everything to a low-res texture and then render that texture to the whole window (with the window size/resolution set by the user).
When I run this setup, the game works exactly as it should and renders fine (in both fullscreen and windowed modes). However, when I use SDL_DestroyTexture() to free my low-res render target texture, the console spits out "ERROR: Invalid texture". I have confirmed that this is where the error occurs using a debugger. Below is the relevant code which creates, uses, and destroys the texture. Why is the texture suddenly invalid when I can use it normally otherwise?
// SDL is initialized
// "proxy" is the texture used for render-to-texture
// it is set to the "logical" low resolution (lxres, lyres) (usually 320x240)
// renderer is an SDL_Renderer* that initializes with no problems
SDL_Texture* proxy = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGBA8888,
SDL_TEXTUREACCESS_TARGET, lxres, lyres);
SDL_SetRenderDrawColor(renderer, 0, 0, 0, SDL_ALPHA_OPAQUE);
// game runs fine
while (!quit) {
SDL_SetRenderTarget(renderer, proxy);
render();
SDL_SetRenderTarget(renderer, nullptr);
// stretch the low resolution texture onto the at-least-as-high resolution
// renderer (usually 640x480)
SDL_RenderCopy(renderer, proxy, nullptr, nullptr);
SDL_RenderPresent(renderer);
SDL_RenderClear(renderer);
updateLogic();
}
// Time to quit
SDL_SetRenderTarget(renderer, nullptr);
if (proxy != nullptr)
SDL_DestroyTexture(proxy); // "ERROR: Invalid texture"
// Clean up other resources
// close SDL
This type of error has happened to me when I destroyed the renderer before destroying a texture attached to it.
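For reference, a minimal teardown sketch (assuming a window variable that isn't shown in your snippet): destroy the texture while the renderer is still alive, then the renderer, then the window.
// Sketch of a safe teardown order; "window" is assumed from the setup code
SDL_SetRenderTarget(renderer, nullptr);  // make sure proxy is no longer the render target
SDL_DestroyTexture(proxy);               // destroy textures while the renderer still exists
proxy = nullptr;
SDL_DestroyRenderer(renderer);           // then the renderer...
SDL_DestroyWindow(window);               // ...then the window
SDL_Quit();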
I've run into a problem when trying to declare an identifier. The main part is textureBackground.loadFromFile("graphics/background.png");
where textureBackground is the identifier being underlined.
I've tried adding parentheses, changing upper and lower case, checking file locations, etc.
int main()
{
//Create a video mode object
VideoMode vm(1920, 1080);
// Create and open a window for game
RenderWindow window(vm, "Scarful!!!", Style::Fullscreen);
while (window.isOpen())
// Texture for graphic on cpu
Texture textureBackground;
// Load graphic into texture
textureBackground.loadFromFile("graphics/background.png");
// Make Sprite
Sprite spriteBackground;
// Attach texture to sprite
spriteBackground.setTexture(textureBackground);
// Set spritebackground to cover screen
spriteBackground.setPosition(0, 0);
{
/* Handle player input */
if (Keyboard::isKeyPressed(Keyboard::Escape))
{
window.close();
}
//Update Scene
//Draw Scene
window.clear();
//Draw Game Scene
window.draw(spriteBackground);
//Show everything we drew
window.display();
}
return 0;
}
Here,
while (window.isOpen())
// Texture for graphic on cpu
Texture textureBackground;
// Load graphic into texture
textureBackground.loadFromFile("graphics/background.png");
You are trying to do this:
while (window.isOpen()) {
// Variable goes out of scope outside of the loop...
Texture textureBackground;
}
textureBackground.loadFromFile("graphics/background.png");
// ^^^^^^^^^^^^^^^^^ is not available anymore...
And since textureBackground is out of scope, you cannot use it anymore... I suspect you wanted this:
// Texture for graphic on cpu
Texture textureBackground;
// Load graphic into texture
textureBackground.loadFromFile("graphics/background.png");
// Make Sprite
Sprite spriteBackground;
// Attach texture to sprite
spriteBackground.setTexture(textureBackground);
// Set spritebackground to cover screen
spriteBackground.setPosition(0, 0);
while (window.isOpen()) {
// Other code goes here...
}
I've been developing a 2D Engine using SFML + ImGui.
The editor is rendered using ImGui, and the scene window is an sf::RenderTexture where I draw the GameObjects; it is then converted to an ImGui::Image to render it in the editor.
Now I need to create a 3D Engine this year for my Bachelor's Degree, but using SDL2 + ImGui, and I want to recreate what I did with the 2D Engine.
I've managed to render the editor like I did in the 2D Engine using this Example that comes with ImGui.
But I don't know how to create an equivalent of sf::RenderTexture in SDL2, so I can draw the 3D scene there and convert it to an ImGui::Image to show it in the editor.
If you can provide code, that would be even better, and if you want me to provide any specific code, just tell me.
Here's my solution.
SDL2window::SDL2window()
{
SDL_Init(SDL_INIT_VIDEO);
window = SDL_CreateWindow(...);
renderer = SDL_CreateRenderer(window,...);
texture = SDL_CreateTexture(renderer,..., width, height);
// create another texture for your imgui window here
ImTexture = SDL_CreateTexture(renderer,..., tex_width, tex_height);
...
}
void SDL2window::update(Uint32 *framebuffer)
{
...
// update your main window if necessary
SDL_RenderCopyEx(renderer, texture, NULL, NULL, 0, 0, SDL_FLIP_VERTICAL);
// be careful about the size of the framebuffer
SDL_UpdateTexture(ImTexture, NULL, framebuffer, tex_width * sizeof(Uint32));
ImGui::Render();
ImGui_ImplSDLRenderer_RenderDrawData(ImGui::GetDrawData());
SDL_RenderPresent(renderer);
}
void SDL2window::MyImgui()
{
...
ImGui::Begin("My Texture");
ImGui::Image(ImTexture, ImVec2(tex_width, tex_height));
ImGui::End();
}
Then call MyImgui() in your main loop and it will work.
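For reference, a rough sketch of the per-frame ordering, assuming the SDL_Renderer backend from the example you linked (the backend function names differ slightly between ImGui versions, e.g. ImGui_ImplSDLRenderer2_* in newer releases); running and framebuffer are placeholder names:
// One frame, in order; "win" is the SDL2window instance from above
while (running) {
    SDL_Event e;
    while (SDL_PollEvent(&e))
        ImGui_ImplSDL2_ProcessEvent(&e);  // let ImGui see input events

    ImGui_ImplSDLRenderer_NewFrame();
    ImGui_ImplSDL2_NewFrame();
    ImGui::NewFrame();

    win.MyImgui();            // build the "My Texture" window
    win.update(framebuffer);  // updates ImTexture, calls ImGui::Render(), presents
}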
(Screenshot: the ImTexture window rendered in the editor.)
P.S. I like your SFML 2D engine's UI, it looks great.
I think you're looking for something like this:
// Create a render texture
SDL_Texture *target = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGBA8888, SDL_TEXTUREACCESS_TARGET, width, height);
// Activate the render texture
SDL_SetRenderTarget(renderer, target);
// (Do your rendering here)
// Disable the render texture
SDL_SetRenderTarget(renderer, NULL);
// (Use the render texture)
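A minimal sketch of how this could feed into ImGui, assuming the SDL_Renderer backend and that width/height match the texture created above (ImTextureID is void* by default, so an SDL_Texture* can be passed with a cast):
// Draw the 3D scene into the render texture, then show it in an ImGui window
SDL_SetRenderTarget(renderer, target);
SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
SDL_RenderClear(renderer);
// ... draw your scene here ...
SDL_SetRenderTarget(renderer, NULL);

ImGui::Begin("Scene");
ImGui::Image((ImTextureID)target, ImVec2((float)width, (float)height));
ImGui::End();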
I'm writing a basic 2D game in C++ with SDL, and just yesterday I converted from 1.2 to 2.0.
I wanted to create a moving camera, and I realized that I should switch from using SDL_Surfaces to SDL_Textures to use some of 2.0's new features. My previous setup involved blitting a bunch of sprite surfaces to a main surface called buffer, and then flipping that to the screen, and it worked well.
So, I created a renderer and assigned it to the window. Then I used SDL_CreateTextureFromSurface to create a texture from my main SDL_Surface. My surface is currently set to nullptr at this point, and I realize that I should probably initialize it to something, but I can't figure out what. Here is the relevant code.
void Game::Init() {
// Initialize SDL.
SDL_Init(SDL_INIT_EVERYTHING);
window = nullptr;
buffer = nullptr;
camera = { player.GetLocation().x - 50, player.GetLocation().y - 50, player.GetLocation().x + 50, player.GetLocation().y + 50 };
fullscreen = false;
screenWidth = 640, screenHeight = 480;
if (fullscreen) {
window = SDL_CreateWindow("Rune", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, screenWidth, screenHeight, SDL_WINDOW_FULLSCREEN_DESKTOP);
renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_SOFTWARE);
}
else {
// Initialize our window.
window = SDL_CreateWindow("Rune", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, screenWidth, screenHeight, SDL_WINDOW_SHOWN);
renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_SOFTWARE);
// Assign the buffer surface to the window.
}
texture = SDL_CreateTextureFromSurface(renderer, buffer);
SDL_SetRenderTarget(renderer, texture);
// Run the game loop.
Run();
}
In Run, there's normal working code that involves a game loop, but in the game loop I have a method, Render(), which is causing problems. Here it is.
void Game::Render() {
// Clear the buffer.
SDL_FillRect(buffer, NULL, 0x000000);
// Draw everything to the buffer.
level.Draw(buffer);
if (enemy.IsAlive()) enemy.Draw(buffer);
player.Draw(buffer);
SDL_UpdateTexture(texture, NULL, buffer->pixels, buffer->pitch);
SDL_RenderClear(renderer);
SDL_RenderCopy(renderer, texture, NULL, NULL);
SDL_RenderPresent(renderer);
// Draw buffer to the screen.
//SDL_UpdateWindowSurface(window);
}
Running this code gives me the error that buffer is nullptr, which comes up on the SDL_UpdateTexture line. If I allocate memory for buffer in the Init() function, I get a memory access error. Can anyone offer a solution? Thank you.
After changing all my SDL_Surfaces to textures, I realized that in my header file I had been calling constructors that relied on the renderer having already been initialized, which it wasn't. I changed them to default constructors and then initialized the objects in the Game.cpp file, after the renderer, and now everything works. Thanks Benjamin for your help!
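For anyone hitting the same thing, here is a rough sketch of the pattern (the class and asset names are just illustrative, not my actual code): give the member a default constructor and load its texture only after the renderer exists.
// Illustrative only: defer all SDL calls until Init() is called
class Player {
public:
    Player() = default;                    // safe to construct before the renderer exists
    bool Init(SDL_Renderer* renderer) {    // call after SDL_CreateRenderer()
        SDL_Surface* surf = SDL_LoadBMP("player.bmp");  // placeholder asset path
        if (!surf) return false;
        texture = SDL_CreateTextureFromSurface(renderer, surf);
        SDL_FreeSurface(surf);
        return texture != nullptr;
    }
private:
    SDL_Texture* texture = nullptr;
};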
I want to read the pixels from the back buffer, but all I get so far is a black screen (the clear color).
The thing is, I don't need a GLUT window to draw to. Once I have the pixel information, I pass it to another program which draws the image for me.
My init function looks like this:
// No main function, so no real argv argc
char fakeParam[] = "nothing";
char *fakeargv[] = { fakeParam, NULL };
int fakeargc = 1;
glutInit( &fakeargc, fakeargv );
GLenum err = glewInit();
if (GLEW_OK != err)
{
MessageBoxA(NULL, "Failed to initialize OpenGL", "ERROR", NULL);
}
else
{
glEnable(GL_TEXTURE_2D);
glEnable(GL_DEPTH_TEST);
// Not sure if this call is needed since i don't use a glut window to render too..
glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
}
Then in my render function i do:
void DisplayFunc(void)
{
/* Clear the buffer, clear the matrix */
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();
// TEAPOT
glTranslatef(0.0f, 0.0f, -5.0f); // Translate back 5 units
glRotatef(rotation_degree, 1.0f, 1.0f, 0.0f); // Rotate according to our rotation_degree value
glFrontFace(GL_CW);
glutSolidTeapot(1.0f); // Render a teapot
glFrontFace(GL_CCW);
glReadBuffer(GL_BACK);
glReadPixels(0, 0, (GLsizei)1024, (GLsizei)768, GL_RGB, GL_UNSIGNED_BYTE, pixels);
int r = glGetError();
}
This is basically all I do, and the end of the last function is where I'm trying to read back all the pixels. But the output is just a black image, and glGetError() doesn't report any errors.
Does anyone have any idea what the problem could be?
I want to read the pixels from the back buffer, but all I get so far is a black screen (the clear color).
The thing is, I don't need a GLUT window to draw to. Once I have the pixel information, I pass it to another program which draws the image for me.
It doesn't work like that. The back buffer is not some kind of off-screen rendering area; it's part of the on-screen window. Actually, the whole double-buffer concept only makes sense for on-screen windows. Each pixel of a double-buffered window has two color values, but only one depth, stencil, etc.; on a buffer swap, just the pointers to the back and front pixel planes are exchanged. But because we're still talking about a window, every fragment goes through the pixel ownership test during rasterization, i.e. it is checked whether it is actually visible on screen. If it isn't, it is not rendered.
But your problems go further: you don't even create a window, so you don't have an OpenGL context at all. Your OpenGL calls have no effect whatsoever, and glReadPixels doesn't return anything because there is nothing to read from.
The bad news is that the only way to get a context with GLUT is by creating a window. The good news is that you don't have to use GLUT. People, why don't you get this: GLUT is not part of OpenGL; it's a quick and dirty framework for writing small tutorials, nothing more.
What you want is either:
Not a window, but a PBuffer, i.e. an off-screen drawable that doesn't go through the pixel ownership test,
or
A hidden window with an OpenGL context created on it, and in that context a framebuffer object (FBO) as an off-screen render target (a minimal sketch of this follows below).
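A minimal sketch of the FBO route, assuming you already have a context (e.g. from a hidden window) and that width, height and pixels are defined as in your code:
// Create a color texture and depth renderbuffer to render into
GLuint fbo, colorTex, depthRb;
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glGenRenderbuffers(1, &depthRb);
glBindRenderbuffer(GL_RENDERBUFFER, depthRb);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, colorTex, 0);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRb);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
    ; // handle the error

// Render into the FBO instead of the window's back buffer
glViewport(0, 0, width, height);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
// ... draw the scene ...

// Read the pixels back from the FBO's color attachment
glReadBuffer(GL_COLOR_ATTACHMENT0);
glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, pixels);
glBindFramebuffer(GL_FRAMEBUFFER, 0);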
Try calling glFlush before glReadPixels.
Also, where do you set the size of your window?
I'm trying to show some textures in my program, and I have this code that's used to load bitmaps into OpenGL textures:
void LoadGLTextures()
{
// Bitmap handle and structure
HBITMAP hBMP;
BITMAP BMP;
// Generate list of textures from resources
byte Texture[] = {IDB_FONT, IDB_SKIN, IDB_PIANO};
glGenTextures(sizeof(Texture), &texture[0]);
// Iterate through texture list and load bitmaps
for (int loop=0; loop<sizeof(Texture); loop++)
{
hBMP = (HBITMAP)LoadImage(GetModuleHandle(NULL), MAKEINTRESOURCE(Texture[loop]),
IMAGE_BITMAP, 0, 0, LR_CREATEDIBSECTION);
if (hBMP)
{
GetObject(hBMP,sizeof(BMP), &BMP);
glPixelStorei(GL_UNPACK_ALIGNMENT,4);
glBindTexture(GL_TEXTURE_2D, texture[loop]);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR_MIPMAP_LINEAR);
// Generate Mipmapped Texture (3 Bytes, Width, Height And Data From The BMP)
gluBuild2DMipmaps(GL_TEXTURE_2D, 3, BMP.bmWidth, BMP.bmHeight, GL_BGR_EXT, GL_UNSIGNED_BYTE, BMP.bmBits);
DeleteObject(hBMP);
}
}
}
And while my background skin loads and gets drawn correctly, the other (piano) texture doesn't get drawn. I'm sure the drawing code is correct, because when I swap which texture is used (from the piano to the background texture, in this case), the other texture gets drawn. So I think the bitmap isn't being loaded correctly, but I'm not sure why. Is there something glaringly obvious I have overlooked?
The bitmap is 128*256 with 24-bit colour.
If you need any of the other code please let me know.
Edit: if anyone knows of any libraries that would do what I require, please let me know.
It might not be working right because it's deprecated.
From http://www.opengl.org/wiki/Common_Mistakes:
gluBuild2DMipmaps - Never use this. Use either GL_GENERATE_MIPMAP (requires GL 1.4) or the glGenerateMipmap function (requires GL 3.0).
Edit: Also, you probably need to call glEnable(GL_TEXTURE_2D) for EACH texture unit, i.e. inside the loop where you call glBindTexture.
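A rough sketch of the glGenerateMipmap route, reusing the BMP variable from your loop (glGenerateMipmap needs GL 3.0 or ARB_framebuffer_object, typically exposed through an extension loader such as GLEW):
// Upload the DIB section and let the driver build the mipmap chain
glBindTexture(GL_TEXTURE_2D, texture[loop]);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, BMP.bmWidth, BMP.bmHeight, 0,
             GL_BGR_EXT, GL_UNSIGNED_BYTE, BMP.bmBits);
glGenerateMipmap(GL_TEXTURE_2D);   // replaces gluBuild2DMipmaps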