SDL2: draw scene to texture (SDL2 RenderTexture like SFML) - C++

I've been developing a 2D engine using SFML + ImGui. The editor is rendered using ImGui, and the scene window is an sf::RenderTexture where I draw the GameObjects; it is then converted to an ImGui::Image to render it in the editor.
Now I need to create a 3D engine this year for my Bachelor's degree, but using SDL2 + ImGui, and I want to recreate what I did with the 2D engine.
I've managed to render the editor like I did in the 2D engine using the example that comes with ImGui.
But I don't know how to create an equivalent of sf::RenderTexture in SDL2, so I can draw the 3D scene there and convert it to an ImGui::Image to show it in the editor.
If you can provide code, that would be better. And if you want me to provide any specific code, tell me.

Here's my solution.
SDL2window::SDL2window()
{
    SDL_Init(SDL_INIT_VIDEO);
    window = SDL_CreateWindow(...);
    renderer = SDL_CreateRenderer(window, ...);
    texture = SDL_CreateTexture(renderer, ..., width, height);
    // Create another texture for your ImGui window here.
    ImTexture = SDL_CreateTexture(renderer, ..., tex_width, tex_height);
    ...
}
void SDL2window::update(Uint32 *framebuffer)
{
    ...
    // Update your main window if necessary.
    SDL_RenderCopyEx(renderer, texture, NULL, NULL, 0, NULL, SDL_FLIP_VERTICAL);
    // Be careful with the size of framebuffer: it must hold
    // tex_width * tex_height Uint32 pixels.
    SDL_UpdateTexture(ImTexture, NULL, framebuffer, tex_width * sizeof(Uint32));
    ImGui::Render();
    ImGui_ImplSDLRenderer_RenderDrawData(ImGui::GetDrawData());
    SDL_RenderPresent(renderer);
}
void SDL2window::MyImgui()
{
    ...
    ImGui::Begin("My Texture");
    // The SDL_Renderer backend takes an SDL_Texture* cast to ImTextureID.
    ImGui::Image((ImTextureID)ImTexture, ImVec2(tex_width, tex_height));
    ImGui::End();
}
Then run MyImgui() in your main loop and it will work.
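For context, a minimal main-loop sketch under the same assumptions (the new-frame calls match the SDL_Renderer backend used above; running and framebuffer are hypothetical names):
while (running) {
    // (Poll SDL events here and forward them with ImGui_ImplSDL2_ProcessEvent.)
    ImGui_ImplSDLRenderer_NewFrame();
    ImGui_ImplSDL2_NewFrame();
    ImGui::NewFrame();
    MyImgui();           // build the editor windows
    update(framebuffer); // renders ImGui and presents (see above)
}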
(Screenshot: the ImTexture window.)
P.S. I like your SFML 2D engine's UI, it looks great.

I think you're looking for something like this:
// Create a render texture
SDL_Texture *target = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGBA8888, SDL_TEXTUREACCESS_TARGET, width, height);
// Activate the render texture
SDL_SetRenderTarget(renderer, target);
// (Do your rendering here)
// Disable the render texture
SDL_SetRenderTarget(renderer, NULL);
// (Use the render texture)
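To connect this back to the editor, here is a minimal per-frame sketch, assuming the ImGui SDL_Renderer backend from the answer above (tex_width and tex_height are hypothetical names):
// Draw the scene into the render texture...
SDL_SetRenderTarget(renderer, target);
SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
SDL_RenderClear(renderer);
// ... render your GameObjects here ...
SDL_SetRenderTarget(renderer, NULL);
// ...then show it in the editor; the backend accepts an SDL_Texture*
// cast to ImTextureID.
ImGui::Begin("Scene");
ImGui::Image((ImTextureID)target, ImVec2((float)tex_width, (float)tex_height));
ImGui::End();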

Related

Trouble converting from SDL 1.2 to 2.0; don't know how to initialize a surface properly

I'm writing a basic 2D game in C++ with SDL, and just yesterday I converted from 1.2 to 2.0.
I wanted to create a moving camera, and I realized that I should switch from SDL_Surfaces to SDL_Textures to use some of 2.0's new features. My previous setup involved blitting a bunch of sprite surfaces to a main surface called buffer and then flipping that to the screen, and it worked well.
So I created a renderer and assigned it to the window. Then I used SDL_CreateTextureFromSurface to create a texture from my main SDL_Surface. My surface is set to nullptr at this point, and I realize that I should probably initialize it to something, but I can't figure out what. Here is the relevant code.
void Game::Init() {
    // Initialize SDL.
    SDL_Init(SDL_INIT_EVERYTHING);
    window = nullptr;
    buffer = nullptr;
    camera = { player.GetLocation().x - 50, player.GetLocation().y - 50,
               player.GetLocation().x + 50, player.GetLocation().y + 50 };
    fullscreen = false;
    screenWidth = 640, screenHeight = 480;
    if (fullscreen) {
        window = SDL_CreateWindow("Rune", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                  screenWidth, screenHeight, SDL_WINDOW_FULLSCREEN_DESKTOP);
        renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_SOFTWARE);
    }
    else {
        // Initialize our window.
        window = SDL_CreateWindow("Rune", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                  screenWidth, screenHeight, SDL_WINDOW_SHOWN);
        renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_SOFTWARE);
        // Assign the buffer surface to the window.
    }
    texture = SDL_CreateTextureFromSurface(renderer, buffer);
    SDL_SetRenderTarget(renderer, texture);
    // Run the game loop.
    Run();
}
In Run, there's normal working code that involves a game loop, but in the game loop I have a method, Render(), which is causing problems. Here it is.
void Game::Render() {
    // Clear the buffer.
    SDL_FillRect(buffer, NULL, 0x000000);
    // Draw everything to the buffer.
    level.Draw(buffer);
    if (enemy.IsAlive()) enemy.Draw(buffer);
    player.Draw(buffer);
    SDL_UpdateTexture(texture, NULL, buffer->pixels, buffer->pitch);
    SDL_RenderClear(renderer);
    SDL_RenderCopy(renderer, texture, NULL, NULL);
    SDL_RenderPresent(renderer);
    // Draw buffer to the screen.
    //SDL_UpdateWindowSurface(window);
}
Running this code gives me the error that buffer is nullptr, which comes up on the SDL_UpdateTexture line. If I allocate memory for buffer in the Init() function, I get a memory access error. Can anyone offer a solution? Thank you.
After changing all my SDL_Surfaces to SDL_Textures, I realized that in my header file I had been calling constructors that relied on the renderer already being initialized, which it wasn't. I changed them to default constructors and then initialized the objects in Game.cpp, after the renderer, and now everything works. Thanks, Benjamin, for your help!
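In other words, the creation order matters. A minimal sketch of the ordering that resolves it (Player and Level taking the renderer in their constructors is an assumption for illustration):
SDL_Init(SDL_INIT_EVERYTHING);
window = SDL_CreateWindow("Rune", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                          screenWidth, screenHeight, SDL_WINDOW_SHOWN);
renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_SOFTWARE);
// Only now construct objects that create textures through the renderer.
player = Player(renderer);
level = Level(renderer);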

Set the pixel format AND create texture from surface in SDL

I am currently trying some things using pixel manipulation, and I would like to set the pixel format of an SDL_Surface. The problem here is that I am loading an image with SDL_image, so I have to create a texture from a surface like this:
surface = IMG_Load(filePath);
texture = SDL_CreateTextureFromSurface(renderer, surface);
So I can't use the following function, which I would like to use. Or can I?
texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_ARGB8888, SDL_TEXTUREACCESS_STATIC, 640, 480);
The thing is, that I want to set the SDL_PixelFormat to be able to mess around with the pixels. How can I both do this and create the texture based on a surface?
The SDL2 API provides the function SDL_ConvertSurface:
SDL_Surface* SDL_ConvertSurface(SDL_Surface*           src,
                                const SDL_PixelFormat* fmt,
                                Uint32                 flags);
You should be able to do
surface = IMG_Load(filePath);
// Convert the surface to the pixel format you want here (see the sketch below).
texture = SDL_CreateTextureFromSurface(renderer, surface);
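For example, a sketch of the conversion step (converted is a hypothetical name; SDL_ConvertSurfaceFormat is a convenience alternative that avoids allocating the format yourself):
SDL_PixelFormat* fmt = SDL_AllocFormat(SDL_PIXELFORMAT_ARGB8888);
SDL_Surface* converted = SDL_ConvertSurface(surface, fmt, 0);
SDL_FreeFormat(fmt);
SDL_FreeSurface(surface); // the original surface is no longer needed
texture = SDL_CreateTextureFromSurface(renderer, converted);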
Reference: https://wiki.libsdl.org/SDL_ConvertSurface

SDL 2.0 sprite draws skewed (transformed) after OpenGL draws

I'm using SDL to create a window and draw OpenGL in it, and after drawing OpenGL I use SDL to show sprites (UI). It worked for me on Windows, OSX and NDK, but it doesn't work for me on iOS. This is how I draw the sprite:
I create the window:
int flags = SDL_WINDOW_OPENGL | SDL_WINDOW_SHOWN;
gWindow = SDL_CreateWindow("example", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 400, 800, flags);
I create the renderer:
gRenderer = SDL_CreateRenderer(gWindow, id, SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);
I load the texture:
SDL_Texture* newTexture = NULL;
SDL_Surface* loadedSurface = SDL_LoadBMP(path.c_str());
newTexture = SDL_CreateTextureFromSurface(gRenderer, loadedSurface);
SDL_FreeSurface(loadedSurface);
That's where I do the OpenGL drawing: I load .3ds models, load textures, use blending, etc.
And then I draw the sprite:
dstRect.x = 0.0f;
dstRect.y = 0.0f;
dstRect.w = 128.0f;
dstRect.h = 64.0f;
SDL_RenderCopy(gRenderer, newTexture, NULL , &dstRect);
SDL_RenderPresent(gRenderer);
The result is strange: the sprite is drawn skewed instead of in a rectangle.
Result: http://vvcap.net/db/zHhZwoZa1ng7caeP1BG3.png
What could be the reason of the sprite being transformed like that ? How can I fix this ?
Has anybody had a similar problem ?
It almost looks to me as if the camera transform is rotated slightly around the y-axis and the perspective projection is making it look skewed.
If the sprite is meant to be drawn in screen space, make sure that you have an orthographic projection enabled before the draw, with the width and height being the size of the physical screen (i.e. 480x800), for example:
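A minimal sketch of that setup (legacy fixed-function GL; on OpenGL ES 1 the call is glOrthof, and the 480x800 size is this answer's assumption):
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0, 480.0, 800.0, 0.0, -1.0, 1.0); // top-left origin, like SDL
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
// ... draw the screen-space sprite here ...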
I didn't find a solution, but I just used SDL_RenderCopyEx instead of SDL_RenderCopy and it worked:
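Presumably the equivalent call, using the names from the question (the extra arguments just request no rotation and no flip):
SDL_RenderCopyEx(gRenderer, newTexture, NULL, &dstRect, 0.0, NULL, SDL_FLIP_NONE);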

SDL - invalid texture error on SDL_DestroyTexture()

I'm making a small "retro-style" 2D platformer game with SDL in C++. I figured that the best way to keep the game at a low resolution, while allowing people with different size monitors to stretch the game window to fit their setup, would be to render everything to a low-res texture and then render that texture to the whole window (with the window size/resolution set by the user).
When I run this setup, the game works exactly as it should and renders fine (in both fullscreen and windowed modes). However, when I use SDL_DestroyTexture() to free my low-res render target texture, the console spits out "ERROR: Invalid texture". I have confirmed that this is where the error occurs using a debugger. Below is the relevant code which creates, uses, and destroys the texture. Why is the texture suddenly invalid when I can use it normally otherwise?
// SDL is initialized
// "proxy" is the texture used for render-to-texture
// it is set to the "logical" low resolution (lxres, lyres) (usually 320x240)
// renderer is an SDL_Renderer* that initializes with no problems
SDL_Texture* proxy = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGBA8888,
SDL_TEXTUREACCESS_TARGET, lxres, lyres);
SDL_SetRenderDrawColor(renderer, 0, 0, 0, SDL_ALPHA_OPAQUE);
// game runs fine
while (!quit) {
    SDL_SetRenderTarget(renderer, proxy);
    render();
    SDL_SetRenderTarget(renderer, nullptr);
    // Stretch the low-resolution texture onto the at-least-as-high-resolution
    // renderer (usually 640x480).
    SDL_RenderCopy(renderer, proxy, nullptr, nullptr);
    SDL_RenderPresent(renderer);
    SDL_RenderClear(renderer);
    updateLogic();
}
// Time to quit
SDL_SetRenderTarget(renderer, nullptr);
if (proxy != nullptr)
    SDL_DestroyTexture(proxy); // "ERROR: Invalid texture"
// Clean up other resources
// close SDL
This type of error has happened to me when I destroyed the renderer before destroying the texture that was attached to it.
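So the fix is likely just teardown order: destroy resources in the reverse order of creation. A sketch, using the names from the question:
// Texture first, then renderer, then window.
SDL_DestroyTexture(proxy);
SDL_DestroyRenderer(renderer);
SDL_DestroyWindow(window);
SDL_Quit();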

How to drag an image with OpenGL (MFC)

I am using OpenGL to drag an image (a loaded bitmap) and am wondering if there are methods/functions to transform the image on the screen.
So far I have this code to load an image:
void CDisplayControlPanelView::OnDraw(CDC* /*pDC*/)
{
    CDisplayControlPanelDoc* pDoc = GetDocument();
    ASSERT_VALID(pDoc);
    if (!pDoc)
        return;
    wglMakeCurrent(m_hDC, m_hRC);
    RenderScene();
    SwapBuffers(m_hDC);
    wglMakeCurrent(m_hDC, NULL);
}
void CDisplayControlPanelView::RenderScene()
{
    AUX_RGBImageRec* pRGBImage;
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glClearColor(0.0, 0.0, 0.0, 0.0);
    glClear(GL_COLOR_BUFFER_BIT);
    pRGBImage = auxDIBImageLoadA("D:\\map.bmp");
    glDrawPixels(pRGBImage->sizeX, pRGBImage->sizeY, GL_RGB, GL_UNSIGNED_BYTE, pRGBImage->data);
    glFlush();
}
Use glTranslate. There are many other ways, but this is the simplest. Check out some tutorials if you are new to OpenGL; they could help.
The first thing you must understand is that OpenGL is not a scene graph. It's a drawing API, very much like Windows GDI, and the function glDrawPixels is not very unlike a BitBlt from a MemDC.
Anyway: you shouldn't use glDrawPixels. It's slow and deprecated. The way to draw images in OpenGL is to upload the image into a texture and draw a textured quad, which you can then move around freely, for example:
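A minimal sketch of the textured-quad approach (legacy fixed-function GL; tex, dragX, dragY, imgW and imgH are hypothetical names):
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, tex);
glPushMatrix();
glTranslatef(dragX, dragY, 0.0f); // move the quad to follow the mouse drag
glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2f(0.0f, 0.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex2f(imgW, 0.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex2f(imgW, imgH);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(0.0f, imgH);
glEnd();
glPopMatrix();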