SDL2 does not draw image using texture - c++

I have been attempting to get a simple SDL2 program working that displays an image and then exits. I have this code:
/* compile with `gcc -lSDL2 -o main main.c` */
#include <SDL2/SDL.h>
#include <SDL2/SDL_video.h>
#include <SDL2/SDL_render.h>
#include <SDL2/SDL_surface.h>
#include <SDL2/SDL_timer.h>
int main(void){
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window * w = SDL_CreateWindow("Hi", 0, 0, 640, 480, 0);
    SDL_Renderer * r = SDL_CreateRenderer(w, -1, 0);
    SDL_Surface * s = SDL_LoadBMP("../test.bmp");
    SDL_Texture * t = SDL_CreateTextureFromSurface(r, s);
    SDL_FreeSurface(s);
    SDL_RenderClear(r);
    SDL_RenderCopy(r, t, 0, 0);
    SDL_RenderPresent(r);
    SDL_Delay(2000);
    SDL_DestroyTexture(t);
    SDL_DestroyRenderer(r);
    SDL_DestroyWindow(w);
    SDL_Quit();
}
I am aware that I have omitted the normal checks that each function succeeds - they all do succeed; they were removed for ease of reading. I am also aware that I have used 0 rather than null pointers or the correct enum values; this is not the cause of the issue either (the same issue occurs when I structure the program correctly - this was a quick test case drawn up to exercise the simplest possible case).
The intention is that a window appears and shows the image (which is definitely at that path), waits for a couple of seconds and exits. The result, however, is that the window appears correctly but is filled with black.
An extra note: SDL_ShowSimpleMessageBox() appears to work correctly. I don't know how that relates to the rest of the framework, though.

To clear this up: since you want a texture, create it directly - that is simpler and gives you better control over the image. Try the code below, which is fully tested and working. All you wanted was for the window to show the image and close within 2 seconds, right? If the image doesn't load, the problem is the image's location.
/* compile with e.g. `g++ -o main main.cpp $(sdl2-config --cflags --libs) -lSDL2_image` */
#include <SDL.h>
#include <SDL_image.h>
#include <iostream> // included because std::cout is used below

int main(int argc, char* argv[]){
    Uint32 time;
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window * w = SDL_CreateWindow("Hi", 0, 0, 640, 480, SDL_WINDOW_SHOWN);
    SDL_Renderer * r = SDL_CreateRenderer(w, -1, SDL_RENDERER_ACCELERATED);
    SDL_Texture * s = IMG_LoadTexture(r, "../test.bmp"); // loads a texture directly from the image file
    if (s == NULL)
    {
        // if this prints to the console, IMG_LoadTexture could not find (or decode) the image
        std::cout << "FAILED TO FIND THE IMAGE" << std::endl;
    }
    SDL_Rect s_rect; // destination rectangle for the texture
    s_rect.x = 100;  // x and y are offsets from the window's origin (its top-left corner)
    s_rect.y = 100;
    s_rect.w = 640;  // width to draw the texture at
    s_rect.h = 480;  // height to draw the texture at
    time = SDL_GetTicks(); // current time in milliseconds
    while (SDL_GetTicks() < time + 2000) // keep rendering until 2 seconds have passed
    {
        SDL_RenderClear(r);
        SDL_RenderCopy(r, s, NULL, &s_rect); // the NULL source rect copies the whole texture (pass a rect here to crop)
        SDL_RenderPresent(r);
    }
    SDL_DestroyTexture(s);
    SDL_DestroyRenderer(r);
    SDL_DestroyWindow(w);
    SDL_Quit();
    return 0; // closes the program
}
If you want the window's close button to actually take effect, create an SDL_Event, poll it inside the while loop, and check whether its type equals SDL_QUIT. I didn't include that since you wanted the window to close by itself within 2 seconds, which makes the close button unnecessary.
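For example, a rough sketch of that check merged into the loop above (untested, reusing the same r, s, s_rect and time variables) could look like this:
bool running = true;
while (running && SDL_GetTicks() < time + 2000)
{
    SDL_Event e;
    while (SDL_PollEvent(&e)) // drain any pending window events
    {
        if (e.type == SDL_QUIT) // the close button was pressed
            running = false;
    }
    SDL_RenderClear(r);
    SDL_RenderCopy(r, s, NULL, &s_rect);
    SDL_RenderPresent(r);
}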
HAPPY CODING :D

When using SDL_RENDERER_SOFTWARE for the renderer flags, this worked. It also worked on a different machine. I guess there must be something screwed up with my configuration, although I'm not sure what it is, because I'm still not getting any errors reported. Ah well, marking as solved.

I believe this to be (not 100% sure, but fairly sure), due to this line of code:
SDL_Renderer * r = SDL_CreateRenderer(w, -1, 0);
According to the SDL wiki article SDL_CreateRenderer, the parameters required for the arguments that you are passing in are as follows:
SDL_Window* window
int index
Uint32 flags
You are passing in the window pointer correctly, as well as the index, but the lack of a flag tells SDL to use the default renderer.
Most systems have a default setting for which renderer applications should use, and this can be modified on an application-by-application basis. If no default setting is provided for a specific application, the renderer lookup immediately checks the default renderer settings list. The SDL wiki briefly refers to this list in the following line at the bottom of the Remarks section:
"Note that providing no flags gives priority to available SDL_RENDERER_ACCELERATED renderers."
What the wiki does not explain is that the "renderers" it refers to come from that default renderer list.
This leads me to believe that at some point a setting was changed on your computer, or elsewhere in your Visual Studio settings, with the result that no such list is found. Because of your machine's settings you therefore have to tell SDL explicitly which renderer to use; otherwise, an argument of 0 should work just fine. In the end this doesn't hurt, as it's better to be explicit in your code rather than implicit whenever possible.
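For illustration, a minimal sketch of being explicit about the flag, falling back to the software renderer if an accelerated one cannot be created, could look like this (using the same w window pointer from your example):
SDL_Renderer * r = SDL_CreateRenderer(w, -1, SDL_RENDERER_ACCELERATED);
if (r == NULL) {
    // no accelerated renderer is available on this configuration; fall back to software
    r = SDL_CreateRenderer(w, -1, SDL_RENDERER_SOFTWARE);
}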
(That said, all of my deductions assume that everything you said works does in fact work. If that is not true, the issue could have a vast variety of causes, given the lack of error checking.)

SDL2 access violation error whilst initializing window

We've just started learning how to work with SDL2 and whilst following a tutorial we ran into this access violation error. We're trying to initialize a green window. We're familiar with C and C#, but we haven't got much experience with C++.
Whilst messing around we determined that one of the last four lines seemed to contain the problem. We hope someone can help.
Exception thrown at 0x00007FF801C69BB9 (SDL2.dll) in PWS.exe: 0xC0000005: Access violation writing location 0x00000000000000FF.
#include "SDL.h"
#undef main
int main(int argc, char* argv[])
{
SDL_Init(SDL_INIT_EVERYTHING);
SDL_Window* window = SDL_CreateWindow("title", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 600, 400, SDL_WINDOW_SHOWN);
SDL_Renderer* renderer = SDL_CreateRenderer(window, -1, 0);
SDL_GetRenderDrawColor(renderer, 0, (uint8_t*)255, 0, (uint8_t*)255);
SDL_RenderClear(renderer);
SDL_RenderPresent(renderer);
SDL_Delay(3000);
return 0;
}
The crash you are seeing comes from the following line:
SDL_GetRenderDrawColor(renderer, 0, (uint8_t*)255, 0, (uint8_t*)255);
The fact that you had to cast the color value to a pointer is an immediate clue that something wrong is going on here.
SDL_GetRenderDrawColor() (notice the Get) writes the current value of the render color to the addresses passed as arguments, so you are asking SDL to write the red and blue components of the current color to memory address 0, and the green and alpha components to memory address 255. This results in an Access Violation, since 0 and 255 are not valid memory addresses to write to.
255 in 64-bit hexadecimal is 0x00000000000000FF, which happens to be exactly what the error is telling you the invalid writing location is. This is why I can be so confident that this specific line is the culprit.
N.B. SDL is probably skipping the writes to address 0, since a null pointer meaning "nowhere" is a common convention.
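For comparison, the correct way to call the Get variant (a small sketch, not part of the fix you need) is to pass the addresses of local variables for SDL to write into:
Uint8 red, green, blue, alpha;
SDL_GetRenderDrawColor(renderer, &red, &green, &blue, &alpha); // SDL fills these with the current draw color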
You probably meant to use SDL_SetRenderDrawColor() instead:
SDL_SetRenderDrawColor(renderer, 0, 255, 0, 255);
However, there's at least one other major problem in your program: you are not setting up your main() function the way SDL expects. If you HAVE to circumvent SDL taking over the main function, you should use SDL_SetMainReady().
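Putting these fixes together, a minimal corrected sketch (keeping SDL's expected main() signature instead of #undef main; untested against your project settings) might look like this:
#include "SDL.h"

int main(int argc, char* argv[])
{
    SDL_Init(SDL_INIT_EVERYTHING);
    SDL_Window* window = SDL_CreateWindow("title", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 600, 400, SDL_WINDOW_SHOWN);
    SDL_Renderer* renderer = SDL_CreateRenderer(window, -1, 0);
    SDL_SetRenderDrawColor(renderer, 0, 255, 0, 255); // Set, not Get: a green clear color
    SDL_RenderClear(renderer);
    SDL_RenderPresent(renderer);
    SDL_Delay(3000);
    SDL_DestroyRenderer(renderer);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}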

SDL_Texture renders black after resize unless it is redrawn

I've got a bit of a nasty bug for you folks. (Yes, it's probably my bug and not SDL's.) I have been in the process of writing a modern C++ wrapper for SDL and everything appears to be working as intended. However, my Texture class has a strange bug: if it is redrawn after a resize, it looks fine, but if it is not, it becomes entirely black. Here is what that looks like:
Before the resize
After the resize
I can't exactly post just one part of the code here, so here is the entire folder (hosted on GitLab): SDL wrapper
Here is a small program that reproduces the error using this library:
#include "sdl_wrapper/context.hh"
#include "sdl_wrapper/video/context.hh"
#include "sdl_wrapper/render/renderer.hh"
#include "sdl_wrapper/render/texture.hh"
#include "sdl_wrapper/colors.hh"
#include "SDL2/SDL_events.h"
int main()
{
sdl::Context sdlContext;
sdl::video::Context videoContext = sdlContext.initVideo();
sdl::video::Window window = videoContext.createWindow("test", 0, 0, 800, 600).resizable().build();
int width = 800;
int height = 600;
sdl::render::Renderer renderer = window.createRenderer().targetTexture().build();
sdl::render::Texture example(renderer, SDL_PIXELFORMAT_RGBA8888, SDL_TEXTUREACCESS_TARGET, 400, 300);
renderer.setTarget(example);
renderer.setDrawColor(sdl::colors::Blue);
renderer.clear();
renderer.resetTarget();
bool run = true;
while (run)
{
SDL_Event e;
while (SDL_PollEvent(&e) != 0)
{
if (e.type == SDL_QUIT)
{
run = false;
break;
}
}
renderer.setDrawColor({0x44, 0x44, 0x44, 0xff});
renderer.clear();
renderer.copy(example, std::nullopt, {{width / 4, height / 4, example.getWidth(), example.getHeight()}});
renderer.present();
}
}
To reproduce, simply run this program and resize the window. There will be a blue square that becomes black after the resize.
I would greatly appreciate it if someone could point me in the right direction here. I would really like to avoid redrawing on every resize (feel free to argue with me on that point if I am misguided).
SDL doesn't promise to keep the contents of target textures. There are cases, especially with Direct3D or on mobile, where the data is lost due to some big state change. Changing the window size may not sound like a big change, but on some hardware/driver configurations it causes problems; I suppose that's why SDL detects the resize and drops all renderer data. You get an SDL_RENDER_TARGETS_RESET event when you need to update your render target textures.
That shouldn't happen with, e.g., the OpenGL renderer implementation (which may sound great, but the reasons behind it are not so great); on Windows, SDL2 defaults to Direct3D, which can be changed by calling SDL_SetHint or setting environment variables.
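A rough sketch of handling that event in your inner loop, using the same wrapper calls from your example, might look like this:
while (SDL_PollEvent(&e) != 0)
{
    if (e.type == SDL_QUIT)
    {
        run = false;
        break;
    }
    else if (e.type == SDL_RENDER_TARGETS_RESET)
    {
        // the contents of all render target textures were lost; redraw them
        renderer.setTarget(example);
        renderer.setDrawColor(sdl::colors::Blue);
        renderer.clear();
        renderer.resetTarget();
    }
}
Alternatively, calling SDL_SetHint(SDL_HINT_RENDER_DRIVER, "opengl") before creating the renderer selects the OpenGL backend, with the caveats mentioned above.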

SDL desktop resolution detection in Linux [duplicate]

This question already has an answer here:
How can I get the screen resolution using SDL2?
(1 answer)
Closed 6 years ago.
I have got some reports that for some Linux users, especially those who use SteamOS, my game opens in the wrong resolution. The game tries to detect the current desktop resolution and create a borderless fullscreen window at that resolution.
For example, the resolution of SteamOS is usually 1920x1080, but SDL reports it as something like 4096x2160! Thus, when the game starts, players see only the lower-left portion of the game area.
My function for detecting screen resolution is the following:
bool View::checkDisplaySize() {
    int display_count = 0;
    int display_index = 0;
    int mode_index = 0;
    SDL_DisplayMode mode = { SDL_PIXELFORMAT_UNKNOWN, 0, 0, 0, 0 };

    if ((display_count = SDL_GetNumVideoDisplays()) < 1) {
        printf("SDL_GetNumVideoDisplays returned: %i", display_count);
        return false;
    }
    if (SDL_GetDisplayMode(display_index, mode_index, &mode) != 0) {
        printf("SDL_GetDisplayMode failed: %s", SDL_GetError());
        return false;
    }
    m_display.w = mode.w;
    m_display.h = mode.h;
    return true;
}
I then use the information stored in the m_display struct to enter fullscreen. Window creation and switching to fullscreen are in separate functions, because players who use a Linux distro other than SteamOS also have the option to switch to windowed mode during the game:
window = SDL_CreateWindow("Game", 0, 0, m_display.w, m_display.h, window_flags);
...
SDL_SetWindowBordered(window, SDL_FALSE);
SDL_SetWindowPosition(window, 0, 0);
SDL_SetWindowSize(window, m_display.w, m_display.h);
SDL_SetWindowFullscreen(window, SDL_WINDOW_FULLSCREEN_DESKTOP);
For me this has worked without problems with all the Linux computers I have tested. I haven't been able to reproduce the problem in my own test environment.
My questions are:
Is this a problem in the SDL implementation on Linux, or am I doing something wrong?
And, if indeed I am to blame here:
Is this the correct way to query the screen resolution?
If not, is there some other, more reliable method I should use to query the resolution?
The documentation for SDL_GetDisplayMode has an enlightening remark:
The display modes are sorted in this priority:
width -> largest to smallest
height -> largest to smallest
...
That means that what you are actually querying seems to be the largest supported resolution for the display, rather than the actual resolution.
You'll probably want to use SDL_GetCurrentDisplayMode or SDL_GetDesktopDisplayMode to get the currently active display mode for the display.
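For example, a minimal sketch of the changed query inside your function (untested) might be:
SDL_DisplayMode mode;
if (SDL_GetDesktopDisplayMode(display_index, &mode) != 0) {
    printf("SDL_GetDesktopDisplayMode failed: %s", SDL_GetError());
    return false;
}
m_display.w = mode.w; // the current desktop resolution, not the largest supported mode
m_display.h = mode.h;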

How to efficiently render a small sprite in Direct3D / C++ on a large Window (DWM)?

I'm implementing a custom cursor in DirectX/C++ that is drawn on a transparent window on top of the desktop.
I have stripped it down to a basic example. The magic of executing Direct3D on the DWM is based on this article on Code Project.
The problem is that when using a very big window (e.g. 2560x1440) as a base for the DirectX rendering, it causes up to 40% GPU load according to GPU-Z, even if the only thing I am displaying is a static 128x128 sprite, or nothing at all. If I use an area like 256x256, the GPU load is around 1-3%.
Basically this loop would make the GPU go crazy on a big window while it's smooth sailing on a small window:
while (true) {
    g_pD3DDevice->PresentEx(NULL, NULL, NULL, NULL, NULL);
    Sleep(10);
}
So it seems like it re-renders the whole screen whether anything changes or not, am I right? Can I tell Direct3D to only re-render specific parts that need to be updated?
EDIT:
I have found a way to tell Direct3D to render only a specific part by providing RGNDATA dirty region information to PresentEx. It is now at 1% GPU load instead of 20-40%.
std::vector<RECT> dirtyRects;
// Fill dirtyRects with the previous and new cursor boundaries

DWORD size = dirtyRects.size() * sizeof(RECT) + sizeof(RGNDATAHEADER);
RGNDATA *rgndata = (RGNDATA *)HeapAlloc(GetProcessHeap(), 0, size);

RECT* pRectInitial = (RECT*)rgndata->Buffer;
RECT rectBounding = dirtyRects[0];
for (int i = 0; i < dirtyRects.size(); i++)
{
    RECT rectCurrent = dirtyRects[i];
    rectBounding.left = min(rectBounding.left, rectCurrent.left);
    rectBounding.right = max(rectBounding.right, rectCurrent.right);
    rectBounding.top = min(rectBounding.top, rectCurrent.top);
    rectBounding.bottom = max(rectBounding.bottom, rectCurrent.bottom);

    *pRectInitial = dirtyRects[i];
    pRectInitial++;
}

// prepare the RGNDATA header
RGNDATAHEADER header;
header.dwSize = sizeof(RGNDATAHEADER);
header.iType = RDH_RECTANGLES;
header.nCount = dirtyRects.size();
header.nRgnSize = dirtyRects.size() * sizeof(RECT);
header.rcBound.left = rectBounding.left;
header.rcBound.top = rectBounding.top;
header.rcBound.right = rectBounding.right;
header.rcBound.bottom = rectBounding.bottom;
rgndata->rdh = header;

// Update the display, limiting the redraw to the dirty region
g_pD3DDevice->PresentEx(NULL, NULL, NULL, rgndata, 0);
But there is something I do not understand: it only drops to 1% GPU load if I also add the following
SetLayeredWindowAttributes(hWnd, 0, 180, LWA_ALPHA);
I want it transparent anyway, so that's fine, but I now get some weird tearing effects after a while. The tearing is more noticeable the faster I move the cursor. Where does that come from? It looks like the image provided. I am sure I have set the dirty rects perfectly accurately.
The tearing described above seems to differ from computer to computer.

GLFW get screen height/width?

After playing around with OpenGL for a while using the freeglut library, I decided to use GLFW for my next training project instead, since I was told that GLUT was only designed for learning purposes and should not be used professionally. I had no problems linking the lib to my NetBeans project, and it compiles just fine using mingw32 4.6.2.
However, I am running into difficulties trying to position the window at the center of the screen.
Under freeglut, I previously used:
glutInitWindowPosition(
    (glutGet(GLUT_SCREEN_WIDTH) - RES_X) / 2,
    (glutGet(GLUT_SCREEN_HEIGHT) - RES_Y) / 2
);
I can't find any glfw function that would return the screen size or width. Is such a function simply not implemented?
How about glfwGetDesktopMode? I think this is what you want.
Example:
GLFWvidmode return_struct;
glfwGetDesktopMode( &return_struct );
int height = return_struct.Height;
In newer GLFW versions, glfwGetVideoMode is used instead; the call is different, but the returned structure can be used in the same way.
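For example, a rough GLFW 3 sketch (assuming glfwInit() has already been called):
const GLFWvidmode* mode = glfwGetVideoMode(glfwGetPrimaryMonitor());
int width = mode->width;   // desktop resolution of the primary monitor
int height = mode->height;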
First you need two variables to store your width and height.
int width, height;
Then, as described on page 14 of the reference:
glfwSetWindowPos(width / 2, height / 2);
As a bonus, you can then call
glfwGetWindowSize(&width, &height);
This is a void function and does not return a value; however, it will update the two previously declared variables, so place it in the main loop or in the window reshape callback function.
You can verify this in the official manual, here, on page 15.
This might help somebody...
void Window::CenterTheWindow(){
    GLFWmonitor* monitor = glfwGetPrimaryMonitor();
    const GLFWvidmode* mode = glfwGetVideoMode(monitor);

    glfwSetWindowPos(m_Window, (mode->width - m_Width) / 2, (mode->height - m_Height) / 2);
}
m_Width and m_Height are variables that have the width and the height of the window.
Reference: http://www.glfw.org/docs/latest/monitor.html