I've been having trouble with a game I've been working on: once I added music, it started segfaulting in my frequently called texture-loading code, between 5 and 30 seconds after the music started playing. The best explanation I could come up with was some sort of memory corruption. After a good week of unsuccessfully trying to debug it (trying things like GFlags PageHeap), I managed to cut it down to the following code, which still exhibits the problem.
Sometimes this segfaults with the call stack going through SDL2_mixer.dll, but most often it occurs in the SDL_CreateTextureFromSurface call, with the renderer in a bad state. numTextures reaches between 15,000 and 40,000 on my machine (Windows 10 x64, with the program compiled for x86).
My gut tells me that there's an issue in my environment or code, rather than an issue in SDL itself, but I'm at a loss. Any help or insights would be greatly appreciated.
#include <SDL_image.h>
#include <SDL_mixer.h>
#include <cassert>

int main(int argc, char* argv[])
{
    assert(SDL_Init(SDL_INIT_EVERYTHING) == 0);

    SDL_Window * pWindow_ = SDL_CreateWindow(
        "", SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, 640, 480, 0x0);
    assert(pWindow_ != nullptr);

    SDL_Renderer * pRenderer_ = SDL_CreateRenderer(pWindow_, -1, 0);
    assert(pRenderer_ != nullptr);

    assert(Mix_OpenAudio(44100, MIX_DEFAULT_FORMAT, 2, 512) == 0);

    Mix_Music * pMusic = Mix_LoadMUS("sounds/tranquility.wav");
    assert(pMusic != nullptr);
    assert(Mix_PlayMusic(pMusic, -1) == 0);

    SDL_Surface * pSurface = IMG_Load("images/caution.png");
    assert(pSurface != nullptr);

    SDL_Texture * pTexture = SDL_CreateTextureFromSurface(pRenderer_, pSurface);
    assert(pTexture != nullptr);

    int numTextures = 0;
    while (true)
    {
        numTextures += 10;
        assert(pTexture != nullptr);
        SDL_DestroyTexture(pTexture);
        pTexture = SDL_CreateTextureFromSurface(pRenderer_, pSurface);
        assert(pTexture != nullptr);
    }
}
The solution turned out to be to update to the latest version of SDL (2.0.3 -> 2.0.5).
I started developing the project in question with an engine code base which I upgraded from SDL 1.2 to 2.0 about 2 years ago, when the latest version was 2.0.3.
When I recently added sound and music, I took the latest SDL_mixer, and didn't think to update SDL to the latest 2.0.5.
After getting the latest development and runtime libraries for SDL (and SDL_image and SDL_mixer for good measure), the problem disappeared.
I'm not entirely satisfied with this explanation, though. I'm quite surprised that the newer SDL_mixer linked successfully against an older SDL if the two were not compatible, and I can't find anything online that suggests a compatibility issue between them. So I have an uneasy feeling that something else may have been going on which was resolved incidentally by the upgrade.
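If anyone else ends up in a similar situation, one cheap sanity check (a sketch of my own, not something from the original code) is to log the compile-time versus runtime versions of both SDL and SDL_mixer at startup. It won't prove an incompatibility, but it does make a header/DLL mismatch immediately visible:

#include <SDL.h>
#include <SDL_mixer.h>
#include <cstdio>

// Print the SDL/SDL_mixer versions the program was compiled against next to the
// versions of the DLLs actually loaded at runtime.
static void logSdlVersions()
{
    SDL_version compiled;
    SDL_version linked;
    SDL_VERSION(&compiled);              // from the headers used at build time
    SDL_GetVersion(&linked);             // from the SDL2.dll loaded at runtime
    std::printf("SDL       compiled %d.%d.%d, linked %d.%d.%d\n",
                compiled.major, compiled.minor, compiled.patch,
                linked.major, linked.minor, linked.patch);

    SDL_version mixCompiled;
    SDL_MIXER_VERSION(&mixCompiled);     // from SDL_mixer.h
    const SDL_version *mixLinked = Mix_Linked_Version();  // from SDL2_mixer.dll
    std::printf("SDL_mixer compiled %d.%d.%d, linked %d.%d.%d\n",
                mixCompiled.major, mixCompiled.minor, mixCompiled.patch,
                mixLinked->major, mixLinked->minor, mixLinked->patch);
}

Calling this once right after SDL_Init() would have shown immediately whether the SDL2.dll being picked up at runtime was still 2.0.3.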
Hello everyone.
I wrote a small 3D scene two years ago; it's only 1700 lines of source code (excluding .h files). Coming back to the project on GitHub and running my app, I found a pretty interesting bug in debug mode.
The debugger throws an exception when CreateBuffer is called for the vertex buffer:
auto result = device->CreateBuffer(&vertex_buffer_desc, &vertex_data, &vertex_buffer);
if (FAILED(result))
    return false;
Basically, the debugger says the (D3D11) device is nullptr, which can't be the case, because everything works fine in non-debug mode. But when I define UINT create_device_flag = D3D11_CREATE_DEVICE_DEBUG; before creating the device, I get this exception: read access violation. device is nullptr. After a few days I still can't figure out what's wrong, because the order in which the pointers are defined is correct.
Here's Main.cpp:
#include <StdAfx.h>
#include <Window.h>
#include <FPSCamera.h>
#include <DirectInput8.h>
#include <D3D11Renderer.h>
#include <Terrain.h>
#include <TerrainShader.h>
using namespace bm;
int __stdcall WinMain(HINSTANCE, HINSTANCE, char*, int)
{
    auto resource_directory_name = L"..\\..\\..\\Resource\\"s;
    auto terrain_name = L"terrain"s;
    auto dds_file_extension = L".dds"s;
    auto hlsl_file_extension = L".hlsl"s;

    std::wstring resources[] = {resource_directory_name + L"heightmap.bmp"s,
                                resource_directory_name + terrain_name + dds_file_extension,
                                resource_directory_name + terrain_name + L"_bump"s + dds_file_extension,
                                resource_directory_name + terrain_name + L"_vs"s + hlsl_file_extension,
                                resource_directory_name + terrain_name + L"_ps"s + hlsl_file_extension};

    constexpr auto ENABLE_FULLSCREEN = false;
    constexpr auto ENABLE_VSYNC = false;
    constexpr auto SCREEN_WIDTH = 1366;
    constexpr auto SCREEN_HEIGHT = 768;

    auto window = std::make_shared<bm::Window>(SCREEN_WIDTH, SCREEN_HEIGHT, ENABLE_FULLSCREEN);
    window->registerClass();
    window->create();

    auto d3d11_renderer = std::make_shared<bm::D3D11Renderer>(SCREEN_WIDTH, SCREEN_HEIGHT, ENABLE_FULLSCREEN, window->getHandle(), ENABLE_VSYNC);

    // Exception is thrown in the following pointer, but the D3D11 device should already be initialized.
    auto terrain = std::make_shared<bm::Terrain>(d3d11_renderer->getDevice(), resources[0].c_str(), resources[1].c_str(), resources[2].c_str());
    auto terrain_shader = std::make_shared<bm::TerrainShader>(d3d11_renderer->getDevice(), resources[3].c_str(), resources[4].c_str());

    auto fps_camera = std::make_shared<bm::FPSCamera>(static_cast<float>(SCREEN_WIDTH), static_cast<float>(SCREEN_HEIGHT));
    fps_camera->setPosition(500.f, 75.f, 400.f);
    fps_camera->setRotation(20.f, 30.f, 0.f); // in degrees.

    auto direct_input_8 = std::make_shared<bm::DirectInput8>(window->getHandle());

    constexpr float CLEAR_COLOR[] = {0.84f, 0.84f, 1.f, 1.f};

    while (window->update())
    {
        direct_input_8->update(fps_camera->getMoveLeftRight(), fps_camera->getMoveBackForward(), fps_camera->getYaw(), fps_camera->getPitch());
        fps_camera->update();

        d3d11_renderer->clearScreen(CLEAR_COLOR);

        terrain->render(d3d11_renderer->getDeviceContext());
        terrain_shader->render(d3d11_renderer->getDeviceContext(),
                               terrain->getIndexCount(),
                               fps_camera->getWorld(),
                               fps_camera->getView(),
                               fps_camera->getProjection(),
                               {0.82f, 0.82f, 0.82f, 1.0f},
                               {-0.0f, -1.0f, 0.0f},
                               terrain->getColorTexture(),
                               terrain->getNormalMapTexture());

        d3d11_renderer->swapBuffers();
    }

    return 0;
}
P.S.
And I know there's a six-year-old question on this site: CreateBuffer throwing an "Access violation reading location"
But it hardly explains anything in my case, since I have no global variables or pointers. I'd like to correct my old mistake, so I'll be glad to provide more details if needed.
Sorry, but this is all abstraction code, so we can't see the place where you actually call D3D11CreateDevice.
That said, the symptom you describe sounds like you don't have the right Debug Device SDK layer installed on your operating system. You are likely failing to check for a FAILED HRESULT from D3D11CreateDevice as well.
// ComPtr comes from <wrl/client.h> (namespace Microsoft::WRL).
DWORD createDeviceFlags = 0;
#ifdef _DEBUG
createDeviceFlags |= D3D11_CREATE_DEVICE_DEBUG;
#endif

ComPtr<ID3D11Device> device;
ComPtr<ID3D11DeviceContext> context;
D3D_FEATURE_LEVEL fl;
HRESULT hr = D3D11CreateDevice( nullptr, D3D_DRIVER_TYPE_HARDWARE,
                                nullptr, createDeviceFlags, nullptr,
                                0, D3D11_SDK_VERSION, &device, &fl, &context );
if (FAILED(hr))
    ...
On a system without the Debug Device SDK layer installed, this will fail in _DEBUG.
On Windows 8.x or Windows 10, installing the legacy DirectX SDK does not install any debug runtime.
For Windows 8.x you can get the Direct3D 11 Debug Runtime by installing the Windows 8.x SDK or Windows 10 SDK.
For Windows 10, you get the Direct3D Debug Runtime by installing the Windows optional feature named Graphics Tools. This is version specific, so make sure the Graphics Tools you have enabled matches your Windows 10 release. See this blog post.
See Anatomy of Direct3D 11 Create Device
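One way to make that failure mode less surprising (a sketch, not code from the asker's engine) is to retry device creation without the debug flag if the first attempt fails, instead of carrying a null device pointer forward into CreateBuffer:

#include <d3d11.h>        // link against d3d11.lib
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Try to create the device with the debug layer; if that fails (for example because
// the Graphics Tools optional feature is not installed), retry without the flag.
HRESULT CreateDeviceWithOptionalDebugLayer(ComPtr<ID3D11Device>& device,
                                           ComPtr<ID3D11DeviceContext>& context)
{
    UINT flags = 0;
#ifdef _DEBUG
    flags |= D3D11_CREATE_DEVICE_DEBUG;
#endif
    D3D_FEATURE_LEVEL fl;
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr,
                                   flags, nullptr, 0, D3D11_SDK_VERSION,
                                   &device, &fl, &context);
    if (FAILED(hr) && (flags & D3D11_CREATE_DEVICE_DEBUG))
    {
        // Debug layer unavailable: fall back to a plain device so the app can still run.
        flags &= ~D3D11_CREATE_DEVICE_DEBUG;
        hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr,
                               flags, nullptr, 0, D3D11_SDK_VERSION,
                               &device, &fl, &context);
    }
    return hr;
}

Either way, check the returned HRESULT before handing the device to anything else; that is what turns "read access violation on a null device" into an actionable error code.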
After my first successful attempt at a 3D engine using Java and OpenGL (LWJGL3), I have decided to try my hand at Vulkan, using C++.
I have barely any experience with C/C++ and I am aware of the steep learning curve of Vulkan. This is however not a problem.
I decided to follow this tutorial: https://vulkan-tutorial.com/Introduction
It showed me how to create a new Vulkan project using Xcode (as I am on macOS Mojave). I would, however, like to continue the rest of the tutorial using CLion, since I will be switching between multiple operating systems.
I tried my hand at creating a CLion project and succeeded in making my first CMakeLists.txt file; however, something seems to be wrong. The file currently consists of the following:
cmake_minimum_required(VERSION 3.12)
project(VulkanTesting)
set(CMAKE_CXX_STANDARD 14)
add_executable(VulkanTesting main.cpp)
include_directories(/usr/local/include)
include_directories(/Users/[username]/Documents/Vulkan/SDK/vulkansdk-macos-1.1.92.1/macOS/include)
target_link_libraries(VulkanTesting /usr/local/lib/libglfw.3.3.dylib)
target_link_libraries(VulkanTesting /Users/[username]/Documents/Vulkan/SDK/vulkansdk-macos-1.1.92.1/macOS/lib/libvulkan.1.dylib)
target_link_libraries(VulkanTesting /Users/[username]/Documents/Vulkan/SDK/vulkansdk-macos-1.1.92.1/macOS/lib/libvulkan.1.1.92.dylib)
# Don't know if I need the next two lines
link_directories(/usr/local/lib)
link_directories(/Users/[username]/Documents/Vulkan/SDK/vulkansdk-macos-1.1.92.1/macOS/lib)
The reason I showed the above file will become apparent in the question.
The 'Program' so far is the following:
#define GLFW_INCLUDE_VULKAN
#include <GLFW/glfw3.h>

#include <iostream>
#include <stdexcept>
#include <functional>
#include <cstdlib>
#include <vector>

const int WIDTH = 800;
const int HEIGHT = 600;

class HelloTriangleApplication {
public:
    void run() {
        initWindow();
        initVulkan();
        mainLoop();
        cleanup();
    }

private:
    GLFWwindow* window;
    VkInstance instance;

    void initWindow() {
        glfwInit();

        glfwWindowHint(GLFW_CLIENT_API, GLFW_NO_API);
        glfwWindowHint(GLFW_RESIZABLE, GLFW_FALSE);

        window = glfwCreateWindow(WIDTH, HEIGHT, "My first Vulkan window", nullptr, nullptr);
    }

    void initVulkan() {
        createInstance();
    }

    void createInstance() {
        // Instantiate Application Info
        VkApplicationInfo applicationInfo = {};
        applicationInfo.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
        applicationInfo.pApplicationName = "Hello Triangle";
        applicationInfo.applicationVersion = VK_MAKE_VERSION(1,0,0);
        applicationInfo.pEngineName = "No Engine";
        applicationInfo.engineVersion = VK_MAKE_VERSION(1,0,0);
        applicationInfo.apiVersion = VK_API_VERSION_1_0;

        // Instantiate Instance Creation Info
        VkInstanceCreateInfo createInfo = {};
        createInfo.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
        createInfo.pApplicationInfo = &applicationInfo;

        // Get GLFW platform specific extensions
        uint32_t glfwExtensionCount = 0;
        const char** glfwExtensions;
        glfwExtensions = glfwGetRequiredInstanceExtensions(&glfwExtensionCount);

        // Fill in required extensions in Instance Creation Info
        createInfo.enabledExtensionCount = glfwExtensionCount;
        createInfo.ppEnabledExtensionNames = glfwExtensions;

        // For validation layers, this is a later step in the tutorial.
        createInfo.enabledLayerCount = 0;

        // Create the Vulkan instance, and check if it was successful.
        VkResult result = vkCreateInstance(&createInfo, nullptr, &instance);
        if (result != VK_SUCCESS) {
            std::cout << "glfwExtensionCount: " << glfwExtensionCount << "\n";
            std::cout << "glfwExtensionNames: " << &glfwExtensions << "\n";
            std::cout << "result: " << result << "\n";
            throw std::runtime_error("Failed to create Vulkan Instance");
        }
    }

    void mainLoop() {
        while (!glfwWindowShouldClose(window)) {
            glfwPollEvents();
        }
    }

    void cleanup() {
        glfwDestroyWindow(window);
        glfwTerminate();
    }
};

int main() {
    HelloTriangleApplication app;

    try {
        app.run();
    } catch (const std::exception& e) {
        std::cerr << e.what() << std::endl;
        return EXIT_FAILURE;
    }

    return EXIT_SUCCESS;
}
The problem I am having is that when I try to run the program, it will not create a VkInstance: vkCreateInstance returns VK_ERROR_INCOMPATIBLE_DRIVER. I doubt the driver is actually incompatible, for one because I have run the demo applications that came with the Vulkan SDK, and for another because I have been able to run the exact same 'program' in Xcode. When I investigated a bit further, I noticed that glfwGetRequiredInstanceExtensions returns no extensions when the program is run from CLion like this, but does return one in the Xcode equivalent.
This all leads me to believe that I have done something wrong in linking the libraries/frameworks in the CMake file, because I am aware that Vulkan is not directly supported on macOS but instead (somehow?) goes through a layer to communicate with Metal.
Do I need to specify a way for the program to pass its Vulkan calls through a Metal layer, and is this done automagically in Xcode, or is there another problem with my approach?
Any help would be greatly appreciated!
You might want to look at the macOS Getting Started Guide on the LunarXchange website and in your SDK. There is a section at the end that shows how to use CMake to build a Vulkan app and run it on macOS. You may also want to use the FindVulkan CMake module instead of manually setting the include directories and the target link libraries.
But my first guess about your specific problem is that you may not be setting the VK_ICD_FILENAMES environment variable. You are correct in your observation that there is no direct support for Vulkan. Instead, the support is provided by the MoltenVK library which is treated as a Vulkan driver. But this "driver" is not installed in any system directory by the SDK. The SDK is just unzipped in your home directory structure, so you must tell the Vulkan loader where to find it via this environment variable.
Again, the CMake section at the end of the Getting Started Guide demonstrates the use of this environment variable. And the entire guide goes into additional detail about how the various Vulkan and MoltenVK components work.
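To see whether the loader is actually finding MoltenVK from inside CLion, a small diagnostic like the following can help (my own sketch, not part of the tutorial code). If VK_ICD_FILENAMES is unset for the process CLion launches, the extension list usually comes back empty, which matches what you observed with glfwGetRequiredInstanceExtensions:

#include <vulkan/vulkan.h>
#include <cstdlib>
#include <iostream>
#include <vector>

// Dump the loader-visible state before trying to create an instance.
void dumpVulkanLoaderState()
{
    const char* icd = std::getenv("VK_ICD_FILENAMES");
    std::cout << "VK_ICD_FILENAMES = " << (icd ? icd : "(not set)") << "\n";

    uint32_t count = 0;
    vkEnumerateInstanceExtensionProperties(nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> extensions(count);
    vkEnumerateInstanceExtensionProperties(nullptr, &count, extensions.data());

    std::cout << count << " instance extension(s) visible to the loader:\n";
    for (const auto& e : extensions)
        std::cout << "  " << e.extensionName << "\n";
    // With MoltenVK found, VK_KHR_surface and VK_MVK_macos_surface should be listed.
}

In CLion you can set VK_ICD_FILENAMES (and VK_LAYER_PATH) in the run configuration's Environment variables field, pointing at the MoltenVK ICD JSON inside the unpacked SDK; the Getting Started Guide gives the exact path for your SDK version.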
My goal is to build a Game Boy emulator. In order to do this, I would like to embed an SDL2 surface into a wxWidgets window.
I found this tutorial: http://code.technoplaza.net/wx-sdl/part1/, but my program crashes as soon as I run it. However, I suspect it was intended for SDL 1.2. Part of the program is shown below.
It seems that if I call SDL_Init() and also attempt to show a wxFrame (which, in this case, is MainWindow), the window appears for a second and then the program crashes. I have commented out all other SDL calls in my program so far, so the problem seems to lie in calling Show() on a wxFrame and initializing SDL2 in the same program.
So the question is: can SDL2 and wxWidgets 3 work together? If not, could you suggest good alternatives for the GUI of a Game Boy emulator? Does wxWidgets have its own graphics frame like Qt does (I'd rather avoid Qt)?
Thanks very much!
#include "MainApp.h"
#include "MainWindow.h"
#include <stdexcept>

namespace GBEmu {

static void initSDL() {
    //This and SDL_Quit() are the only calls to the SDL library in my code
    if (SDL_Init(SDL_INIT_EVERYTHING) < 0) {
        throw std::runtime_error("Fatal Error: Could not init SDL");
    }
}

bool MainApp::OnInit()
{
    try {
        //If I comment out this line, the MainWindow wxFrame shows up fine.
        //If I leave both uncommented, the window shows up quickly and then
        //crashes.
        initSDL();

        //If I comment out this line and leave initSDL() uncommented,
        //the program will not crash, but just run forever.
        (new MainWindow("GBEmu", {50,50}, {640,480}))->Show();
    } catch(std::exception &e) {
        wxLogMessage(_("Fatal Error: " + std::string(e.what())));
    }
    return true;
}

int MainApp::OnExit() {
    SDL_Quit();
    return wxApp::OnExit();
}

}

wxIMPLEMENT_APP(GBEmu::MainApp);
EDIT: Here is more information on how it crashes: it segfaults in what seems to be the pthread_mutex_lock disassembly. This is the console output with the stack trace:
Starting /home/dan/Documents/devStuff/GBEmuWx-build/GBEmuWx...
The program has unexpectedly finished.
/home/dan/Documents/devStuff/GBEmuWx-build/GBEmuWx crashed
Stack trace:
Error: signal 11:
/home/dan/Documents/devStuff/GBEmuWx-build/GBEmuWx(_ZN5GBEmu7handlerEi+0x1c)[0x414805]
/lib/x86_64-linux-gnu/libc.so.6(+0x36ff0)[0x7fb88e136ff0]
/lib/x86_64-linux-gnu/libpthread.so.0(pthread_mutex_lock+0x30)[0x7fb88c12ffa0]
/usr/lib/x86_64-linux-gnu/libX11.so.6(XrmQGetResource+0x3c)[0x7fb88d1ca15c]
/usr/lib/x86_64-linux-gnu/libX11.so.6(XGetDefault+0xc2)[0x7fb88d1a7a92]
/usr/lib/x86_64-linux-gnu/libcairo.so.2(+0x94dcf)[0x7fb88af8edcf]
/usr/lib/x86_64-linux-gnu/libcairo.so.2(+0x97110)[0x7fb88af91110]
/usr/lib/x86_64-linux-gnu/libcairo.so.2(cairo_surface_get_font_options+0x87)[0x7fb88af63e07]
/usr/lib/x86_64-linux-gnu/libcairo.so.2(+0x2b61f)[0x7fb88af2561f]
/usr/lib/x86_64-linux-gnu/libcairo.so.2(+0x2ef95)[0x7fb88af28f95]
This is a screenshot of where it seems to fail (line 7):
Update: In my MainWindow class I attach a menu bar to the window. It seems that when I comment out the setting of the menu bar, the window shows up fine even with SDL initialized. Conversely, the menu bar shows up fine if initSDL() is commented out but the menu bar code is not. Here is where I set the menu bar:
MainWindow::MainWindow(const wxString &title, const wxPoint &pos, const wxSize &size)
    : wxFrame(nullptr, wxIDs::MainWindow, title, pos, size) {
    wxMenu *fileMenu = new wxMenu;
    fileMenu->Append(wxID_EXIT);

    wxMenuBar *menuBar = new wxMenuBar;
    menuBar->Append(fileMenu, "&File");

    //commenting this line out will allow the window to show up
    //and not crash the program
    SetMenuBar(menuBar);
}
You are experiencing an old heisenbug.
The workaround is simple: you have to initialize SDL before wxWidgets (basically, before GTK). To achieve this, you have to change
wxIMPLEMENT_APP(GBEmu::MainApp);
to
wxIMPLEMENT_APP_NO_MAIN(GBEmu::MainApp);
so that wxWidgets doesn't hijack your main().
Then you have to create main() manually. In it, initialize SDL, then call wxEntry():
#include <SDL.h>
#include <wx/wx.h>   // declares wxEntry()
#include <iostream>

int main(int argc, char** argv)
{
    if (SDL_Init(SDL_INIT_EVERYTHING) < 0)
    {
        std::cerr << "Could not initialize SDL.\n";
        return 1;
    }
    return wxEntry(argc, argv);
}
More about the bug:
I have googled around a bit and found that this bug has come up in a few places over the years. There are open reports in many bug trackers that have stack traces very similar to the one you get here (with debug symbols).
The oldest report I could find is from 2005 (!!) from the cairo bug tracker (https://bugs.freedesktop.org/show_bug.cgi?id=4373).
My best guess is that the bug really hides in GTK, cairo, or X. Unfortunately, I do not currently have the time to look into it in more depth.
Original question:
I've been asked, prior to a job interview, to understand how an anti-aliased line is drawn in a framebuffer, using C or C++. I haven't used C, and it's been a few years since I last used C++. I am a complete beginner when it comes to graphics; my C++ experience has mostly been in simple command-line programs and sorting methods. The company does not care if I grab the code online; they want me to understand it but still have a working executable.
I've used this tutorial to set up the SDL libraries in MS VC++ 2012 Express, and this algorithm for the actual anti-aliasing. I have a good understanding of the algorithm, though I'm currently having trouble getting it to compile. I just want a line to be drawn, and then I can go forward with fitting the code to the skeleton class definitions I was given. This is what I have included, aside from what is on the page with the algorithm:
#include <cmath>
#include <math.h>
#include "conio.h"
#include "stdlib.h"
#include "stdio.h"
#include "SDL.h"

const double HEIGHT = 240;
const double WIDTH = 320;
const double X0 = 25.6;
const double X1 = 64.7;
const double Y0 = 30;
const double Y1 = 42;

int round(double number)
{
    return number < 0.0 ? ceil(number - 0.5) : floor(number + 0.5);
}

void main()
{
    Uint32 pixelColor = 00000000000000000000000000000000;
    SDL_Surface* myScreen = SDL_CreateRGBSurface(SDL_ALPHA_OPAQUE, WIDTH, HEIGHT, 32, 0x000000FF,
                                                 0x0000FF00, 0x00FF0000, 0xFF000000);
    WULinesAlpha(X0, X1, Y0, Y1, pixelColor, myScreen);
    return;
}
I'm getting the following errors:
Error 21 error LNK2019: unresolved external symbol _SDL_main referenced in function _main
Error 22 error LNK1120: 1 unresolved externals
I've seen a few code examples saying the main function has to look like this:
int main(int argc, char *argv[])
{
}
Again, graphics stuff is unfamiliar to me, so I know my main function is likely very wrong; I'm anticipating some shaking heads. Can someone explain what is happening/what I need to do?
New:
I have now replaced my main function with the following code, based on NomNomNom069's YouTube video: "C++ SDL Tutorial 2 Creating a Screen and Handling Basic Input"
#include "SDL.h"

int main(int argc, char * args[])
{
    bool running = true;

    //initialize SDL
    if (SDL_Init(SDL_INIT_EVERYTHING) == -1)
    {
        running = false;
    }

    //set up screen
    SDL_Surface *screen;
    screen = SDL_SetVideoMode(WIDTH, HEIGHT, 32, SDL_HWSURFACE);
    if (screen == NULL)
    {
        running = false;
    }

    SDL_Event occur;

    //main application loop
    while (running)
    {
        SDL_PollEvent(&occur);
        if (occur.type == SDL_QUIT)
        {
            running = false;
        }

        //drawing occurs here
        SDL_FillRect(screen, NULL, 0);

        SDL_Flip(screen);
    }

    //quit SDL
    SDL_Quit();

    return 0;
}
No errors, and I get a window to pop up. Awesome.
My question now is how/where to call WuLinesAlpha. The function takes four doubles, a Uint32, and an SDL_Surface*. I have my doubles, I set the Uint32 to 0x000000FF, and I assume the SDL_Surface I set up as screen is the one to pass in.
I've toyed around with where the WuLinesAlpha call goes and I keep getting a black screen. I thought, as explained in the video, it would go in the loop, but nothing has happened. Are there any more SDL commands I should be calling?
Fix your main declaration first: it does need to be int main(int argc, char *argv[]). This matters especially on Windows, because SDL.h redefines main to SDL_main, and the SDL main library supplies the real entry point itself.
Next, make sure you link against SDL properly. In my own SDL 1.2.x based project I have these lines in my Makefile:
SDL_CFLAGS := $(shell sdl-config --cflags)
SDL_LFLAGS := $(shell sdl-config --libs)
I then later append those flags to my actual CFLAGS and LFLAGS. Note that if you use make and Makefiles, you want to use := there, otherwise make will invoke the $(shell ...) command every time it expands $(CFLAGS).
I can't help you set up Microsoft's GUI products. This tutorial, for a slightly older MSVC product (2010), looks pretty good, and may put you on the right track: http://lazyfoo.net/SDL_tutorials/lesson01/windows/msvsnet2010e/index.php
And finally, don't forget to call SDL_Init() at some point, preferably before you start creating surfaces.
Good luck!
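As for your newer question about where to call the line-drawing routine: WuLinesAlpha is your own function, so the signature below is only assumed from your earlier snippet, but the usual place for the call is inside the main loop, after clearing the surface and before flipping it. Also note that a colour of 0 (or one that lands in a barely visible channel for your surface format) draws black on a black background, so try an unmistakable colour like 0xFFFFFFFF (white) first:

//main application loop
while (running)
{
    SDL_PollEvent(&occur);
    if (occur.type == SDL_QUIT)
    {
        running = false;
    }

    //clear the frame, draw the line, then present it
    SDL_FillRect(screen, NULL, 0);
    WuLinesAlpha(X0, X1, Y0, Y1, 0xFFFFFFFF, screen);
    SDL_Flip(screen);
}

If WuLinesAlpha writes pixels directly, you may also need to wrap the call in SDL_LockSurface(screen)/SDL_UnlockSurface(screen) when SDL_MUSTLOCK(screen) is true, since the screen was created with SDL_HWSURFACE.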
I've been following this tutorial, and I've got to the point where we are instructed how to load and use bitmaps. Here is the current code:
#include "SDL/SDL.h"
#include <stdlib.h>

int main(int argc, char *argv[]) {
    SDL_Surface* screen = NULL;
    SDL_Surface* hello = NULL;

    SDL_Init(SDL_INIT_EVERYTHING);
    screen = SDL_SetVideoMode(256, 256, 32, SDL_SWSURFACE);

    hello = SDL_LoadBMP("hello world.png"); // Here
    if (hello == NULL) exit(0);

    SDL_BlitSurface(hello, NULL, screen, NULL);
    SDL_Flip(screen);
    SDL_Delay(2000);

    SDL_FreeSurface(hello);
    SDL_Quit();
    return 0;
}
hello, however, never gets any value other than NULL. I'm using Code::Blocks, and even if I scatter hello world.png through all possible directories of the project (be it bin, obj, either of the Debug directories, or the directory with the .cbp), or specify the full path to the image in code (as in SDL_LoadBMP("C:\Dir\hello world.png")), hello is still NULL.
What am I doing wrong?
My OS is Windows.
EDIT: Alright, apparently SDL_LoadBMP can only load .bmp files! How silly of me.
Edit: I mixed up SDL_LoadBMP with IMG_Load; here's my corrected answer:
You can use SDL_image (include SDL_image.h, link SDL_image.lib, and make sure that the correct DLL for your file type is alongside your binary, if one is needed) to call IMG_Load. IMG_Load will resolve your file type and use the appropriate loader to create a new SDL_Surface. Remember to free the surface (SDL_FreeSurface) when you're done with it.
Yes, SDL_LoadBMP only loads bitmaps. You can use SDL_image to load other types, such as PNG.
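For example, a minimal version of the asker's program switched over to IMG_Load might look like this (a sketch; it assumes SDL 1.2 with SDL_image installed, and that any image-format DLL SDL_image needs sits next to the binary):

#include "SDL/SDL.h"
#include "SDL/SDL_image.h"
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[]) {
    SDL_Init(SDL_INIT_EVERYTHING);
    SDL_Surface *screen = SDL_SetVideoMode(256, 256, 32, SDL_SWSURFACE);

    // IMG_Load picks the right loader from the file contents, so .png works here.
    SDL_Surface *hello = IMG_Load("hello world.png");
    if (hello == NULL) {
        // IMG_GetError() says why the load failed (missing file, missing PNG support, ...).
        fprintf(stderr, "IMG_Load failed: %s\n", IMG_GetError());
        SDL_Quit();
        return 1;
    }

    SDL_BlitSurface(hello, NULL, screen, NULL);
    SDL_Flip(screen);
    SDL_Delay(2000);

    SDL_FreeSurface(hello);
    SDL_Quit();
    return 0;
}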