I'm new to Allegro 5. I'm currently writing a simple 2D game in C++.
I used to use Allegro 4, but it had no proper support for .PNG images, so I switched. The problem is that in Allegro 4 I could easily create a double buffer for my sprites so they would not flicker or blink while moving. In Allegro 5 there is just the al_draw_bitmap function, which does not take a buffer as an argument.
This part of my code looks like this:
al_draw_bitmap(image[0], poz_x, poz_y, 0);
al_draw_bitmap(platf, 0, 400, 0);
al_draw_bitmap(p1, poz_p1x, poz_p1y, 0);
al_draw_bitmap(chmurka, 50, 50, 0);
al_flip_display();
al_clear_to_color(al_map_rgb(0,0,0));
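(For context: in Allegro 5, drawing calls like these target the display's backbuffer, and al_flip_display() presents it. A minimal sketch of a frame loop around the snippet above, with "queue" as an assumed ALLEGRO_EVENT_QUEUE* and event handling reduced to a close check:)
bool running = true;
while (running) {
    ALLEGRO_EVENT ev;
    while (al_get_next_event(queue, &ev)) {        // drain pending events
        if (ev.type == ALLEGRO_EVENT_DISPLAY_CLOSE)
            running = false;
    }
    al_clear_to_color(al_map_rgb(0, 0, 0));        // clear the backbuffer
    al_draw_bitmap(image[0], poz_x, poz_y, 0);
    al_draw_bitmap(platf, 0, 400, 0);
    al_draw_bitmap(p1, poz_p1x, poz_p1y, 0);
    al_draw_bitmap(chmurka, 50, 50, 0);
    al_flip_display();                             // present the finished frame
}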
I can't find any solution on the internet.
I would be really pleased if you could help me.
I'm working on a little project using C++ and Allegro 5, and my question is:
Is there a way to draw antialiased primitives to a bitmap using Allegro 5?
Specifically, I'm using this function:
void draw_to_gameBuffer(ALLEGRO_BITMAP *&gameBuffer, ALLEGRO_DISPLAY *&display)
{
    static float x = 0;

    al_set_target_bitmap(gameBuffer);     // draw into the off-screen game buffer
    al_draw_filled_rectangle(0, 0, 350, 622, al_map_rgb(130, 80, 120));
    al_draw_filled_circle(x, 200, 100, al_map_rgb(12, 138, 129));
    al_draw_filled_triangle(0, 0, 100, 0, 50, 100, al_map_rgb(12, 138, 129));

    x += 2.7;
    if (x > 350 + 100)
        x = -250;

    al_set_target_backbuffer(display);    // restore the display as the render target
}
to draw a circle and a triangle (for testing purposes) onto a target bitmap, as shown. In the project's display options I have
al_set_new_display_option(ALLEGRO_SAMPLE_BUFFERS, 4, ALLEGRO_SUGGEST);
al_set_new_display_option(ALLEGRO_SAMPLES, 8, ALLEGRO_SUGGEST);
to enable antialiasing. The problem is that all primitives rendered onto the gameBuffer have jaggies, while the primitives rendered outside the gameBuffer are perfectly smooth. How can I solve that? Or is there another way to do what I'm trying to do and get smooth primitives drawn onto the gameBuffer?
It seems that Allegro 5 has some experimental features that can be enabled by defining this macro (before including any Allegro header):
#define ALLEGRO_UNSTABLE
With this macro enabled we can create a bitmap, draw anything we want to it, and do whatever we want with it. Yes, we can do that without enabling ALLEGRO_UNSTABLE, but the macro also exposes a new procedure:
al_set_new_bitmap_samples(int numberOfSamplesDesired);
which sets how many samples newly created bitmaps get. The downside of this implementation is that it only works with OpenGL, so on Windows (where Allegro defaults to Direct3D) we need to force OpenGL to see this feature working. So how do we put it all together and draw antialiased primitives onto bitmaps?
First of all, we define the ALLEGRO_UNSTABLE macro:
#define ALLEGRO_UNSTABLE
Then we include the Allegro headers we need:
#include <allegro5/allegro.h>
#include <allegro5/allegro_primitives.h>
After that, we declare a bitmap:
ALLEGRO_BITMAP *buffer = nullptr;
Now we force OpenGL together with the other display flags and options we want:
al_set_new_display_flags(ALLEGRO_WINDOWED | ALLEGRO_RESIZABLE | ALLEGRO_OPENGL);
al_set_new_display_option(ALLEGRO_SAMPLE_BUFFERS, 2, ALLEGRO_SUGGEST); //antialias stuff
al_set_new_display_option(ALLEGRO_SAMPLES, 4, ALLEGRO_SUGGEST); //antialias stuff
To create our bitmap, we first call the experimental procedure and then create the bitmap:
al_set_new_bitmap_samples(4);
buffer = al_create_bitmap(bufferWidth, bufferHeight);
(4 samples is just an example.) To draw to the bitmap we use a procedure like this:
void draw_to_buffer(ALLEGRO_BITMAP *&buffer, ALLEGRO_DISPLAY *&display)
{
    al_set_target_bitmap(buffer);
    al_clear_to_color(al_map_rgb(0, 0, 0));
    // anything you want to draw to the bitmap
    al_set_target_backbuffer(display);
}
Now we just draw our bitmap the way we usually draw bitmaps to the screen:
al_draw_bitmap(buffer, 0, 0, 0);
And that's pretty much it. If you have issues or questions about this implementation, you can look at my post on the Allegro forums, where many lovely Allegro users helped me and taught me a lot without knowing it:
my post in the Allegro forum
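Putting the pieces together, a minimal sketch of the whole setup might look like this (window and buffer sizes are placeholder values, and error checking is omitted):
#define ALLEGRO_UNSTABLE
#include <allegro5/allegro.h>
#include <allegro5/allegro_primitives.h>

int main()
{
    al_init();
    al_init_primitives_addon();

    // Force OpenGL so the multisampled bitmap actually takes effect.
    al_set_new_display_flags(ALLEGRO_WINDOWED | ALLEGRO_RESIZABLE | ALLEGRO_OPENGL);
    al_set_new_display_option(ALLEGRO_SAMPLE_BUFFERS, 2, ALLEGRO_SUGGEST);
    al_set_new_display_option(ALLEGRO_SAMPLES, 4, ALLEGRO_SUGGEST);

    ALLEGRO_DISPLAY *display = al_create_display(800, 600);   // placeholder size

    al_set_new_bitmap_samples(4);                              // experimental, needs ALLEGRO_UNSTABLE
    ALLEGRO_BITMAP *buffer = al_create_bitmap(800, 600);       // placeholder size

    // Draw antialiased primitives into the multisampled buffer.
    al_set_target_bitmap(buffer);
    al_clear_to_color(al_map_rgb(0, 0, 0));
    al_draw_filled_circle(400, 300, 100, al_map_rgb(12, 138, 129));
    al_set_target_backbuffer(display);

    // Present the buffer on screen.
    al_draw_bitmap(buffer, 0, 0, 0);
    al_flip_display();

    al_rest(3.0);                                              // keep the window up briefly
    al_destroy_bitmap(buffer);
    al_destroy_display(display);
    return 0;
}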
I recently started learning DirectX programming in C++ and I've encountered a minor problem.
The thing is that my exercise requires me to change the colors in a DirectX application from one to another. I have a function that does that:
void render_frame(void)
{
    // start the random generator
    srand(time(NULL));

    // clear the window to a random color
    d3ddev->Clear(0, NULL, D3DCLEAR_TARGET, D3DCOLOR_XRGB(rand() % 255, rand() % 255, rand() % 255), 1.0f, 0);

    // begin/end/display scene
    d3ddev->BeginScene();
    d3ddev->EndScene();
    d3ddev->Present(NULL, NULL, NULL, NULL);
}
It works, but it's dependent on the time() delay, and I wonder if there's a better counterpart in the Direct3D library. For example, could I call this function with a specific color and delay, chained like this:
void render_frame_red(void);
delay(1000);
void render_frame_blue(void);
delay(1000);
I know the function Sleep() exists in the standard library, but it literally freezes my application window (I can't interact with it during the delay). Maybe I can set the timer to a different tick rate so it updates as frequently as I want?
I know it's a small thing and I would probably figure it out later on my journey with 3D programming, but it wouldn't hurt if I knew more about how things work in DirectX.
The general model for Direct3D is to render frames "as fast as possible" and then have time-based logic in your program drive the animation. Traditional GDI-style drawing only paints 'on demand'; you can do that with Direct3D too, but it makes things harder.
In other words, use clock time instead of frame count to control animation and effects. For a good timer solution for Direct3D rendering, see this blog post.
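As an illustration, a minimal sketch of clock-driven color switching inside a render function like the one in the question (using std::chrono for timing; d3ddev is assumed to be the same device pointer as above):
#include <chrono>

// Alternates the clear color between red and blue once per second,
// while the frame itself is still rendered as fast as possible.
void render_frame(void)
{
    using namespace std::chrono;
    static const auto start = steady_clock::now();

    const auto elapsed = duration_cast<seconds>(steady_clock::now() - start).count();
    const D3DCOLOR color = (elapsed % 2 == 0)
        ? D3DCOLOR_XRGB(255, 0, 0)    // even second: red
        : D3DCOLOR_XRGB(0, 0, 255);   // odd second: blue

    d3ddev->Clear(0, NULL, D3DCLEAR_TARGET, color, 1.0f, 0);
    d3ddev->BeginScene();
    d3ddev->EndScene();
    d3ddev->Present(NULL, NULL, NULL, NULL);
}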
BTW, don't invest your time in learning legacy Direct3D 9, which relies heavily on supporting code from the deprecated DirectX SDK and requires the legacy DirectSetup to deploy the ancient helper library D3DX9 (see MSDN). Use DirectX 11 instead. See the DirectX Tool Kit.
When I use software rendering in my SDL2 project, everything works as expected,
e.g. when the code for creating an SDL_Renderer looks like this:
this->renderer = SDL_CreateRenderer(this->window, -1, SDL_RENDERER_SOFTWARE);
When I use hardware rendering, i.e. when the code for creating the SDL_Renderer looks like this:
this->renderer = SDL_CreateRenderer(this->window, -1, SDL_RENDERER_ACCELERATED);
For some reason, calling SDL_UpdateWindowSurface results in a SEGFAULT. I call SDL_UpdateWindowSurface in a standard way:
// Update window surface
SDL_UpdateWindowSurface(window);
I know that the way my window is set up is correct, since running the program with a software-rendering SDL_Renderer works just fine and doesn't yield a SEGFAULT.
Does anybody have any idea what might be causing this? Has anybody experienced something like this before?
The SDL_GetWindowSurface documentation says, "You may not combine this with 3D or the rendering API on this window." Either get the surface and update the window surface yourself (and forget about hardware acceleration), or use SDL_Renderer and never touch the window surface (it doesn't even exist for an accelerated backend).
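For illustration, a minimal sketch of the renderer-only path (the this-> member names follow the question's snippet; per-frame drawing goes through the renderer instead of SDL_UpdateWindowSurface):
// Create an accelerated renderer and never touch the window surface afterwards.
this->renderer = SDL_CreateRenderer(this->window, -1, SDL_RENDERER_ACCELERATED);

// Per frame: draw through the renderer and present it,
// instead of calling SDL_UpdateWindowSurface().
SDL_SetRenderDrawColor(this->renderer, 0, 0, 0, 255);
SDL_RenderClear(this->renderer);
// ... SDL_RenderCopy() calls for textures go here ...
SDL_RenderPresent(this->renderer);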
I've created a Direct3D Device and have a render loop that starts with clearing the window to transparent red using the following code:
d3ddev->Clear(0, NULL, D3DCLEAR_TARGET, D3DCOLOR_ARGB(100, 255, 0, 0), 1.0f, 0);
This works perfectly, but only when it's created in the entry-point file, for some reason. When I try to do the exact same thing in a different file and create the Direct3D window inside a function called from my entry-point file's WinMain, the alpha channel of the Direct3D window fails and it shows plain red. The WindowProc functions are the same for the single-file Direct3D window and the multi-file one.
I hope there are some things I could check without needing to post the code here, but if needed I can create minimal versions of the two to post.
Any help is very much appreciated!
I am currently cross-compiling a sprite engine under MinGW.
Therefore I have two questions.
The behavior of SDL is emulated by Emscripten through the WebGL layer;
I don't even have to link the SDL libraries when compiling with emcc.
First question:
If I initialize my app like this:
if(SDL_Init(SDL_INIT_VIDEO|SDL_INIT_AUDIO) == -1)return -1;
SDL_Surface *screen= SDL_SetVideoMode(640, 480, 24, SDL_SWSURFACE);
SDL_FillRect(screen, 0, SDL_MapRGB(screen->format, 255, 0, 0));
SDL_Flip(screen);
then I am NOT able to type text into a text field in the browser, but I am getting the SDL_Events.
All other browser inputs like checkboxes or select boxes are working.
If I initialize my app like this (Emscripten also works without SDL_Init!):
SDL_Surface *screen= SDL_SetVideoMode(640, 480, 24, SDL_HWSURFACE);
SDL_FillRect(screen, 0, SDL_MapRGB(screen->format, 255, 0, 0));
SDL_Flip(screen);
then I am able to type text into the browser text field, but I am not getting SDL_Events.
Is there a workaround to get both the browser text input fields and the SDL_Events working?
Second question:
This line of code, compiled on my Win32 system, fills the screen blue:
SDL_FillRect(screen,NULL, SDL_MapRGB(screen->format, 255, 0, 0));
The same line compiled with Emscripten fills the screen red.
Is there a way to swap the SDL colors in Emscripten or in the SDL headers?
From your native code, add this before the SDL_CreateWindow call:
SDL_SetHint(SDL_HINT_EMSCRIPTEN_KEYBOARD_ELEMENT, "#canvas");
More info here: https://wiki.libsdl.org/SDL_HINT_EMSCRIPTEN_KEYBOARD_ELEMENT
Emscripten, by default, captures all user events on the page. This makes sense for a fullscreen game, for example. In your use case, you probably want to modify Emscripten's SDL_Init so that it does not listen to key events, or change its receiveEvent return value.
Had the same problem. For me, doing this fixed it:
Module['doNotCaptureKeyboard'] = true;
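If you'd rather keep everything in the C/C++ source, a sketch of the same idea using EM_ASM from emscripten.h (whether this works may depend on the Emscripten version; the key point is setting the flag before SDL_Init):
#include <emscripten.h>
#include <SDL/SDL.h>

int main()
{
    // Same flag as above, set from C/C++ before SDL_Init so the SDL layer
    // does not grab keyboard input away from browser text fields.
    EM_ASM(
        Module.doNotCaptureKeyboard = true;
    );

    if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO) == -1)
        return -1;

    // ... rest of the setup from the question ...
    return 0;
}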