I have built a C++ Allegro map editor. One of the requests was to have a log, so I print every move that is made to the console window... The problem is that the console window sits under the main window (I used GFX_AUTODETECT_WINDOWED), but whenever I try to move that window, the program simply crashes. I need to be able to move the main window, switch to the console window, and come back to the map editor. Does anybody have any ideas?
Here's the main part of my code.
#include <allegro.h>
#include <string>
#include <sstream>
#include "Layout.h"
#include "System.h"
#include "Map.h"
#include <iostream>
#include <fstream>
using namespace std;
// Allegro Functions to stabilize speed
volatile long speed_counter = 0;
void increment_speed_counter() // A function to increment the speed counter
{
    speed_counter++;
}
END_OF_FUNCTION(increment_speed_counter);

int main()
{
    System system;                          // Initialising Allegro
    system.Setup();
    Map map1;                               // Creating default map
    map1.createMap();
    BITMAP *buffer = create_bitmap(24,45);  // Double buffering
    LOCK_VARIABLE(speed_counter);           // Used to set the timer - which regulates the game's
    LOCK_FUNCTION(increment_speed_counter); // speed.
    install_int_ex(increment_speed_counter, BPS_TO_TIMER(8)); // Set our BPS

    /* game loop */
    while( !key[KEY_ESC] )
    {
        clear_bitmap(buffer); // Clear the contents of the buffer bitmap
        while(speed_counter > 0)
        {
            if(mouse_b & 1){ // On mouse click
                map1.catchMouseEvent(mouse_x, mouse_y);
                while(mouse_b & 1){}
            }
            speed_counter--;
        }
        rectfill(buffer, 0, 0, 25, 45, makecol(135,206,250));
        textprintf_ex(buffer, map1.getLayout().getFont(), 0, 0, makecol(255, 255, 255), -1, "%d", map1.getRowVal());
        textprintf_ex(buffer, map1.getLayout().getFont(), 0, 20, makecol(255, 255, 255), -1, "%d", map1.getColVal());
        blit(buffer, screen, 0, 0, 970, 50, 100, 50);
    }

    /* Free memory after */
    destroy_bitmap( buffer );
    return 0;
    allegro_exit();
}
END_OF_MAIN();
Also, it sometimes crashes by itself without me moving the window. There is no specific trigger; it just crashes at random times.
Any ideas, someone?
Without seeing all of the code, it's impossible to know why or where it's crashing. If you use a debugger, it should be obvious what's happening. You should be checking return codes: e.g., when you load or create a bitmap, make sure it's not NULL.
I'm not really sure what you are trying to do with such a small double buffer. Typically you create a single buffer the same size as the window. Note that Allegro 4 will only work properly if the screen width is a multiple of four. Also, you should call set_color_depth(desktop_color_depth()) (before setting the graphics mode) for maximum compatibility.
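For reference, here's a minimal sketch of that kind of setup (illustrative Allegro 4 boilerplate with a placeholder window size, not a fix for the crash in the question):

#include <allegro.h>

int main()
{
    allegro_init();
    install_keyboard();
    install_timer();

    // Match the desktop colour depth before setting the graphics mode.
    set_color_depth(desktop_color_depth());
    if (set_gfx_mode(GFX_AUTODETECT_WINDOWED, 1024, 768, 0, 0) != 0) {
        allegro_message("set_gfx_mode failed: %s", allegro_error);
        return 1;
    }

    // One buffer the same size as the screen, and check the return value.
    BITMAP *buffer = create_bitmap(SCREEN_W, SCREEN_H);
    if (!buffer) {
        allegro_message("create_bitmap failed");
        return 1;
    }

    while (!key[KEY_ESC]) {
        clear_to_color(buffer, makecol(135, 206, 250));
        // ... draw the map and HUD into buffer here ...
        blit(buffer, screen, 0, 0, 0, 0, SCREEN_W, SCREEN_H);
    }

    destroy_bitmap(buffer);
    return 0;
}
END_OF_MAIN()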
I want to make a class that can print a wide character to the console in a specific RGB color. I know the console only has 16 colors, but take a look first.
Every color in the console palette can be changed by setting the right buffer values, so I wrote something like this:
//ConsolePX
#include <fcntl.h>
#include <io.h>
#include <Windows.h>
#include <iostream>

class ConsolePX
{
public:
    wchar_t source;
    COLORREF foreground, background;

    /* Set at the start ctor */
    ConsolePX(wchar_t _what, COLORREF foregroundColor, COLORREF backgroundColor)
    {
        source = _what;
        foreground = foregroundColor;
        background = backgroundColor;
    }

    /* Draws wchar_t with colors to console */
    void Draw() {
        HANDLE outH = GetStdHandle(STD_OUTPUT_HANDLE);
        CONSOLE_SCREEN_BUFFER_INFOEX curr, newBuff;
        curr.cbSize = sizeof(CONSOLE_SCREEN_BUFFER_INFOEX);
        GetConsoleScreenBufferInfoEx(outH, &curr);
        curr.srWindow.Bottom++;
        newBuff = curr;
        newBuff.ColorTable[0] = background;
        newBuff.ColorTable[1] = foreground;
        SetConsoleScreenBufferInfoEx(outH, &newBuff);
        SetConsoleTextAttribute(outH, 1);
        _setmode(_fileno(stdout), _O_U16TEXT); //Sets console mode to 16-bit unicode
        std::wcout << source << std::endl;
        _setmode(_fileno(stdout), _O_TEXT);
        //Restores to defaults
        SetConsoleTextAttribute(outH, 7);
        SetConsoleScreenBufferInfoEx(outH, &curr);
    }
};

//Driver code
#include "ConsolePX.h"

int main()
{
    ConsolePX(L'█', RGB(29, 219, 79), RGB(0, 0, 0)).Draw();
    return 0;
}
And that worked, but the problem is the last line in ConsolePX (specifically SetConsoleScreenBufferInfoEx(outH, &curr)). After printing the wchar_t, I restore the palette to its defaults. Why is that a problem? I noticed that each character in the console isn't tied to a color but to a palette index, so by restoring the default palette I also restored the wchar_t's color. If I delete that line, I'll interfere with the rest of the code. Is there any way to lock the character at position (x, y) in the console so its color doesn't change?
One important detail: I'm using Visual Studio and, as you can guess, Windows.
No.
You said it yourself: there are 16 colours available to you.
When you thought you'd bypassed that restriction, actually all you were doing was changing what those colours "mean", i.e. what RGB values they map to for that console.
The current palette applies to the whole content of the console. If it didn't, we wouldn't be limited to 16 colours.
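To see what that means in practice, here is a small sketch of my own (using the same Windows console calls as the question, purely as an illustration, not a fix): once a palette entry is redefined, text that was already printed with that attribute index changes colour as well.

#include <Windows.h>
#include <iostream>

int main()
{
    HANDLE outH = GetStdHandle(STD_OUTPUT_HANDLE);

    // Print something with palette index 1.
    SetConsoleTextAttribute(outH, 1);
    std::cout << "printed first" << std::endl;

    // Redefine what index 1 means, using the same calls as the question.
    CONSOLE_SCREEN_BUFFER_INFOEX info;
    info.cbSize = sizeof(info);
    GetConsoleScreenBufferInfoEx(outH, &info);
    info.srWindow.Bottom++;                  // same workaround as in the question
    info.ColorTable[1] = RGB(255, 0, 255);
    SetConsoleScreenBufferInfoEx(outH, &info);

    // The first line is now magenta too: each cell stores the index, not the RGB value.
    SetConsoleTextAttribute(outH, 1);
    std::cout << "printed second" << std::endl;

    SetConsoleTextAttribute(outH, 7);        // back to the default attribute
    return 0;
}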
So, while your attempt is inventive, I'm afraid it fundamentally cannot work.
If you want control over real colours like this, make a GUI application.
I have encountered strange behaviour while attempting to write a roguelike. I've made a simple loop printing letters in a filled rectangle shape. With the normal (stdscr) window, or a window newly initialised with derwin(), everything works fine.
[Screenshot: the loop within stdscr / a window newly initialised with derwin()]
But the issue appears after I return the window pointer from the Game class. Letters seem to be printed without any pattern, and parts of the window look as if they are covered by something.
[Screenshot: the loop when the pointer is returned]
I've tried debugging, but didn't succeed. The cursor moves, the loop works, letters are printed, but sometimes they get stuck on some astral projection level and don't show up.
Here is the code: Game.cpp
Game::Game() : m_winMode(WinMode::GameWin) {
    [...]
    initscr();
    wresize(stdscr, WIN_HGHT, WIN_WDTH);
    m_gameWin = derwin(stdscr, GAMEWIN_HGHT, GAMEWIN_WDTH, 0, 0);
    [...]
}

WINDOW * Game::getWindow(Game::WinMode t_mode) const {
    [...]
    switch (t_mode) {
    case Game::WinMode::GameWin:
        return m_gameWin;
        break;
    [...]
}
pdcurses-test.cpp - this is the main file
#include "stdafx.h"
#include "Game.h"
#include "Map.h"
int main() {
Game game;
game.prepareScreen();
WINDOW * test = game.getWindow(Game::WinMode::GameWin);
wclear(test);
for (int i = 0; i <= 48; i++) {
for (int y = 0; y <= 120; y++) {
mvwaddch(test, i, y, '%');
}
}
wrefresh(test);
Here is the full code: github.com/gebirgestein/pdcurses-test/tree/test/pdcurses-test/pdcurses-test
Thanks in advance.
Calling subwin creates and returns a pointer to a new window with the given number of lines, nlines, and columns, ncols. The window is at position (begin_y, begin_x) on the screen. (This position is relative to the screen, and not to the window orig.) The window is made in the middle of the window orig, so that changes made to one window will affect both windows. The subwindow shares memory with the window orig. When using this routine, it is necessary to call touchwin or touchline on orig before calling wrefresh on the subwindow.
Calling derwin is the same as calling subwin, except that begin_y and begin_x are relative to the origin of the window orig rather than the screen. There is no difference between the subwindows and the derived windows.
(From the subwin/derwin documentation.) Try calling touchwin(stdscr) before calling wrefresh(test).
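Applied to the posted main file, that would look roughly like this (a sketch reusing the question's own loop, not tested against the full repository):

    WINDOW * test = game.getWindow(Game::WinMode::GameWin);
    wclear(test);
    for (int i = 0; i <= 48; i++) {
        for (int y = 0; y <= 120; y++) {
            mvwaddch(test, i, y, '%');
        }
    }
    touchwin(stdscr); // mark the parent window as changed...
    wrefresh(test);   // ...so refreshing the subwindow actually repaints it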
I am currently working on connecting OGRE and SFML.
SFML should be used for 2D drawing, network stuff and input.
OGRE is for the 3D graphics.
Currently the whole thing is on Linux.
What works: connecting OGRE and SFML. First I create an SFML render window, then I grab the handle of this window and give it to the OGRE render window while creating it. I can use the SFML events now. I did not test the network stuff, but I am sure this will work too.
What does not work: Drawing in the SFML window.
Case 1: SFML and OGRE are not connected. OGRE does not have the SFML window handle and has its own window. SFML still can't draw in its own window! The main loop executes at most 3 times and then just stops. Nothing more happens. A few seconds later (about 20 or so) I get a memory access violation and the program ends.
Case 2: SFML and OGRE are connected. A similar thing happens: the main loop executes exactly 53 times, nothing gets drawn, and then the program stops with the terminal message "aborted" (actually it's "Abgebrochen", because it's in German).
The strange behaviour also happens when I let SFML draw into an sf::RenderTexture instead of the sfml_window.
Here is my code:
#include <SFML/Graphics.hpp>
#include <SFML/Window.hpp>
#include <SFML/System.hpp>
#include <iostream>
#include <OGRE/Ogre.h>
#include <vector>
#include <stdio.h>
int main(int argc, char * argv[])
{
    if(argc == 1)
        return -1;

    // start with "1" and you get 1 window, start with "0" and you get two
    bool together = atoi(argv[1]);

    // create the SFML window
    sf::RenderWindow sfml_window(sf::VideoMode(800, 600), "test");
    sf::WindowHandle sfml_system_handle = sfml_window.getSystemHandle();
    sfml_window.setVerticalSyncEnabled(true);
    std::cout<<sfml_system_handle<<std::endl;

    // init ogre
    Ogre::Root * ogre_root = new Ogre::Root("", "", "");

    std::vector<Ogre::String> plugins;
    plugins.push_back("/usr/lib/x86_64-linux-gnu/OGRE-1.8.0/RenderSystem_GL");
    for(auto p : plugins)
    {
        ogre_root->loadPlugin(p);
    }

    const Ogre::RenderSystemList& render_systems = ogre_root->getAvailableRenderers();
    if(render_systems.size() == 0)
    {
        std::cerr<<"no rendersystem found"<<std::endl;
        return -1;
    }
    Ogre::RenderSystem * render_system = render_systems[0];
    ogre_root->setRenderSystem(render_system);
    ogre_root->initialise( false, "", "");

    // create the ogre window
    Ogre::RenderWindow * ogre_window = NULL;
    {
        Ogre::NameValuePairList parameters;
        parameters["FSAA"] = "0";
        parameters["vsync"] = "true";
        // if started with 1, connect the windows
        if(together) parameters["externalWindowHandle"] = std::to_string(sfml_system_handle);
        ogre_window = ogre_root->createRenderWindow("ogre window", 800, 600, false, &parameters);
    }

    // ogre stuff
    Ogre::SceneManager * scene = ogre_root->createSceneManager(Ogre::ST_GENERIC, "Scene");
    Ogre::SceneNode * root_node = scene->getRootSceneNode();
    Ogre::Camera * cam = scene->createCamera("Cam");
    Ogre::SceneNode * cam_node = root_node->createChildSceneNode("cam_node");
    cam_node->attachObject(cam);

    Ogre::Viewport * vp = ogre_window->addViewport(cam);
    vp->setAutoUpdated(false);
    vp->setBackgroundColour(Ogre::ColourValue(0, 1, 1));
    ogre_window->setAutoUpdated(false);
    ogre_root->clearEventTimes();

    //sfml image loading
    sf::Texture ring;
    std::cout<<"ring loading: "<<ring.loadFromFile("ring.png")<<std::endl;
    sf::Sprite sprite;
    sprite.setTexture(ring);

    // main loop
    int counter = 0;
    while(!ogre_window->isClosed() && sfml_window.isOpen())
    {
        std::cout<<counter<<std::endl;
        counter++;
        std::cout<<"a"<<std::endl;

        // sfml events
        sf::Event event;
        while(sfml_window.pollEvent(event))
        {
            if(event.type == sf::Event::Closed)
            {
                sfml_window.close();
            }
        }
        std::cout<<"b"<<std::endl;
        std::cout<<"c"<<std::endl;

        ogre_root->renderOneFrame();
        std::cout<<"d"<<std::endl;

        // here the strange behaviour happens
        // if this line (draw) isn't present, everything works
        sfml_window.pushGLStates();
        sfml_window.draw(sprite);
        sfml_window.popGLStates();

        vp->update();
        std::cout<<"e"<<std::endl;

        sfml_window.display();
        // only needs to be done for separated windows
        // sfml display updates otherwise, both use double buffering
        if(!together) ogre_window->update(true);
    }
    return 0;
}
Help would be really appreciated.
EDIT: I added the pushGLStates(); and popGLStates(); commands, forgot those earlier!
Not an answer really, but too long for a comment:
ogre_window = ogre_root->createRenderWindow("ogre window", 800, 600, false, &parameters);
Are you sure that it's okay to pass, with &parameters, the address of an object that is destroyed on the very next line?
I have been attempting to get a simple SDL2 program up to display an image and then exit. I have this code:
/* compile with `gcc -lSDL2 -o main main.c` */
#include <SDL2/SDL.h>
#include <SDL2/SDL_video.h>
#include <SDL2/SDL_render.h>
#include <SDL2/SDL_surface.h>
#include <SDL2/SDL_timer.h>
int main(void){
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window * w = SDL_CreateWindow("Hi", 0, 0, 640, 480, 0);
    SDL_Renderer * r = SDL_CreateRenderer(w, -1, 0);
    SDL_Surface * s = SDL_LoadBMP("../test.bmp");
    SDL_Texture * t = SDL_CreateTextureFromSurface(r, s);
    SDL_FreeSurface(s);
    SDL_RenderClear(r);
    SDL_RenderCopy(r, t, 0, 0);
    SDL_RenderPresent(r);
    SDL_Delay(2000);
    SDL_DestroyTexture(t);
    SDL_DestroyRenderer(r);
    SDL_DestroyWindow(w);
    SDL_Quit();
}
I am aware that I have omitted the normal checks that each function succeeds; they all do succeed, and they were removed for ease of reading. I am also aware I have used 0 rather than null pointers or the correct enum values; this also is not the cause of the issue (the same issue occurs when I structure the program correctly; this was a quick test case drawn up to exercise the simplest case).
The intention is that a window appears, shows the image (which is definitely at that path), waits for a couple of seconds, and exits. The result, however, is that the window appears correctly but is filled with black.
An extra note: SDL_ShowSimpleMessageBox() appears to work correctly. I don't know how this relates to the rest of the framework, though.
I'll just clear this up: you wanted to make a texture, so do it directly; this eases things and gives you better control over the image. Try using this code; it is fully tested and working. All you wanted was for the window to show the image and close within 2 seconds, right? If the image doesn't load, then the problem is your image's location.
/* compile with something like `g++ main.cpp -o main $(sdl2-config --cflags --libs) -lSDL2_image` */
#include <SDL.h>
#include <SDL_image.h>
#include <iostream> // I included it since I used cout

int main(int argc, char* argv[]){
    bool off = false;
    int time;
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window * w = SDL_CreateWindow("Hi", 0, 0, 640, 480, SDL_WINDOW_SHOWN);
    SDL_Renderer * r = SDL_CreateRenderer(w, -1, SDL_RENDERER_ACCELERATED);
    SDL_Texture * s = NULL;
    s = IMG_LoadTexture(r, "../test.bmp"); // LOADS A TEXTURE DIRECTLY FROM THE IMAGE
    if (s == NULL)
    {
        std::cout << "FAILED TO FIND THE IMAGE" << std::endl; // this checks whether IMG_LoadTexture found the image; if this message shows up in the cmd window (the black system-like one), the image can't be found
    }
    SDL_Rect s_rect; // CREATES THE IMAGE'S SPECS
    s_rect.x = 100;  // just like the window, the x and y values determine its displacement from the origin, which is the top left of your window
    s_rect.y = 100;
    s_rect.w = 640;  // width of the texture
    s_rect.h = 480;  // height of the texture
    time = SDL_GetTicks(); // Gets the current time
    while (SDL_GetTicks() < time + 2000) // keeps rendering until 2 seconds have passed since the time you recorded
    {
        SDL_RenderClear(r);
        SDL_RenderCopy(r, s, NULL, &s_rect); // THE NULL IS THE AREA YOU COULD USE TO CROP THE IMAGE
        SDL_RenderPresent(r);
    }
    SDL_DestroyTexture(s);
    SDL_DestroyRenderer(r);
    SDL_DestroyWindow(w);
    return 0; // returns 0, closes the program
}
If you want the window's close button to show and take effect, create an event and check it inside the while loop to see whether it equals SDL_QUIT. I didn't include that, since you wanted the window to close within 2 seconds anyway, which renders the close button useless.
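For completeness, here is a rough, self-contained sketch of that event check (my own addition, not part of the original answer; texture loading and drawing are omitted for brevity). It keeps the window open for up to 2 seconds but leaves earlier if the close button is clicked:

#include <SDL.h>

int main(int argc, char* argv[])
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window * w = SDL_CreateWindow("Hi", 0, 0, 640, 480, SDL_WINDOW_SHOWN);
    SDL_Renderer * r = SDL_CreateRenderer(w, -1, SDL_RENDERER_ACCELERATED);

    Uint32 start = SDL_GetTicks();
    bool running = true;
    while (running && SDL_GetTicks() < start + 2000)
    {
        SDL_Event e;
        while (SDL_PollEvent(&e))
        {
            if (e.type == SDL_QUIT)  // the close button was clicked
                running = false;
        }
        SDL_RenderClear(r);
        // ... SDL_RenderCopy the texture here, as in the answer above ...
        SDL_RenderPresent(r);
    }

    SDL_DestroyRenderer(r);
    SDL_DestroyWindow(w);
    SDL_Quit();
    return 0;
}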
HAPPY CODING :D
When using SDL_RENDERER_SOFTWARE for the renderer flags this worked. Also it worked on a different machine. Guess there must be something screwed up with my configuration, although I'm not sure what it is because I'm still getting no errors shown. Ah well, mark as solved.
I believe this to be (not 100% sure, but fairly sure) due to this line of code:
SDL_Renderer * r = SDL_CreateRenderer(w, -1, 0);
According to the SDL wiki article SDL_CreateRenderer, the parameters required for the arguments that you are passing in are as follows:
SDL_Window* window
int index
Uint32 flags
You are passing in the pointer to the window correctly, as well as the index, but the lack of a flag signifies to SDL that it should use the default renderer.
Most systems have a default setting for which renderer applications should use, and this can be modified on an application-by-application basis. If no default setting is provided for a specific application, the renderer lookup immediately checks the default render settings list. The SDL wiki briefly refers to this list with the following line at the bottom of the remarks section:
"Note that providing no flags gives priority to available SDL_RENDERER_ACCELERATED renderers."
What's not explained here in the wiki is that the "renderers" the wiki is referring to comes from the default renderer list.
This leads me to believe that you have changed a setting somewhere in the course of your computer's history, or somewhere in your Visual Studio settings, that results in no list being found. Therefore you must explicitly inform SDL which renderer to use because of your machine settings. Otherwise, an argument of 0 should work just fine. In the end this doesn't hurt, as it's better to be explicit in your code rather than implicit whenever possible.
(That said, all of my deductions are based on the assumption that everything you said works actually does work. If that is not true, then the issue could stem from a vast variety of reasons due to the lack of error checking.)
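One quick way to narrow this down (my own suggestion, not part of the answer above) is to request a renderer explicitly and print SDL_GetError() whenever a call returns NULL; SDL_RENDERER_SOFTWARE is used here only because the asker reported that it worked:

#include <SDL2/SDL.h>
#include <stdio.h>

int main(void){
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }
    SDL_Window * w = SDL_CreateWindow("Hi", 0, 0, 640, 480, 0);
    if (w == NULL) {
        fprintf(stderr, "SDL_CreateWindow failed: %s\n", SDL_GetError());
        return 1;
    }
    /* Ask for the software renderer explicitly instead of passing 0. */
    SDL_Renderer * r = SDL_CreateRenderer(w, -1, SDL_RENDERER_SOFTWARE);
    if (r == NULL) {
        fprintf(stderr, "SDL_CreateRenderer failed: %s\n", SDL_GetError());
        return 1;
    }
    /* ... load the BMP, create the texture, and render it as in the question ... */
    SDL_DestroyRenderer(r);
    SDL_DestroyWindow(w);
    SDL_Quit();
    return 0;
}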
I'm porting some code from Windows to Xlib. In the Windows code, I can force a redraw by calling InvalidateRect and then handling the corresponding WM_PAINT message. However, I am having trouble finding out how to do this in X11/Xlib. I see there is an Expose message, but I'm not sure if that is the same thing.
If it matters, I need to do this to force the window to render at a certain frame rate for an OpenGL based program.
To expand slightly on the useful answers given by BЈовић,
With raw Xlib you can draw at any time in a single thread, because every Xlib function specifies the full display, window, and context. AFAIK, with multithreading all bets are off.
You also must have an Expose event handler, and select for those events, if you're in a desktop environment. And it won't hurt to have one even if you're writing a full screen program.
Most toolkits are not as flexible and only draw in a designated event handler (but much nicer to use in many other ways) and have some equivalent to the Windows InvalidateRect. In raw Xlib you get the same effect by sending yourself an Expose event. Doing so won't lead to any real performance problems and will make the code more understandable by other programmers, and easier to port, so you might as well.
There are also XClearArea and XClearWindow functions which will generate Expose events for you, but they first erase part/all with the background color, which might lead to flickering.
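As a rough illustration of that (my own addition, reusing the dis and win handles from the tutorial program further down): passing 0 for width and height makes the call cover the whole window, and the final True asks the server to generate Expose events for the cleared area.

    XClearArea(dis, win, 0, 0, 0, 0, True);
    XFlush(dis);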
With OpenGL it gets a bit more complicated because you have to work with GLX as well. I have a very simple OpenGL/Xlib program online at
http://cs.anu.edu.au/~hugh.fisher/3dteach/
which might be useful as an example.
You need to handle Expose events. This tutorial explains, with an example, how to handle Expose events:
#include <stdio.h>
#include <stdlib.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>
#include <X11/Xos.h>
#include <X11/Xatom.h>
#include <X11/keysym.h>
/*Linux users will need to add -ldl to the Makefile to compile
*this example.
*/
Display *dis;
Window win;
XEvent report;
GC green_gc;
XColor green_col;
Colormap colormap;
/*
Try changing the green[] = below to a different color.
The color can also be from /usr/X11R6/lib/X11/rgb.txt, such as RoyalBlue4.
A # (number sign) is only needed when using hexadecimal colors.
*/
char green[] = "#00FF00";
int main() {
    dis = XOpenDisplay(NULL);
    win = XCreateSimpleWindow(dis, RootWindow(dis, 0), 1, 1, 500, 500, 0, BlackPixel(dis, 0), BlackPixel(dis, 0));
    XMapWindow(dis, win);
    colormap = DefaultColormap(dis, 0);
    green_gc = XCreateGC(dis, win, 0, 0);
    XParseColor(dis, colormap, green, &green_col);
    XAllocColor(dis, colormap, &green_col);
    XSetForeground(dis, green_gc, green_col.pixel);

    XSelectInput(dis, win, ExposureMask | KeyPressMask | ButtonPressMask);

    XDrawRectangle(dis, win, green_gc, 1, 1, 497, 497);
    XDrawRectangle(dis, win, green_gc, 50, 50, 398, 398);
    XFlush(dis);

    while (1) {
        XNextEvent(dis, &report);
        switch (report.type) {
        case Expose:
            fprintf(stdout, "I have been exposed.\n");
            XDrawRectangle(dis, win, green_gc, 1, 1, 497, 497);
            XDrawRectangle(dis, win, green_gc, 50, 50, 398, 398);
            XFlush(dis);
            break;
        case KeyPress:
            /* Close the program if q is pressed. */
            if (XLookupKeysym(&report.xkey, 0) == XK_q) {
                exit(0);
            }
            break;
        }
    }
    return 0;
}
I may have misunderstood the question. If you want to generate Expose events in your application, you can create and fill in an Expose event and send it using XSendEvent.
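A minimal sketch of that (my own addition; it reuses the dis and win handles from the example above and assumes <string.h> is included for memset):

    XEvent ev;
    memset(&ev, 0, sizeof(ev));
    ev.type = Expose;
    ev.xexpose.display = dis;
    ev.xexpose.window = win;
    ev.xexpose.x = 0;
    ev.xexpose.y = 0;
    ev.xexpose.width = 500;   // the example window is 500x500
    ev.xexpose.height = 500;
    ev.xexpose.count = 0;     // no further Expose events follow

    XSendEvent(dis, win, False, ExposureMask, &ev);
    XFlush(dis);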