I'm developing a prototype application to generate bitmap fonts from TTF files.
I'm using the Qt library, which I've been using for years. However, I realized I've never dealt with "character" issues before.
What I'm trying to do is quite simple:
I need to draw each character of the 1252 codepage into a PNG file. I'm a bit lost with the different issues related to codecs, text decoders, etc.
Any suggestion is welcome!
Z.
1. The codec that you want is QTextCodec::codecForName("Windows-1252").
2. The characters that you want are char(32) to char(255); put those in a char[225]. Don't forget to zero-terminate it.
3. Convert that char[225] to a QString with the codec from (1).
4. Draw the QString from (3).
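A minimal sketch putting those four steps together, assuming Qt 5 (QTextCodec lives in the Core5Compat module in Qt 6); the image size, font and output file names are just placeholders:

#include <QGuiApplication>
#include <QTextCodec>
#include <QImage>
#include <QPainter>
#include <QString>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);   // needed for the font database

    QTextCodec *codec = QTextCodec::codecForName("Windows-1252");    // step 1

    for (int c = 32; c <= 255; ++c) {
        const char raw[2] = { static_cast<char>(c), '\0' };          // step 2
        const QString text = codec->toUnicode(raw);                  // step 3

        QImage image(64, 64, QImage::Format_ARGB32);
        image.fill(Qt::white);

        QPainter painter(&image);
        painter.setFont(QFont("Arial", 32));                         // placeholder font
        painter.drawText(image.rect(), Qt::AlignCenter, text);       // step 4
        painter.end();

        image.save(QString("char_%1.png").arg(c));                   // one PNG per character
    }
    return 0;
}

Note that a few codepoints in that range (0x81, 0x8D, 0x8F, 0x90, 0x9D) are undefined in Windows-1252, so you may want to skip or special-case them.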
I am attempting to put the final touches on a maze program I have been writing. I have used Unicode characters to delimit the walls and paths; however, because of the (horizontal) line spacing I can't quite get it to look compact enough. I have attached two screenshots. I'm just printing a newline "\n" at the end of each row. Can the distance between lines be adjusted, or am I stuck with this "gappy" representation?
My output:
What I am trying to closely represent:
Assuming you aren't printing double newlines, this is outside the scope of standard C++; it provides no facilities for controlling the terminal in a standard way.
Solutions:
You could provide a launcher script which opens a new terminal window with a specific font and runs your app in it.
You could use some platform-specific method to change the background color (ANSI escape codes work in Unix-like terminals, the Win32 API in the Windows console, or the ncurses library on Unix-like environments) and print just spaces in different colors; a minimal sketch of this follows below.
Use a GUI library/framework to get complete control over what is drawn (I'd use Qt for a C++ GUI app).
TBH, if you want pixel-accurate rendering, use a proper rendering API such as OpenGL.
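For the ANSI-escape idea above, here is a rough sketch, assuming a VT100/xterm-compatible terminal (the classic Windows console needs virtual terminal processing enabled first); the maze data is just an example:

#include <iostream>
#include <string>
#include <vector>

int main()
{
    // Hypothetical maze: '#' is a wall, ' ' is a path.
    const std::vector<std::string> maze = {
        "#########",
        "#   #   #",
        "# # # # #",
        "# #   # #",
        "#########",
    };

    for (const std::string &row : maze) {
        for (char cell : row) {
            if (cell == '#')
                std::cout << "\033[47m  \033[0m";   // two spaces on a white background
            else
                std::cout << "  ";                  // two plain spaces keep cells roughly square
        }
        std::cout << '\n';                          // exactly one newline per maze row
    }
    return 0;
}

Because the walls are drawn with the cell background rather than a glyph, the gaps between text lines no longer show.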
From a text rendering point of view, you don't say what you are rendering to. Assuming something like a terminal console or shell window, the layout beyond characters and newlines has nothing to do with your program; the visual representation is entirely determined by the shell you are rendering to.
Firstly, check that you are genuinely printing one line per maze scan line and not interleaving spurious newlines. Assuming that is ruled out, the problem is that the Unicode glyph is not a full block, so you must either set the font or choose another glyph which is a full block.
Usually console windows are 80 characters wide by 22 or 24 characters high, and characters are 8 pixels wide by 19 pixels high. So it's very far from a square grid, and you might want to bias the maze to reflect that and provide a better visual appearance (e.g. make 2-pixel-wide vertical corridors much more common than 2-pixel-wide horizontal corridors).
Do check out the fonts in the binary image library; you might find them useful.
https://github.com/MalcolmMcLean/binaryimagelibrary
I'd like to ask what's the simplest way of writing the chess Unicode characters in a console window in C++? (♙♘♗♖♕♔♟♞♝♜♛♚) They are part of the "Miscellaneous Symbols" block in Unicode. https://en.wikipedia.org/wiki/Chess_symbols_in_Unicode
I also want to print the characters with a square size; right now my chess board is not square, because each character cell is a rectangle rather than a square.
It would also be good to be able to write with ordinary non-square characters below the chess board, but that might be impossible? Can you mix different fonts/formatting in the same console window?
Ok, thanks in advance! :)
The first part of your question, outputting those characters, is platform-dependent. Linux consoles often use UTF-8; if that is the case, you can encode the characters as UTF-8 and write them to standard output. On Windows you can use the Console API (the WriteConsole function):
#include <windows.h>

HANDLE handle = GetStdHandle(STD_OUTPUT_HANDLE);
DWORD written = 0;
// explicitly call the wide version (which always accepts UTF-16)
WriteConsoleW(handle, L"\u2658", 1, &written, NULL);
One caveat which is hard to work around is that you need a console font containing those characters.
For getting square cells, this is dependent on a lot of specifics about the way the console renders text. If it uses font substitution, then there is a chance the text will not actually be monospaced.
Now, if you have a console font with these characters, and if that font is monospaced, then you may be able to draw a square board by adding some spacing between the characters. You can use block elements like ▌ U+258C — LEFT HALF BLOCK to draw the chequerboard: ▌♘▐█▌ ▐.
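For illustration, a tiny sketch of printing such rows, assuming a UTF-8 terminal, a source file saved as UTF-8, and a monospaced font that contains these glyphs; the row contents are just an example:

#include <iostream>

int main()
{
    // ▌ = U+258C, ▐ = U+2590, █ = U+2588, ♘ = U+2658 WHITE CHESS KNIGHT
    std::cout << "▌♘▐█▌ ▐█\n";
    std::cout << "█▌ ▐█♘▐█\n";   // a second, purely illustrative row
    return 0;
}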
I am working on an embedded platform (STM32F407) with a TFT LCD as a display (480x800 px) and would like to make my user interface somewhat customizable by the end user. I figured the best source of fonts would be Windows-compatible ones, as they're the most common.
My current implementation uses my own custom-drawn font in a binary format plus a descriptor table giving each character's width and ASCII value, but having to draw my own font bit by bit is tedious.
I would like to read a TrueType font file from an SD card and be able to use the different-sized glyphs inside it, but I have not seen a straightforward explanation of how to actually achieve this magic. Can somebody point me to a good C/C++ example of what I am looking for?
Even better, as a way to iron out the kinks, I would like to make a simple gcc command-line program that will print out my input with a selected font, using '#' as pixels. That way I can just worry about the implementation and not any other random bugs that might pop up.
Can anybody help me out?
Perhaps you can use the FreeType library.
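A rough sketch of the '#'-as-pixels test program mentioned in the question, assuming FreeType 2 is available (build with pkg-config --cflags --libs freetype2); the font path and pixel size are placeholders:

#include <ft2build.h>
#include FT_FREETYPE_H
#include <cstdio>

int main()
{
    FT_Library library;
    FT_Face face;

    if (FT_Init_FreeType(&library))
        return 1;
    // Hypothetical font path: replace with a TTF from your SD card.
    if (FT_New_Face(library, "font.ttf", 0, &face))
        return 1;

    FT_Set_Pixel_Sizes(face, 0, 24);                 // 24 px tall glyphs

    if (FT_Load_Char(face, 'A', FT_LOAD_RENDER))     // rasterize 'A' into an 8-bit bitmap
        return 1;

    FT_Bitmap *bmp = &face->glyph->bitmap;
    for (unsigned int row = 0; row < bmp->rows; ++row) {
        for (unsigned int col = 0; col < bmp->width; ++col)
            std::putchar(bmp->buffer[row * bmp->pitch + col] ? '#' : ' ');
        std::putchar('\n');
    }

    FT_Done_Face(face);
    FT_Done_FreeType(library);
    return 0;
}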
As duskwuff says, TTF is primarily a vector format, so you would need to write a renderer. You're better off using an image file to define the font, or using a bitmap font format like FNT (Windows) or BDF (Unix).
Here is my answer to my own question: AngelCode's BMFont and its usage. It lets you choose selected characters from the installed character set, mix in a font, and export an image with a map file for each character. Simple to use.
I want to show Unicode text in my SDL program, but it doesn't render on screen correctly.
It renders from end to beginning, and the characters render separately (they should connect to each other).
You can see a screen shot here http://up.vatandownload.com/images/ea8d1c2kxpk5ehbjv2.png
SDL does not implement full Unicode text layout. It works for many languages, but Arabic (which has incredibly complex layout and glyph-selection rules) is not one of them. You will need to use either Pango or ICU's layout class to do your text layout if you need Arabic support.
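As an illustration of the Pango route, here is a rough sketch that lays out and shapes an Arabic string with Pango and Cairo, then writes the result to a PNG which could be uploaded as an SDL texture; the string, font name, surface size and output file are placeholders (build against pangocairo):

#include <pango/pangocairo.h>

int main()
{
    cairo_surface_t *surface =
        cairo_image_surface_create(CAIRO_FORMAT_ARGB32, 400, 100);
    cairo_t *cr = cairo_create(surface);

    PangoLayout *layout = pango_cairo_create_layout(cr);
    pango_layout_set_text(layout, "السلام عليكم", -1);        // UTF-8 input

    PangoFontDescription *desc = pango_font_description_from_string("Sans 32");
    pango_layout_set_font_description(layout, desc);
    pango_font_description_free(desc);

    cairo_set_source_rgb(cr, 1.0, 1.0, 1.0);                  // white text
    pango_cairo_show_layout(cr, layout);                      // bidi and shaping happen here

    cairo_surface_write_to_png(surface, "arabic.png");

    g_object_unref(layout);
    cairo_destroy(cr);
    cairo_surface_destroy(surface);
    return 0;
}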
I have a game that requires me to allow players to chat with each other via network. All is well, except the part where players can type in Unicode input.
So, the question can be split into two parts:
When players type, how do I capture input? I have done this before via the game's input handling (polling); however, it is not as responsive as something like Windows Forms.
After I capture input into a string, how do I output it using TrueType fonts? The reason I ask is that usually I would build bitmap fonts at the start of the game from all the text used in the game. But with Unicode input, there are nearly 10k characters that are needed, which is quite impossible to build at the start of the game.
P.S. My target input languages are more specific to Chinese, Korean and Japanese.
For Input
Use SDL_EnableUNICODE to enable Unicode input handling
Receive the SDL_KeyboardEvent as usual
Use the unicode member of SDL_keysym to get the Unicode value
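A tiny sketch of those three steps with the SDL 1.2 API; the chat-buffer handling is left out:

#include <SDL/SDL.h>

// Call once during initialisation.
void initUnicodeInput()
{
    SDL_EnableUNICODE(1);                             // 1. turn on Unicode translation
}

// Call from the game loop.
void pollChatInput()
{
    SDL_Event event;
    while (SDL_PollEvent(&event)) {
        if (event.type == SDL_KEYDOWN) {              // 2. the usual keyboard event
            Uint16 ch = event.key.keysym.unicode;     // 3. UTF-16 code unit of the key press
            if (ch != 0) {
                // append 'ch' to the chat input buffer here
            }
        }
    }
}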
For Rendering
If the needed font size is reasonably small, say 16 px, you could actually render it all to a single texture: you can fit at least 4096 glyphs on a 1024x1024 texture at that size, and a bit more when you pack them tightly (see fontgen for example code). That should be enough for common chat, but not enough to fit all the glyphs of a TTF file.
If you don't want to use a larger texture size, you have to generate the glyphs on demand; to do that, just create the textures as usual and then use glTexSubImage2D to upload new glyphs into the texture.
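For example, a small sketch of that upload step, assuming legacy/compatibility OpenGL, an existing single-channel atlas texture created with glTexImage2D, and a tightly packed 8-bit glyph bitmap (e.g. from FreeType):

#include <GL/gl.h>

void uploadGlyph(GLuint atlasTexture,
                 int xOffset, int yOffset,          // where the glyph goes in the atlas
                 int width, int height,
                 const unsigned char *pixels)       // glyph bitmap, tightly packed
{
    glBindTexture(GL_TEXTURE_2D, atlasTexture);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);          // rows are not 4-byte aligned
    glTexSubImage2D(GL_TEXTURE_2D, 0,
                    xOffset, yOffset, width, height,
                    GL_ALPHA, GL_UNSIGNED_BYTE, pixels);
}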
Another alternative is to not use textures for glyphs but for the text itself. That way you bypass all the trouble that glyph generation produces, but it's probably not such a good idea for non-static, editable text.
When players type, how do I capture input?
That depends on what you use, I guess. I'm not familiar with SDL. On Linux you can use the standard X functions and event loop; it works well (it's used in Quake, for example, so it should be responsive enough).
After I capture input into a string, how do I output it using TrueType Fonts?
You should have a look at FreeType2 library. It lets you load TrueType fonts, and retrieve the glyph (the image) of any character.
But with unicode input, there are nearly 10k characters that are needed, which is quite impossible to build at the start of the game.
I have the same problem. I guess a cache manager with the MRU (most recently used) characters would do the trick. A bit more complicated than a simple static bitmap, though.
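A rough sketch of such a cache, evicting the least recently used glyph when a fixed number of atlas slots runs out; the Glyph struct and the rasterize() body are placeholders for whatever the renderer actually does:

#include <cstddef>
#include <list>
#include <unordered_map>
#include <utility>

struct Glyph {
    int atlasSlot;          // placeholder: texture coordinates, metrics, ...
};

class GlyphCache {
public:
    explicit GlyphCache(std::size_t capacity) : capacity_(capacity) {}

    const Glyph &get(char32_t codepoint) {
        auto it = map_.find(codepoint);
        if (it != map_.end()) {
            // Cache hit: mark this codepoint as most recently used.
            order_.splice(order_.begin(), order_, it->second.second);
            return it->second.first;
        }
        if (map_.size() == capacity_) {
            // Cache full: evict the least recently used glyph.
            map_.erase(order_.back());
            order_.pop_back();
        }
        order_.push_front(codepoint);
        auto res = map_.emplace(codepoint,
                                std::make_pair(rasterize(codepoint), order_.begin()));
        return res.first->second.first;
    }

private:
    Glyph rasterize(char32_t /*codepoint*/) {
        // Placeholder: a real implementation would render the glyph
        // (e.g. with FreeType) and upload it into a free atlas slot.
        return Glyph{0};
    }

    std::size_t capacity_;
    std::list<char32_t> order_;   // front = most recently used
    std::unordered_map<char32_t,
        std::pair<Glyph, std::list<char32_t>::iterator>> map_;
};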
Here is some code showing how to capture keyboard input with SDL.
First of all you need to query key input from SDL by calling EventPoll. You can do that whenever you are ready to process input, or at a regular fixed interval, and store keys and keyboard status in internal tables.
void EventPoll (ulong mask)
{
    SDL_Event event;

    while (SDL_PollEvent (&event)) {
        switch (event.type) {
        case SDL_KEYDOWN:
            KeyHandler (reinterpret_cast<SDL_KeyboardEvent*> (&event));
            break;
        case SDL_KEYUP:
            KeyHandler (reinterpret_cast<SDL_KeyboardEvent*> (&event));
            break;
        // handle other events
        }
    }
}

void KeyHandler (SDL_KeyboardEvent *event)
{
    SDLKey keySym = event->keysym.sym;
    wchar_t unicode = event->keysym.unicode;   // valid once SDL_EnableUNICODE(1) has been called
    int keyState = (event->state == SDL_PRESSED);

    // process key info, e.g. put key into a buffer and
    // store keyboard state
}
Here is a link to a document describing methods to render text with OpenGL: http://www.opengl.org/resources/features/fontsurvey/
What you may want to do is to capture keyboard input and render it on the fly using the proper font(s) you have preloaded.
I've done no game development myself, so I have just a vague idea of how things work there, but here are my 2 cents:
Don't cache all the glyphs at the start of the program. Instead, when you have to display a chat string, render the whole string on the fly to a new texture. Keep this texture in memory until it is unlikely to be needed again (say, after the chat window is closed). Perhaps you can re-render the whole chat window whenever it gets updated; then you would only have one texture to worry about.
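One way to sketch that with SDL_ttf (to match the SDL used elsewhere in this thread); the font path, size and sample string are placeholders, and error handling is trimmed:

#include <SDL/SDL.h>
#include <SDL/SDL_ttf.h>

SDL_Surface *renderChatLine(TTF_Font *font, const char *utf8Text)
{
    SDL_Color white = { 255, 255, 255, 0 };
    // Rasterizes the entire string at once, so only the glyphs actually used
    // in this line are ever generated.
    return TTF_RenderUTF8_Blended(font, utf8Text, white);
}

int main()
{
    TTF_Init();
    TTF_Font *font = TTF_OpenFont("unifont.ttf", 16);   // hypothetical font path
    SDL_Surface *line = renderChatLine(font, "你好, 안녕하세요, こんにちは");
    // ... blit 'line' into the chat window, free it once the window closes
    SDL_FreeSurface(line);
    TTF_CloseFont(font);
    TTF_Quit();
    return 0;
}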
As far as display goes, I've had very good luck with the caching system described in this tutorial, transliterated to C++.
For fonts, GNU Unifont has full BMP glyph coverage, available in convenient TTF form.