printing colored string in cpp - c++

I have a problem colorizing a particular string or character in C++.
I tried this code:
#include <iostream>
#include <stdlib.h>
using namespace std;
#define red "\x1B[31m"
#define reset "\033[0m"
#define blue "\x1B[34m"
#define green "\x1B[32m"
#define yellow "\x1B[33m"
int main()
{
    cout << red << "red text" << reset << endl;
    return 0;
}
I expected the text to be printed in red, and it works in online compilers.
But in Dev-C++ and Code::Blocks the output is:
←[31mred text←[0m
How can I fix it?

Might be easier and more portable to use the Windows API.
SET_FOREGROUND_COLOR .. SET_BACKGROUND_COLOR .. etc.
If you use the Windows header you can easily find workable code to implement it.
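A minimal sketch of that console-attribute approach (assuming the names above refer to SetConsoleTextAttribute and the FOREGROUND_*/BACKGROUND_* flags from <windows.h>; this is Windows-console-only, not portable):
#include <iostream>
#include <windows.h>
int main()
{
    HANDLE console = GetStdHandle(STD_OUTPUT_HANDLE);
    // Remember the current attributes so we can restore them afterwards.
    CONSOLE_SCREEN_BUFFER_INFO info;
    GetConsoleScreenBufferInfo(console, &info);
    // FOREGROUND_RED | FOREGROUND_INTENSITY gives a bright red foreground.
    SetConsoleTextAttribute(console, FOREGROUND_RED | FOREGROUND_INTENSITY);
    std::cout << "red text" << std::endl;
    // Restore the original colors.
    SetConsoleTextAttribute(console, info.wAttributes);
    return 0;
}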

Related

How can I get the encoding from a GNU gettext .mo file in C++

I am using GNU gettext for translations and it works.
The following test code shows the idea of my implementation:
#include <iostream>
#include <stdio.h>
#include <stdlib.h>
#include <libintl.h>
#include <locale.h>
#define _(STRING) gettext(STRING)
int main()
{
    /* Setting the i18n environment */
    setlocale(LC_ALL, "");
    bindtextdomain("hello", getenv("PWD"));
    textdomain("hello");
    /* Example of i18n usage */
    std::cout << _("Hello World!") << std::endl;
    setenv("LANGUAGE", "fr", 1);
    std::cout << _("Hello World!") << std::endl;
    return EXIT_SUCCESS;
}
I have a working .mo file, so when I run the program I get:
Hello World!
Bonjour le monde!
So far so good.
But I have to forward the translated strings to a 3rd party application, and there I need to indicate the encoding (Latin 1, Latin 9, Cyrillic, UTF-8, etc).
How can I get the encoding at run time?
Trying to find out the encoding in use for translated strings is a little bit of guesswork. But you can enforce a certain encoding by calling bind_textdomain_codeset(DOMAINNAME, CODESET), see https://man7.org/linux/man-pages/man3/bind_textdomain_codeset.3.html.
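For instance, a minimal sketch based on the test program above (the domain "hello" and UTF-8 are just the example values; any codeset supported by iconv should work):
#include <iostream>
#include <stdlib.h>
#include <libintl.h>
#include <locale.h>
#define _(STRING) gettext(STRING)
int main()
{
    setlocale(LC_ALL, "");
    bindtextdomain("hello", getenv("PWD"));
    /* Force gettext to return translations in UTF-8, regardless of the
       encoding the .mo file was compiled with. */
    bind_textdomain_codeset("hello", "UTF-8");
    textdomain("hello");
    std::cout << _("Hello World!") << std::endl;
    return EXIT_SUCCESS;
}
With this in place you can simply tell the 3rd party application that the strings are UTF-8, instead of trying to detect the encoding at run time.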

C++ Changing Text Color

I'm trying to change the color of the text in C++; the only answer I can find is for C, not C++. I have tried using conio.h but don't understand how to use it. Could anyone help with this?
Text coloring isn't really on the C++ side. In some unix terminals you could simply use codes like \e[0;31m message \e[0m directly in your program (although you might want to create an API for ease of use). However, this wouldn't work in a Windows console. It depends on the OS and terminal being used.
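For example, a small wrapper around those escape codes (a minimal sketch for ANSI-capable terminals only; the color() helper is just an illustrative name):
#include <iostream>
#include <string>
// Wrap text in an ANSI color sequence; this only works on terminals that
// understand escape codes (most unix terminals, not the classic Windows console).
std::string color(const std::string& text, int code)
{
    return "\033[0;" + std::to_string(code) + "m" + text + "\033[0m";
}
int main()
{
    std::cout << color("red message", 31) << '\n'
              << color("green message", 32) << '\n';
}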
If you don't need to stick to the non-cross-platform conio.h library, I recommend using a cross-platform solution: the header-only, modern C++ rang library. I use it in most of my projects; it's really easy to use.
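A minimal sketch of what that looks like, assuming rang's stream-manipulator interface (rang::fg, rang::style) and its single header from https://github.com/agauniyal/rang:
#include <iostream>
#include "rang.hpp"
int main()
{
    // rang picks the right mechanism for the platform and resets on request.
    std::cout << rang::fg::red << "red text" << rang::style::reset << '\n'
              << rang::fg::green << "green text" << rang::style::reset << '\n';
}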
I found out how to change the color of text using windows.h. Here is an example of the code I used (copied from https://cboard.cprogramming.com/).
#include <iostream>
#include <windows.h>
using namespace std;
int main()
{
    HANDLE h = GetStdHandle(STD_OUTPUT_HANDLE); // h is your link to the console
    SetConsoleTextAttribute(h, 1); cout << "Sentence in blue" << '\n';  // 1 happens to be blue
    SetConsoleTextAttribute(h, 2); cout << "Sentence in green" << '\n'; // 2 is green
    SetConsoleTextAttribute(h, 4); cout << "Sentence in red" << '\n';   // 4 is red
    SetConsoleTextAttribute(h, 7); cout << "Sentence in white" << '\n'; // etc.
}

plt::show() on macOS doesn't display anything, only a white blank screen

I used the minimal.cpp snippet (provided below) from the matplotlib-cpp library to display the plot, but all I get is a blank white box with no plot. However, I am able to see the saved plot in the directory.
#define _USE_MATH_DEFINES
#include <iostream>
#include <cmath>
#include "matplotlibcpp.h"
namespace plt = matplotlibcpp;
int main()
{
    plt::plot({1, 3, 2, 4});
    // save figure
    const char* filename = "./basic.png";
    std::cout << "Saving result to " << filename << std::endl;
    plt::save(filename);
    plt::show();
}
PS: I'm working in a macOS environment and the backend is TkAgg.
My end goal is to use this library to plot a stream of data in realtime.

How to use C++ to display Unicode supplementary characters (> U+FFFF) in a console app in Windows 10? (without WinRT if possible)

It seems UCS-2 characters are fine with wchar_t.
However, when a code point is above U+FFFF, it just doesn't work.
#include <iostream>
#include <fcntl.h>
#include <io.h>
int main()
{
    _setmode(_fileno(stdout), _O_U8TEXT); // _O_WTEXT makes no difference
    std::wcout.sync_with_stdio(false);
    std::wcout << L"z\u00df\u6c34\U0001F34C\n" << std::endl; // L"zß水🍌"
}
It prints as below:
zß水�
How can I print SMP characters in a console app on Windows?
Is it possible without C++/WinRT?

Unicode character Visual C++

I'm trying to make my program work with unicode characters.
I'm using Visual Studio 2010 on a Windows 7 x32 machine.
What I want to print is the queen symbol ("\u2655"), and it just doesn't work. I've set my solution to use Unicode.
This is my sample code:
#include <iostream>
#include <windows.h> // needed for SetConsoleOutputCP
using namespace std;
int main()
{
    SetConsoleOutputCP(CP_UTF8);
    wcout << L"\u2655";
    return 0;
}
Also, I've tried many other suggestions, but nothing worked (e.g. changing the cmd font, applying chcp 65001, which is the same as SetConsoleOutputCP(CP_UTF8), etc.).
What is the problem? It's the first time I've encountered such a situation. On Linux, it's different.
Thank you.
I once managed to print chess pieces on the console; there are several complexities involved here.
First of all, you have to enable UTF-16 mode on stdout; this is described here as well as here, and it's exactly as Mehrdad explained.
#include <io.h>
#include <fcntl.h>
...
_setmode(_fileno(stdout), _O_U16TEXT);
Then, even if the output reaches the console correctly, you may get garbage on the console instead of the intended characters; this comes from the fact that, at least on my machine (Windows 7), the default console font doesn't support the chess piece glyphs.
To fix this, you have to choose a different TrueType font which supports them, but to make such a font available you have to go through some hoops; personally, I found out that DejaVu Sans Mono works just fine.
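(An aside, not from the original answer: if you prefer to select the font from code rather than through the console properties dialog, SetCurrentConsoleFontEx can do it on Vista and later, assuming the font is already installed and usable by the console. A rough sketch:)
#include <windows.h>
#include <wchar.h>
// Switch the console to a TrueType font; the face name passed in must
// already be available to the console (e.g. L"DejaVu Sans Mono").
void SetConsoleFont(const wchar_t* faceName)
{
    CONSOLE_FONT_INFOEX cfi = {};
    cfi.cbSize = sizeof cfi;
    cfi.dwFontSize.Y = 16;      // character height in pixels
    cfi.FontWeight = FW_NORMAL;
    wcsncpy(cfi.FaceName, faceName, LF_FACESIZE - 1);
    SetCurrentConsoleFontEx(GetStdHandle(STD_OUTPUT_HANDLE), FALSE, &cfi);
}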
So, at this point, your code should work, and code like the following (an example I wrote in the past to test this issue):
#include <wchar.h>
#include <stdio.h>
#include <locale.h>
#ifdef _WIN32
#include <io.h>
#include <fcntl.h>
#endif
enum ChessPiecesT
{
King,
Queen,
Rock,
Bishop,
Knight,
Pawn,
};
enum PlayerT
{
White=0x2654, /* white king */
Black=0x265a, /* black king */
};
/* Provides the character for the piece */
wchar_t PieceChar(enum PlayerT Player, enum ChessPiecesT Piece)
{
return (wchar_t)(Player + Piece);
}
/* First row of the chessboard (black) */
enum ChessPiecesT TopRow[]={Rock, Knight, Bishop, Queen, King, Bishop, Knight, Rock};
void PrintTopRow(enum PlayerT Player)
{
int i;
for(i=0; i<8; i++)
putwchar(PieceChar(Player, TopRow[Player==Black?i: (7-i)]));
putwchar(L'\n');
}
/* Prints the eight pawns */
void PrintPawns(enum PlayerT Player)
{
wchar_t pawnChar=PieceChar(Player, Pawn);
int i;
for(i=0; i<8; i++)
putwchar(pawnChar);
putwchar(L'\n');
}
int main()
{
#ifdef _WIN32
_setmode(_fileno(stdout), _O_U16TEXT);
#else
setlocale(LC_CTYPE, "");
#endif
PrintTopRow(Black);
PrintPawns(Black);
fputws(L"\n\n\n\n", stdout);
PrintPawns(White);
PrintTopRow(White);
return 0;
}
should work equally well on Windows and Linux.
Now you still have a problem: the glyphs will be too small to be meaningful in any way. This can be solved only by enlarging the console font, but then all the other characters become way too big to be usable. Thus, all in all, probably the best fix is just to write a GUI application.
Try this instead:
_setmode(_fileno(stdout), _O_U16TEXT);