Airplane symbol using Unicode in C++

I'm trying to print the airplane symbol using Unicode in Code::Blocks. I found out that the code point for the airplane is \u2708, so I tried the following code:
#include <iostream>
using namespace std;

int main() {
    wchar_t a = '\u2708';
    cout << a;
    return 0;
}
It outputs 40072. When I replace wchar_t with char, like this:
char a = '\u2708';
I get this symbol: ł
I'm really stuck; thanks for any help.

If you are on Linux and don't use code-page conversion from Unicode to the console, try this:
std::locale::global(std::locale(""));
wchar_t plane = L'\u2708';
std::wcout << plane << std::endl;
On Windows it is a bit more complicated: you need a compatible Unicode font selected for the console and the correct code page.
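For example, one approach that often works on Windows with the Microsoft CRT is to switch stdout into UTF-16 mode with _setmode and print a wide literal; this is only a minimal sketch, and it assumes the console font actually contains a glyph for U+2708 (otherwise you will see a placeholder box):
#include <iostream>
#include <cstdio>  // _fileno
#include <io.h>    // _setmode
#include <fcntl.h> // _O_U16TEXT

int main() {
    _setmode(_fileno(stdout), _O_U16TEXT); // stdout now expects UTF-16
    std::wcout << L'\u2708' << std::endl;  // note the L prefix: a wide-character literal
    return 0;
}
Once stdout is in UTF-16 mode, only the wide stream (wcout) should be used for it.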

Related

Farsi characters (UTF-8) in C++

I'm trying to read and write Farsi characters in C++, and I want to show them in CMD.
The first thing I fixed was the font: I added the Farsi characters to it, and now I can write to the screen, for example ب (U+0628), with this code:
#include <iostream>
#include <io.h>
#include <fcntl.h>
#include <cstdlib> // system
using namespace std;

int main() {
    _setmode(_fileno(stdout), _O_U16TEXT);
    wcout << L"\u0628 \n";
    wcout << L"ب" << endl;
    system("pause");
    return 0;
}
But how can I store this character? For Latin characters we can use char or string, but what about Farsi characters in UTF-8?
And how can I read them? For Latin characters we use cin >> or gets_s.
Should I use wchar_t? If yes, how?
Because with this code it shows the wrong character:
wchar_t a='\u0628';
wcout <<a;
Also, I can't show the character بـ (U+FE91) even though it exists in my installed font, while ب (U+0628) shows correctly.
Thanks in advance.
The solution is the following line:
wchar_t a = L'\u0628';
The L prefix tells the compiler that the literal is a wide-character literal (of type wchar_t rather than char), which makes sure the value doesn't get truncated to 8 bits, so this works as intended.
UPDATE
If you are building/running this as a console application on Windows, you need to manage your code pages accordingly. The following code worked for me with Cyrillic input (Windows code page 1251) once I set the proper code page before the wcin and wcout calls, basically at the very top of my main():
SetConsoleOutputCP(1251);
SetConsoleCP(1251);
For Farsi I'd expect you should use code page 1256.
Full test code for your reference:
#include <iostream>
#include <Windows.h>
using namespace std;
int main()
{
    SetConsoleOutputCP(1256); // to manage console output
    SetConsoleCP(1256);       // to properly process console input
    wchar_t b;
    wcin >> b;
    wcout << b << endl;
    return 0;
}
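To answer the storage question as well: the wide counterparts of char and string are wchar_t and std::wstring, and once stdin and stdout are switched to UTF-16 mode you can read and write whole Farsi words with wcin/wcout. This is just a sketch under that assumption (Microsoft-CRT _setmode, console font with Arabic glyphs):
#include <iostream>
#include <string>
#include <io.h>
#include <fcntl.h>
using namespace std;

int main() {
    _setmode(_fileno(stdout), _O_U16TEXT); // UTF-16 output
    _setmode(_fileno(stdin), _O_U16TEXT);  // UTF-16 input
    wstring word;                          // wide string to hold Farsi text
    wcout << L"\u0628 ";                   // print ب as a prompt
    wcin >> word;                          // read a whole word, not just one wchar_t
    wcout << word << endl;
    return 0;
}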

GetComputerName() not displaying Unicode correctly in Windows console

I'm relatively new to WinAPI programming in C++. I'm trying to write a program that will obtain the system hostname using GetComputerName(). Ideally, I want the code to be able to work on English and non-English systems. Below is the code that I'm using:
#include <windows.h>
#include <iostream>
#include <string>
using namespace std;

int main()
{
    wstring hostname;
    wchar_t nbtName[MAX_COMPUTERNAME_LENGTH + 1];
    DWORD length = MAX_COMPUTERNAME_LENGTH + 1;
    GetComputerName(nbtName, &length);
    hostname = nbtName;
    wcout << hostname << endl;
    return 0;
}
The code works fine on my English Windows 7 system, but the code doesn't seem to display properly on my German Windows 7 system (which uses German characters for the hostname). I thought that wstring and wchar_t could handle these special characters. Here's what's displayed on my German Windows 7 system.
COMPUTER-Í─▄▀
Am I overlooking something stupid? Thanks!
Use _setmode(_fileno(stdout), _O_U16TEXT) to show Unicode in the console window:
#include <iostream>
#include <string>
#include <io.h>    // for _setmode
#include <fcntl.h> // for _O_U16TEXT

int main()
{
    _setmode(_fileno(stdout), _O_U16TEXT);
    std::wcout << L"ελληνικά\n";
    return 0;
}
Or use MessageBoxW(0, hostname.c_str(), 0, 0) or OutputDebugStringW to see the Unicode text displayed correctly.
This function has two versions: GetComputerNameW for Unicode and GetComputerNameA for ANSI. If UNICODE is defined in your environment, the first one will be called, so either make sure UNICODE is defined or call GetComputerNameW directly.
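For example, a minimal sketch (error handling omitted, and the "Hostname" caption is just for illustration) that calls the wide version explicitly and shows the result in a message box, so neither the console font nor the code page is involved:
#include <windows.h>

int main()
{
    wchar_t name[MAX_COMPUTERNAME_LENGTH + 1];
    DWORD length = MAX_COMPUTERNAME_LENGTH + 1;
    if (GetComputerNameW(name, &length))                // explicit wide version, no UNICODE macro needed
        MessageBoxW(nullptr, name, L"Hostname", MB_OK); // message boxes render Unicode correctly
    return 0;
}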
The Windows console output was the culprit. The Unicode characters display correctly in other, non-console output. Thanks everyone!

Win32 C++: How to print a smiley face ☻ to the console output stream? [duplicate]

This question already has answers here:
Output unicode strings in Windows console app
(16 answers)
Closed 7 years ago.
I tried setlocale, wchar, usual char... how to define:
char c = '☻'
cout << c << endl;
so that anything prints to the console at all (this is relevant to VS2012 and http://ideone.com/LJtDUz)?
The problem is that Windows by default does not handle Unicode characters, and the Windows console is not able to display them.
A solution I found at This CPP Thread is as follows:
#include <fcntl.h>
#include <io.h>
#include <cstdio>   /* getchar */
#include <iostream>

int main(int argc, char* argv[])
{
    const wchar_t* c = L"☻"; /* wide literal, needed because the project is using 'Unicode Character Set' */
    _setmode(_fileno(stdout), _O_WTEXT); /* set stdout to UTF-16 */
    std::wcout << c << std::endl; /* use a wide cout call to output the character */
    getchar();
    return 0;
}
When you try to use the ☻ symbol directly in your code, Visual Studio will tell you that it needs to convert your file to Unicode; make sure to accept that, and use 'Unicode Character Set' as your project's character set.
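An alternative that sidesteps the source-file encoding entirely is to write the symbol as a universal character name; ☻ is U+263B, so the literal L"\u263B" works no matter how the .cpp file is saved. A minimal sketch along the same lines as the code above:
#include <fcntl.h>
#include <io.h>
#include <cstdio>
#include <iostream>

int main()
{
    _setmode(_fileno(stdout), _O_WTEXT); // wide-text mode, as above
    const wchar_t* c = L"\u263B";        // escaped form of ☻, independent of file encoding
    std::wcout << c << std::endl;
    getchar();
    return 0;
}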

Unicode Windows console application (WxDev-C++/minGW 4.6.1)

I'm trying to make a simple multilingual Windows console app, just for educational purposes. I'm using C++ with WxDev-C++/MinGW 4.6.1, and I know this kind of question has been asked a million times. I've searched what feels like the entire internet and seen probably all the forums, but nothing really helps.
Here's the sample working code:
#include <iostream>
using namespace std;

int main(int argc, char *argv[])
{
    /* English version of Hello world */
    wchar_t EN_helloWorld[] = L"Hello world!";
    wcout << EN_helloWorld << endl;
    cout << "\nPress the enter key to continue...";
    cin.get();
    return 0;
}
It works perfectly until I put in a really wide character like "Ahoj světe!". The problem is the "ě", which is U+011B. The compiler gives me this error: "Illegal byte sequence."
The code that does not work:
#include <iostream>
using namespace std;

int main(int argc, char *argv[])
{
    /* Czech version of Hello world */
    wchar_t CS_helloWorld[] = L"Ahoj světe!"; /* error: Illegal byte sequence */
    wcout << CS_helloWorld << endl;
    cout << "\nPress the enter key to continue...";
    cin.get();
    return 0;
}
I have heard about things like #define UNICODE/_UNICODE, -municode, or downloading wrappers for older MinGW. I tried them, but they don't work, or maybe I don't know how to use them properly. Anyway, I need some help. In Visual Studio it's a simple task.
Big thanks for any response.
Apparently, using the standard output streams for UTF-16 does not work in MinGW.
I found that I could either use the Windows API or use UTF-8. See this other answer for code samples.
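For the UTF-8 route, here is a minimal sketch that I would expect to work with MinGW: keep the source file and its literals in UTF-8 and tell the console to interpret output as UTF-8 with SetConsoleOutputCP(CP_UTF8). Whether every glyph actually shows up still depends on the console font:
#include <iostream>
#include <windows.h>
using namespace std;

int main()
{
    SetConsoleOutputCP(CP_UTF8);                // console now interprets output bytes as UTF-8
    const char CS_helloWorld[] = "Ahoj světe!"; // plain char array holding UTF-8 bytes
    cout << CS_helloWorld << endl;
    cout << "\nPress the enter key to continue...";
    cin.get();
    return 0;
}
This avoids wide literals altogether, so the "Illegal byte sequence" problem with L"..." never comes up.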
Here is an answer; I'm not sure whether it will work for MinGW.
Also, there are some details specific to MinGW here.

How to print subscripts/superscripts on the screen in C++?

Is it possible to print subscripts/superscripts?
For example, like this: x².
What functions allow me to do that?
This depends entirely on the environment you're running in. For a GUI system (Windows, Mac, Qt, etc.) you would need to consult the API documentation. For a text-mode system, the best you can do is use specific characters in your current encoding. For instance, Unicode has certain code points that are superscript or subscript forms of other characters.
If you're using a GUI, you can change the size and orientation of the font.
There are also superscript and subscript characters available in Unicode that could be used.
You can print the appropriate Unicode symbol to cout or wcout, depending on the locale:
#include <iostream>

int main()
{
    std::cout << "x\u00b2" << std::endl;
}
or
#include <iostream>
#include <locale>

int main()
{
    std::locale::global(std::locale("de_DE.UTF8"));
    std::wcout << L"x\u00b2" << std::endl;
}