How to output low ASCII using C++ in Windows 10? - c++

I'm trying to output directional arrows for a simple snake game in C++ on Windows 10, using this table as a reference:
ASCII reference
All I get is this tiny question mark in the console:
Tiny question mark
I would like to output symbols 16, 17, 30 and 31. I'm not much of a programmer, so it could be some basic mistake, but some symbols do work while others result in that question mark above.
A small example:
void showSnake() {
    char snakeHead;
    snakeHead = 31;
    cout << snakeHead; // THIS SHOWS THE TINY QUESTION MARK
    snakeHead = 62;
    cout << snakeHead; // THIS SHOWS THE ">" SYMBOL
}

You should use Unicode; you'll have many more characters to choose from.
On https://en.wikipedia.org/wiki/List_of_Unicode_characters I found the symbol '►', which looks similar to what you wanted to use.
Its Unicode code point is U+25BA, which means you can write it with the escape '\u25BA' in C++.
In practice, however, that value falls outside the range of the char type, so you have to use wide characters (wchar_t) to get the job done.
As per this answer, you should also enable Unicode support on stdout using the _setmode function (see here) from the C run-time library.
#include <iostream>
#include <io.h>
#include <fcntl.h>

int main() {
    _setmode(_fileno(stdout), _O_U16TEXT);
    std::wcout << L'\u25BA';
}

Related

Unicode output not showing

I'm trying to learn Unicode programming in Windows.
I have this simple program:
#include <iostream>
#include <string>

int main()
{
    std::wstring greekWord = L"Ελληνικά";
    std::wcout << greekWord << std::endl;
    return 0;
}
However, it outputs nothing. Any ideas how to make it output Greek?
I tried adding non-Greek letters, and that didn't work quite right either.
The first thing to try is to make the program independent of the encoding of the source file, so use Unicode escapes rather than literal Unicode letters:
std::wstring greekWord = L"\u0395\u03BB\u03BB\u03B7\u03BD\u03B9\u03BA\u03AC";
An incorrect encoding in the source file is only one of many things that could be preventing you from printing Greek. The other obvious issue is the ability of your terminal to print Greek letters; if it can't do that, or needs to be set up correctly first, then nothing you do in your program is going to work.
You probably also want to fix the source-encoding issue so that you can use unescaped literals in your code, but that depends on the compiler/IDE you are using.
If you are writing cout output to a normal console, the console usually doesn't support Unicode text such as Greek by default. Try setting it up for Unicode text, or find another way to output your data, such as text files or a GUI.
There are two ways to do this.
The old, non-standard Microsoft way is as follows:
#include <fcntl.h>
#include <io.h>

int main()
{
    _setmode(_fileno(stdout), _O_U16TEXT);
    _setmode(_fileno(stdin), _O_WTEXT);
    // your code here
}
You will find this everywhere, but it is not necessarily a good way to solve the problem.
The more standards-compliant way is as follows:
#include <locale>

int main()
{
    std::locale l(""); // or std::locale l("en_US.utf-8");
    std::locale::global(l); // or std::wcout.imbue(l); std::wcin.imbue(l);
    // your code here
}
This should work with other modern compilers and operating systems too.
Try this; it works for me:
#include <iostream>
#include <clocale>
#include <io.h>
#include <fcntl.h>
using namespace std;

int main() {
    setlocale(LC_ALL, ""); // called before any output so it can take effect
    _setmode(_fileno(stdout), _O_U16TEXT);
    wcout << L"Ελληνικά";
    return 0;
}

Unicode to integer conversion visual studio bug

I'm trying to convert a Unicode character to an integer and encountered what looks like a bug in Visual Studio; I'm not sure if it's a bug or something I'm doing wrong.
The project uses the Unicode character set, not multi-byte.
#include <windows.h>
#include <iostream>

int main()
{
    constexpr int a = L'🦀';
    printf("%i\n", a);
    std::cout << a << std::endl;
    return 0;
}
Problem:
Hovering over the variable 'a' shows that it is 129408 (0x1F980), which is correct, but when it prints to the console I get 55358.
I created a new project, wrote the same code, and it printed the correct value; but after switching the same project from Unicode to multi-byte and back to Unicode, it produces this issue. I'm not sure how to fix it.
Wide characters in Visual Studio are only 16 bits, so they can't hold a value greater than 65535. You're getting the first half of the character's UTF-16 encoding, which is 0xD83E 0xDD80.

Error 'LC_TYPE' was not declared in this scope

I'm writing a program in C++ that is supposed to change the letters in a text to uppercase (the program works, but setlocale is not working). However, it gives me this error: [Error] 'LC_TYPE' was not declared in this scope. It "should" work, because it comes from my official faculty literature.
#include <iostream>
#include <string>
using namespace std;

int main() {
    cout << "Write something: " << endl;
    string tekst; // tekst = text
    getline(cin, tekst);
    setlocale(LC_TYPE, "croatian"); // here is the problem...
    for (char znak : tekst) { // znak = character, symbol...
        char velikoSlovo = toupper(znak); // velikoSlovo = uppercase letter
        cout << velikoSlovo;
    }
    cout << endl;
    return 0;
}
Does anyone know how to fix this?
I'm using Orwell Dev-C++ 5.9.2. The language standard (-std) is ISO C++11.
Here is a picture.
Don't you need to add #include <clocale>, as mentioned here?
Edit:
Actually, #include <locale.h> should be preferred over <clocale> to reduce portability issues. Thanks to @Cheers for mentioning it in the comments.
As the documentation says, you should use LC_CTYPE:
http://www.cplusplus.com/reference/clocale/setlocale/
Also, you need to provide a valid locale code, as described here.
So for Croatian, your line should look like:
setlocale(LC_CTYPE, "hr_HR.UTF-8"); // or just "hr_HR"
or you can just use:
setlocale(LC_ALL, "");
which should set the localization to default locale used by your computer.
And as suggested before, you may also need to add #include <clocale>
Add
#include <locale.h>
to get a declaration of setlocale and its associated constants.
In general just read the relevant documentation of whatever function or associated things the compiler doesn't seem to recognize.
In general the chosen approach may work with single-byte oriented encodings, but not with multibyte encodings such as UTF-8. Single-byte encodings are commonly used in Windows. In Unix-land it's UTF-8 that rules.
And for Windows, you generally want the system's default locale, so instead of
setlocale(LC_TYPE, "croatian");
you may well (though not necessarily) be better served by
setlocale(LC_ALL, "");
where the empty string selects the system locale rather than the default pure ASCII "C" locale.
Also, note that toupper from the C library requires a non-negative argument, or else the special value EOF. You can just cast the argument to unsigned char, when you know that it's not EOF.

Getline Function Messing with Code

I'm picking up C++ and having a go at it on OS X 10.9, using Xcode 5.0.2 and Alex Allain's book as a reference.
The following code compiles just fine and outputs correctly
#include <iostream>
#include <string>
using namespace std;

int main()
{
    std::string user_first_name = "test";
    std::cout << user_first_name << "\n";
    return 0;
}
When I add a getline call, the code appears to compile but produces no output.
#include <iostream>
#include <string>
using namespace std;

int main()
{
    std::string user_first_name = "test";
    std::getline(std::cin, user_first_name, '\n');
    std::cout << user_first_name << "\n";
    return 0;
}
In fact, the debug navigator shows memory filling up with bars (although actual memory use is fixed at 276 KB). Why am I getting stumped by such a simple concept?
I did a bit of digging around, and it's quite likely this is related to a text-encoding issue. I'm using the defaults, which is Unicode (UTF-8). Encoding is not something I'm familiar with; it was never something I had to deal with when learning on Windows. How do I get past this?
I can't comment regarding the use of Xcode or OS X, but my understanding is that std::cin always gives you a narrow (single-byte) character stream. On Windows (at least with Visual Studio), I think it works whether you compile for UTF-8 (single-byte for all ASCII characters) or UTF-16 (two bytes for all ASCII characters); the runtime library presumably does the conversion for you as necessary.
I'm not sure what "filling up with bars" means, but maybe you're just looking at uninitialized memory. If you think it is an encoding issue, perhaps try using wstring/wcin instead of string/cin and see if that helps.

About Use of #include Preprocessor Directive in C++

I stumbled on the following compilation error in C++ with the g++ compiler:
an error on line 1 with the message:
invalid preprocessing directive #a
(with a caret above the character 'a'), followed by another, probably consequent, error on line 4 with the message:
'cout' was not declared in this scope.
The editor I am using is Code::Blocks 10.05 with MinGW. I tried removing the .h extension from the iostream include statement, switching among different file-encoding options, and replacing the angle brackets with single and double quotes, but I am stuck. Pardon me if this is a duplicate (although I went through several related questions already).
The following code illustrates the problem:
#‪include ‬<iostream.h>
int main()
{
    cout << "abc"+8;
    cout << "def"+4;
    cout << "ha";
    return 0;
}
cout exists within the namespace std
So either
#include <iostream>
//...
std::cout << "abc" << 8;
//...
or
#include <iostream>
using namespace std;
//...
or
#include <iostream>
using std::cout;
//...
I tend to prefer the first if I'm only using it once or twice, the second if I'm using a lot of different pieces from a namespace (and only in a .cpp file), and the third if I'm only using a piece or two from a namespace but using that same couple many times.
Additionally, as stated in the comments, don't use the second one in headers. See: "using namespace" in c++ headers
Also, you have an invalid character in your #include. You can see it in a hex editor; note how Stack Overflow doesn't highlight these two lines the same:
#‪include‬<iostream>
#include<iostream>
Fully working code:
#include <iostream>
using std::cout;

int main()
{
    cout << "abc" << 8;
    cout << "def" << 4;
    cout << "ha";
    return 0;
}
This produces the output abc8def4ha, after I corrected the attempt to add 8 to a char*.
You have to use std::cout, because cout is a name from the standard library and lives in the std namespace.
The "invalid directive" error is caused by invisible Unicode characters in the #include directive; perhaps you copied the code from a website that embedded formatting characters in it. They can be seen in the question if you look at the source in a hex editor. That error should be fixed by deleting and retyping the #include line.
You'll probably have other errors, since the code is fifteen years out of date; most modern compilers don't provide pre-standard libraries. These days, the standard library headers don't have a .h extension:
#include <iostream>
and nearly all the names they declare are scoped inside the std namespace:
std::cout << "ha";
Finally, "abc"+8 doesn't do anything sensible. The string literal is an array of four characters, and +8 tries to give you a pointer to the ninth character, which doesn't exist. The result is undefined behaviour.
If you want to print "abc" followed by "8", then you want:
std::cout << "abc" << 8;
Try using it like this:
#include <iostream>
using std::cout;
cout is part of the std library.
If you've got a caret above the 'a', try retyping your #include line.
You might have accidentally typed an alternative character that looks similar but has a different code.
Suggestions about std:: are only relevant for the second error you're getting.
I also didn't fully understand what you were trying to achieve with "abc"+8.