I am currently trying to create a game using the Windows console. It is supposed to be a raycaster (in the style of Wolfenstein 3D). I need to shade objects that are further away with different Unicode block characters, the so-called "shade" characters ("LIGHT SHADE", "MEDIUM SHADE", "DARK SHADE").
The problem is that these characters seem to bug out the Windows console. The "FULL BLOCK" character looks the way it is supposed to:
However, when I use the "DARK SHADE" character, the console seems to bug out and displays this:
I then did some other testing. When pasting the character into CMD, it is rendered slimmer and taller:
However, when opening the console properties and closing them again, the console squishes the blocks together and adds masses of space characters after them (I added a dot to the end of the line so you can see where the spaces end):
I used Consolas at 16 pt for all of them.
Can you tell me how to display these characters properly?
Thanks in advance.
[This is not an answer. It's a comment that requires code.]
Maybe try this:
#define UNICODE
#define NOMINMAX
#define STRICT
#define WIN32_LEAN_AND_MEAN
#include <windows.h>
#include <iostream>
#include <string>
using namespace std;

namespace winapi {
    HANDLE const std_output = GetStdHandle( STD_OUTPUT_HANDLE );

    // Writes UTF-16 text directly to the console, bypassing the narrow C++ streams.
    void write( wchar_t const* const s, int const n )
    {
        DWORD n_written;
        WriteConsole( std_output, s, n, &n_written, 0 );
    }

    void write( wstring const& s )
    {
        write( s.c_str(), static_cast<int>( s.length() ) );
    }
}  // namespace winapi

auto main()
    -> int
{
    wchar_t const light_shade  = L'░';
    wchar_t const medium_shade = L'▒';
    wchar_t const dark_shade   = L'▓';

    winapi::write( wstring( 64, dark_shade ) + L'\n' );
}
I want to get the value from a registry entry. Here is my code:
#include <atlbase.h>
#include <atlstr.h>
#include <iostream>

#define BUFFER 8192

int main()
{
    char value[255];
    DWORD BufferSize = BUFFER;
    RegGetValue(HKEY_LOCAL_MACHINE, L"SYSTEM\\CurrentControlSet\\Control\\ComputerName\\ActiveComputerName", L"ComputerName", RRF_RT_REG_SZ, NULL, (PVOID)&value, &BufferSize);
    std::cout << value << std::endl;
}
My computer name is: DESKTOP-IGW3F.
But when I run my program, the output is: D
I have no idea how to fix it... I hope you can help me.
The Win32 function RegGetValue() does not exist. It is only a preprocessor macro that will resolve to either RegGetValueA() or RegGetValueW() depending on your project settings. In your case, the macro resolves to RegGetValueW(), therefore it treats the registry value as a Unicode string (2 bytes per character). But you are using a char (1 byte per character) buffer to receive the Unicode data.
To make your code work, you need to either explicitly call RegGetValueA() instead, or change your buffer type from char to wchar_t. Either way, you should also check the return value of the function.
A working example could look like this:
#include <windows.h>
#include <iostream>

int main()
{
    WCHAR value[255];
    DWORD bufferSize = sizeof(value);  // size in bytes, which is what RegGetValue expects

    if (RegGetValueW(HKEY_LOCAL_MACHINE,
                     L"SYSTEM\\CurrentControlSet\\Control\\ComputerName\\ActiveComputerName",
                     L"ComputerName",
                     RRF_RT_REG_SZ, NULL, value, &bufferSize) == ERROR_SUCCESS)
    {
        std::wcout << value << std::endl;
    }
}
I'm asking for a code snippet that reads a Unicode string with cin, concatenates another Unicode string to it, and then prints the result with cout.
P.S. This code will help me solve another, bigger problem with Unicode. But first, the key thing is to accomplish what I ask here.
ADDED: By the way, I can't type any Unicode symbol on the command line when I run the executable. How should I do that?
I had a similar problem in the past; in my case, imbue and sync_with_stdio did the trick. Try this:
#include <iostream>
#include <locale>
#include <string>
using namespace std;

int main() {
    ios_base::sync_with_stdio(false);
    wcin.imbue(locale("en_US.UTF-8"));
    wcout.imbue(locale("en_US.UTF-8"));

    wstring s;
    wstring t(L" la Polynésie française");
    wcin >> s;
    wcout << s << t << endl;
    return 0;
}
Depending on what type of Unicode you mean. I assume you are just working with std::wstring, though. In that case, use std::wcin and std::wcout.
For conversion between encodings you can use your OS's functions, e.g. on Win32 WideCharToMultiByte and MultiByteToWideChar, or you can use a library like libiconv.
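For illustration, a minimal sketch of converting a std::wstring to a UTF-8 std::string with WideCharToMultiByte (the helper name wide_to_utf8 is just a placeholder chosen for this example):
#include <windows.h>
#include <string>

// Converts a UTF-16 wstring to a UTF-8 string using the Win32 API.
std::string wide_to_utf8(const std::wstring& w)
{
    if (w.empty()) return std::string();

    // First call: ask for the required buffer size in bytes.
    int bytes = WideCharToMultiByte(CP_UTF8, 0, w.c_str(), (int)w.size(),
                                    NULL, 0, NULL, NULL);
    std::string out(bytes, '\0');

    // Second call: perform the actual conversion.
    WideCharToMultiByte(CP_UTF8, 0, w.c_str(), (int)w.size(),
                        &out[0], bytes, NULL, NULL);
    return out;
}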
Here is an example that shows four different methods, of which only the third (C conio) and the fourth (native Windows API) work, and only if stdin/stdout aren't redirected. Note that you still need a font that contains the character you want to show (Lucida Console supports at least Greek and Cyrillic), and that everything here is completely non-portable: there is just no portable way to input/output Unicode strings on the terminal.
#ifndef UNICODE
#define UNICODE
#endif
#ifndef _UNICODE
#define _UNICODE
#endif
#define STRICT
#define NOMINMAX
#define WIN32_LEAN_AND_MEAN

#include <iostream>
#include <string>
#include <cstdlib>
#include <cstdio>
#include <conio.h>
#include <windows.h>

void testIostream();
void testStdio();
void testConio();
void testWindows();

int wmain() {
    testIostream();
    testStdio();
    testConio();
    testWindows();
    std::system("pause");
}

// Method 1: C++ wide streams (does not work for non-ANSI input).
void testIostream() {
    std::wstring first, second;
    std::getline(std::wcin, first);
    if (!std::wcin.good()) return;
    std::getline(std::wcin, second);
    if (!std::wcin.good()) return;
    std::wcout << first << second << std::endl;
}

// Method 2: C stdio wide functions (does not work either).
void testStdio() {
    wchar_t buffer[0x1000];
    if (!_getws_s(buffer)) return;
    const std::wstring first = buffer;
    if (!_getws_s(buffer)) return;
    const std::wstring second = buffer;
    const std::wstring result = first + second;
    _putws(result.c_str());
}

// Method 3: conio console I/O (works as long as stdin/stdout aren't redirected).
void testConio() {
    wchar_t buffer[0x1000];
    std::size_t numRead = 0;
    if (_cgetws_s(buffer, &numRead)) return;
    const std::wstring first(buffer, numRead);
    if (_cgetws_s(buffer, &numRead)) return;
    const std::wstring second(buffer, numRead);
    const std::wstring result = first + second + L'\n';
    _cputws(result.c_str());
}

// Method 4: native Windows console API (works as long as stdin/stdout aren't redirected).
void testWindows() {
    const HANDLE stdIn = GetStdHandle(STD_INPUT_HANDLE);
    WCHAR buffer[0x1000];
    DWORD numRead = 0;
    // ReadConsoleW takes the buffer size in characters, not in bytes.
    if (!ReadConsoleW(stdIn, buffer, sizeof buffer / sizeof buffer[0], &numRead, NULL)) return;
    const std::wstring first(buffer, numRead - 2);  // strip the trailing CR LF
    if (!ReadConsoleW(stdIn, buffer, sizeof buffer / sizeof buffer[0], &numRead, NULL)) return;
    const std::wstring second(buffer, numRead);
    const std::wstring result = first + second;
    const HANDLE stdOut = GetStdHandle(STD_OUTPUT_HANDLE);
    DWORD numWritten = 0;
    WriteConsoleW(stdOut, result.c_str(), static_cast<DWORD>(result.size()), &numWritten, NULL);
}
Edit 1: I've added a method based on conio.
Edit 2: I've messed around with _O_U16TEXT a bit as described in Michael Kaplan's blog, but that seemingly only had wgets interpret the (8-bit) data from ReadFile as UTF-16. I'll investigate this a bit further during the weekend.
If you have actual text (i.e., a string of logical characters), then insert it into the wide streams instead. The wide streams will automatically encode your characters to match the bytes expected by the locale encoding. (And if you have encoded bytes instead, the streams will decode the bytes, then re-encode them to match the locale.)
There is a lesser solution: if you KNOW you have UTF-encoded bytes (i.e., an array of bytes intended to be decoded into a string of logical characters) AND you KNOW the target of the output stream expects that very same byte format, then you can skip the decoding and re-encoding steps and write() the bytes as-is. This only works when you know both sides use the same encoding format, which may be the case for small utilities not intended to communicate with processes in other locales.
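A minimal sketch of the two approaches described above (the function names are placeholders; the second variant assumes the receiver expects UTF-8):
#include <iostream>
#include <string>

// Approach 1: real text goes through the wide stream, which encodes the
// characters according to the stream's imbued locale.
void write_as_text(const std::wstring& text)
{
    std::wcout << text << L'\n';
}

// Approach 2: already-encoded bytes (here assumed to be UTF-8) are written
// as-is, skipping any decoding/re-encoding. Only valid when the receiver
// expects the very same encoding.
void write_encoded_bytes(const std::string& utf8_bytes)
{
    std::cout.write(utf8_bytes.data(),
                    static_cast<std::streamsize>(utf8_bytes.size()));
}

int main()
{
    // Pick one approach per program; mixing wide and narrow output on the
    // same stream is subject to the stream-orientation rules.
    write_as_text(L"Polyn\u00E9sie");
    // write_encoded_bytes("Polyn\xC3\xA9sie\n");
}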
It depends on the OS. If your OS understands UTF-8, you can simply send it UTF-8 sequences.
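For example, a minimal sketch assuming the terminal already interprets output as UTF-8:
#include <cstdio>

int main()
{
    // "α" encoded as the UTF-8 byte sequence 0xCE 0xB1; the terminal is
    // assumed to interpret the raw bytes as UTF-8.
    std::fputs("\xCE\xB1\n", stdout);
}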
I'm learning Win32 programming with C/C++. As part of that, my teacher wanted me to write a simple program that shows the name of the computer it runs on and then, if the name of the target computer is "USER", prints a warning to the output console. I wrote the following code, but it doesn't work.
My function code:
TCHAR * getComputerName() {
    bufCharCount = INFO_BUFFER_SIZE;
    if (!GetComputerName(infoBuf, &bufCharCount))
        printError(TEXT("GetComputerName"));
    return (TCHAR*)infoBuf;
}
Calling code:
if (getComputerName() == (TCHAR*)"USER") {
printf("Target OS Detected \n");
}
How can I fix this issue?
There are several issues with the code as posted. The most blatant one is the use of TCHARs. TCHAR was invented back when Win9x lacked Unicode support, in an attempt to keep source code compatible between Win9x and Windows NT (the latter uses Unicode, UTF-16LE, throughout). Today, there is no reason to use TCHARs at all. Simply use wchar_t and the Windows API calls with a W suffix.
The C-style casts (e.g. return (TCHAR*)infoBuf) are another error waiting to happen. If the code doesn't compile without a cast, it means you are using incompatible types (char vs. wchar_t).
Plus, there's a logical error: when using C-style strings (represented as pointers to a sequence of zero-terminated characters), you cannot use operator== to compare the string contents. It will compare the pointers instead. The solution is to either invoke an explicit string comparison (strcmp), or use a C++ string instead. The latter overloads operator== to perform a case-sensitive string comparison.
A fixed version might look like this:
#include <windows.h>
#include <string>
#include <iostream>
#include <stdexcept>

std::wstring getComputerName() {
    wchar_t buffer[MAX_COMPUTERNAME_LENGTH + 1] = {0};
    DWORD cchBufferSize = sizeof(buffer) / sizeof(buffer[0]);  // size in characters
    if (!GetComputerNameW(buffer, &cchBufferSize))
        throw std::runtime_error("GetComputerName() failed.");
    return std::wstring(&buffer[0]);
}

int main() {
    const std::wstring compName = getComputerName();
    if ( compName == L"USER" ) {   // std::wstring compares contents, not pointers
        std::wcout << L"Print your message" << std::endl;
    }
}
The following code works for me:
#include <windows.h>
#include <string>
#include <stdexcept>

std::string get_computer_name()
{
    const int buffer_size = MAX_COMPUTERNAME_LENGTH + 1;
    char buffer[buffer_size];
    DWORD lpnSize = buffer_size;
    if (GetComputerNameA(buffer, &lpnSize) == FALSE)
        throw std::runtime_error("Something went wrong.");
    return std::string{ buffer };
}
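A short usage sketch that ties this back to the question's check, reusing the get_computer_name() function above (the message text is just illustrative):
#include <iostream>

int main()
{
    // std::string's operator== compares the string contents, not pointers.
    if (get_computer_name() == "USER")
        std::cout << "Target OS Detected\n";
}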
You can't compare the contents of two strings by comparing their pointers.
DWORD dw_computer_name = MAX_COMPUTERNAME_LENGTH + 1;
TCHAR computer_name[MAX_COMPUTERNAME_LENGTH + 1];

if ( 0 == GetComputerName( computer_name, &dw_computer_name ) )
{
    // GetComputerName returns zero on failure.
    printError( TEXT( "GetComputerName" ) );
}
else if ( 0 == _tcscmp( computer_name, _T("HOST") ) )
{
    printf( "Target OS Detected \n" );
}
I have a local-language font installed on my system (Windows 8). Through the Character Map tool in Windows, I found the Unicode code points of the characters in that particular font.
All I want is to print those characters on the command line through a C program.
For example: assume the Greek letter alpha is represented by the Unicode code "u+0074".
Taking "u+0074" as input, I would like my C program to output the alpha character.
Can anyone help me?
There are several issues. If you're running in a console window, I'd convert the text to UTF-8 and set the code page for the window to 65001. Alternatively, you can use wchar_t (which is UTF-16 on Windows), output via std::wostream, and set the code page to 1200. (According to the documentation I've found, at least. I've no experience with this, because my code has had to be portable, and on the other platforms I've worked on, wchar_t has been either some private 32-bit encoding or UTF-32.)
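A minimal sketch of the first suggestion, assuming a Windows console whose selected font actually contains the glyph:
#include <windows.h>
#include <cstdio>

int main()
{
    // 65001 is the UTF-8 code page; after this call the console interprets
    // the bytes written to stdout as UTF-8.
    SetConsoleOutputCP(65001);

    // GREEK SMALL LETTER ALPHA (U+03B1) encoded in UTF-8 is 0xCE 0xB1.
    std::printf("\xCE\xB1\n");
    return 0;
}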
First you should set a TrueType font (e.g. Consolas) in the console's Properties. Then this code should suffice in your case:
#include <stdio.h>
#include <tchar.h>
#include <iostream>
#include <string>
#include <Windows.h>
#include <fstream>
// for _setmode()
#include <io.h>
#include <fcntl.h>
using namespace std;

int _tmain(int argc, _TCHAR* argv[])
{
    TCHAR tch[2];
    tch[0] = 0x03B1;   // GREEK SMALL LETTER ALPHA
    tch[1] = 0;        // terminate the string so _tcslen/%s work correctly

    // Test 1 - WriteConsole
    HANDLE hStdOut = GetStdHandle(STD_OUTPUT_HANDLE);
    if (hStdOut == INVALID_HANDLE_VALUE) return 1;
    DWORD dwBytesWritten;
    WriteConsole(hStdOut, tch, (DWORD)_tcslen(tch), &dwBytesWritten, NULL);
    WriteConsole(hStdOut, L"\n", 1, &dwBytesWritten, NULL);

    // Switch stdout to UTF-16 mode for the CRT/stream tests below.
    _setmode(_fileno(stdout), _O_U16TEXT);

    // Test 2 - wprintf
    _tprintf_s(_T("%s\n"), tch);

    // Test 3 - wcout
    wcout << tch << endl;
    wprintf(L"\x03B1\n");
    if (wcout.bad())
    {
        _tprintf_s(_T("\nError in wcout\n"));
        return 1;
    }
    return 0;
}
MSDN:
_setmode is typically used to modify the default translation mode of stdin and stdout, but you can use it on any file. If you apply _setmode to the file descriptor for a stream, call _setmode before performing any input or output operations on the stream.
Use the Unicode version of the WriteConsole function.
Also, be sure to store the source code as UTF-8 with a BOM, which is supported by both g++ and Visual C++.
Example, assuming that you want to present a Greek alpha given its Unicode code in the form "u+03B1" (the code you listed stands for a lowercase "t"):
#include <stdlib.h>         // exit, EXIT_FAILURE, wcstol
#include <string>           // std::wstring
using namespace std;

#undef UNICODE
#define UNICODE
#include <windows.h>

bool error( char const s[] )
{
    ::FatalAppExitA( 0, s );
    exit( EXIT_FAILURE );
}

namespace stream_handle {
    HANDLE const output = ::GetStdHandle( STD_OUTPUT_HANDLE );
}  // namespace stream_handle

void write( wchar_t const* const s, int const n )
{
    DWORD n_chars_written;
    ::WriteConsole(
        stream_handle::output,
        s,
        n,
        &n_chars_written,
        nullptr             // overlapped i/o structure
        )
        || error( "WriteConsole failed" );
}

int main()
{
    wchar_t const input[] = L"u+03B1";
    wchar_t const ch = wcstol( input + 2, nullptr, 16 );
    wstring const s = wstring() + ch + L"\r\n";
    write( s.c_str(), s.length() );
}
In C there is the primitive type wchar_t, which defines a wide character. There are also corresponding wide-string functions, e.g. strcat -> wcscat. Of course it depends on the environment you are using. If you use Visual Studio, have a look here.
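A minimal sketch of that wide-character approach using the standard C wide-string functions (reading a line, appending a literal, and printing the result; console input of non-ASCII characters is still subject to the limitations discussed above):
#include <wchar.h>
#include <stdio.h>

int main(void)
{
    wchar_t first[128];
    wchar_t result[256] = L"";

    // Read a wide string from stdin, then append another wide string to it.
    if (fgetws(first, 128, stdin) == NULL)
        return 1;
    first[wcscspn(first, L"\n")] = L'\0';   // strip the trailing newline

    wcscpy(result, first);
    wcscat(result, L" la Polynésie française");

    fputws(result, stdout);
    fputws(L"\n", stdout);
    return 0;
}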