I am stuck on searching for a Chinese program (window) name using the FindWindowW(NULL, "program name") function.
When I search for an English name, it works perfectly.
Can someone give me a clue about how to search using a Unicode name?
I couldn't figure it out yet; can someone guide me on how to do it?
#include <windows.h>
#include <stdio.h>
int main(){
    HWND hWnd = FindWindowW(NULL, L"\uAA5A\uAA4C\uB873\uAB4C\uB6C7");
    if(NULL == hWnd){
        printf("NotFound!");
    }else {
        printf("Found!");
    }
}
Use the Unicode (wide) version of FindWindow and use wide strings for the search. I also recommend saving the source in UTF-8 encoding and using the /utf-8 compiler switch for the Microsoft compiler; otherwise, the compiler will assume a localized ANSI encoding when interpreting the wide string. That's fine if your localized encoding is a Chinese variant, but if you're on a US or Western European version of Windows, the Microsoft IDE will probably prompt you to save in UTF-16 if you use Chinese characters in string constants:
Example:
#include <windows.h>
#include <stdio.h>
int main(void)
{
    //HWND h = FindWindowW(NULL, L"马克"); // works if saved in UTF-8 encoding
    //                                     // and compiled with /utf-8.
    HWND h = FindWindowW(NULL, L"\u9a6c\u514b");
    if(h == NULL)
        printf("err = %lu\n", GetLastError());
    else
        printf("handle = %p\n", h);
}
On Windows, I changed the terminal window title to the matching Chinese text with the command title 马克, and this code found the window:
C:\>title 马克
C:\>test
handle = 00000000000B0258
C:\>test
handle = 00000000000B0258
Microsoft's Spy++ tool confirms the handle.
I'm trying to get the infinity symbol (∞) to print, but I keep getting garbage. I've tried everything mentioned here, but nothing is working.
What I'm trying to accomplish is this
modifies strength by 9 ∞
I've tried
printf ("%c", 236);
printf ("%c", 236u);
and I get
modifies strength by 9 ì
I've tried
printf("∞");
and I get
modifies strength by 9 ?
I tried this
if ( paf->duration == -1 ){
    setlocale(LC_ALL, "en_US.UTF-8");
    wprintf(L"%lc\n", 8734);
    ch->printf("∞");
}
This was just to see if I could get wprintf to print it, but it completely ignores setlocale and wprintf and still gives me
modifies strength by 9 ?
I tried
if ( paf->duration == -1 ){
    std::cout << "\u221E";
    ch->printf("∞");
}
But I got this warning and error:
Error C2664 'int _CrtDbgReportW(int,const wchar_t *,int,const wchar_t *,const wchar_t *,...)': cannot convert argument 5 from 'int' to 'const wchar_t *' testROS1a C:\Program Files (x86)\Windows Kits\10\Include\10.0.19041.0\ucrt\malloc.h 164
Warning C4566 character represented by universal-character-name '\u221E' cannot be represented in the current code page (1252) testROS1a C:\_Reign of Shadow\TEST\src\CPP\act_info.cpp 3724
which I can't make heads or tails of. I've exhausted the scope of my knowledge, so does anyone know how to make this happen?
This does the trick on my machine (Windows, code page 1252), though I'm not sure how universal it is. I never really got to work much with Unicode/localization stuff, and there always seems to be one more gotcha.
#include <iostream>
#include <io.h>
#include <fcntl.h>
const wchar_t infinity_symbol = 0x221E;

int main()
{
    // put the Windows console stream into UTF-16 mode
    _setmode(_fileno(stdout), _O_U16TEXT);
    std::wcout << infinity_symbol;
}
To use the Windows command prompt with wide strings, change the mode as follows. printf/cout won't work until the mode is switched back, so make sure to flush between mode changes:
#include <iostream>
#include <io.h>
#include <fcntl.h>
using namespace std;
int main()
{
    // To use wprintf/wcout and output any BMP (<= U+FFFF) code point
    int org = _setmode(_fileno(stdout), _O_U16TEXT);
    wcout << L'\u221e' << endl;
    wprintf(L"\u221e\n");
    fflush(stdout);
    _setmode(_fileno(stdout), org); // switch back to cout/printf and the default code page
    cout << "hello, world!" << endl;
    printf("hello, world!\n");
}
Output:
∞
∞
hello, world!
hello, world!
If you use UTF-8 source and your compiler accepts it, you can also change the code page of the terminal to 65001 (UTF-8) and it could work with printf as is:
test.c
#include <stdio.h>
int main() {
    printf("∞\n");
}
Console output:
C:\demo>cl /W4 /utf-8 /nologo test.c
test.c
C:\demo>chcp
Active code page: 437
C:\demo>test ## NOTE: Wrong code page prints mojibake.
Γê₧ ## These are the UTF-8 bytes of ∞ interpreted incorrectly (as code page 437).
C:\demo>chcp 65001
Active code page: 65001
C:\demo>test
∞
The best option on Windows is to use UTF-16 with _setmode as shown in the other answers. You can also use _setmode to switch back and forth between Unicode and ANSI output.
warning C4566: character represented by universal-character-name
'\u221E' cannot be represented in the current code page (1252)
That's because your *.cpp file is probably saved in Unicode; the compiler sees ∞ as a 2-byte UTF-16 wchar_t ('\u221E') and can't convert it to char.
printf("\xEC") might print ∞, but only on English systems, and only if the console font feels like interpreting it that way. Even if it works, it may stop working tomorrow if you change some small setting. This relies on the old ANSI/OEM code pages, which have many problems; that's why Unicode was brought in.
You can use this alternate solution in Visual Studio with C++20:
SetConsoleOutputCP(CP_UTF8);
printf((const char*)u8"∞");
And this solution in Visual Studio and C++17:
SetConsoleOutputCP(CP_UTF8);
printf(u8"∞");
And who knows how it's going to change later.
But UTF-16 is at least natively supported, and wprintf/wcout with _setmode are more consistent.
I am looking to change the Windows desktop background wallpaper in C++ using the Windows API.
I have read the following posts on this topic:
How to change desktop background using VC++
SystemParametersInfo sets wallpaper completly black (using SPI_SETDESKWALLPAPER)
Problem:
When I execute the code, the desktop background changes to completely black like in the post above (yes, I did try the suggested fix in that post. No luck.)
Code:
#include <windows.h>
#include <string>

int main() {
    std::string s = "C:\\picture.jpg";
    SystemParametersInfo(SPI_SETDESKWALLPAPER, 0, (PVOID*)s.c_str(), SPIF_SENDCHANGE);
    return 0;
}
I have also tried just (void*) instead of (PVOID*) above and an L in front of the string. Nothing works so far.
SOLVED:
Changing SystemParametersInfo to SystemParametersInfoA (as suggested in the comment and answer) did the trick.
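For reference, here is a minimal sketch of the working ANSI call, keeping the narrow std::string from the question (the SPIF_UPDATEINIFILE flag is borrowed from the answer below and is an assumption, not part of the original fix):
#include <windows.h>
#include <string>

int main() {
    std::string s = "C:\\picture.jpg";
    // ANSI variant: takes a narrow (char) string, matching std::string.
    // SPIF_UPDATEINIFILE persists the change; it is an assumption here,
    // the question originally passed only SPIF_SENDCHANGE.
    SystemParametersInfoA(SPI_SETDESKWALLPAPER, 0, (void*)s.c_str(),
                          SPIF_UPDATEINIFILE | SPIF_SENDCHANGE);
    return 0;
}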
I believe you should use a wchar_t string as input for SystemParametersInfo() instead of a std::string, and also use SystemParametersInfoW().
The following code worked for me:
#include <windows.h>
#include <iostream>
int main() {
    const wchar_t *path = L"C:\\image.png";
    int result;
    result = SystemParametersInfoW(SPI_SETDESKWALLPAPER, 0, (void *)path, SPIF_UPDATEINIFILE);
    std::cout << result;
    return 0;
}
Here result will be nonzero (true) if the call manages to change the background.
Unlike English and other Latin-script languages, Hebrew, in its awesomeness, is written right to left. I want to get this message across to the computer realm.
I understand I have to use unicode/wchar for the actual string.
I understand I have to set the console to support unicode/wchar.
I understand I have to tell the console to use a supported font.
So I run this code:
#include "stdafx.h" // MS Visual Studio precompiled header file
#include <fcntl.h> // for _setmode
#include <io.h> // for _setmode
#include <windows.h> // for SetCurrentConsoleFontEx
int main()
{
// use unicode/wchar as the actual string.
wchar_t *hebrewString = L"עברית";
// set the console to support unicode/wchar.
_setmode(_fileno(stdout), _O_U16TEXT);
// tell the console to use a supported font.
CONSOLE_FONT_INFOEX info = { 0 };
info.cbSize = sizeof(info);
info.dwFontSize.Y = 20;
wcscpy_s(info.FaceName, L"Courier New");
SetCurrentConsoleFontEx(GetStdHandle(STD_OUTPUT_HANDLE), NULL, &info);
// print
wprintf(L"%s", hebrewString);
// wait for input, so that console won't disappear immediately
getchar();
// return
return 0;
}
This isn't bad, but the actual text prints in reverse order.
Is there a way to properly configure it so that it will print in the right order, or do I have to manually flip the string before passing it to the console?
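For what it's worth, a minimal sketch of the manual flip mentioned above, assuming a plain Hebrew string with no combining marks, digits, or mixed-direction text (the console performs no bidi reordering, so reversing by hand is a crude workaround rather than a real solution):
#include <stdio.h>     // wprintf
#include <string>      // std::wstring
#include <algorithm>   // std::reverse
#include <fcntl.h>     // _O_U16TEXT
#include <io.h>        // _setmode

int main()
{
    _setmode(_fileno(stdout), _O_U16TEXT);

    std::wstring hebrew = L"עברית";
    // Crude workaround: reverse the code units so a left-to-right console
    // shows the letters in visual (right-to-left) order. This breaks for
    // combining marks, digits, or mixed Hebrew/Latin text.
    std::reverse(hebrew.begin(), hebrew.end());

    wprintf(L"%s\n", hebrew.c_str());
    return 0;
}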
I'm relatively new to WinAPI programming in C++. I'm trying to write a program that will obtain the system hostname using GetComputerName(). Ideally, I want the code to be able to work on English and non-English systems. Below is the code that I'm using:
#include <windows.h>
#include <iostream>
#include <string>

using namespace std;

int main()
{
    wstring hostname;
    wchar_t nbtName[MAX_COMPUTERNAME_LENGTH + 1];
    DWORD length = MAX_COMPUTERNAME_LENGTH + 1;
    GetComputerName(nbtName, &length);
    hostname = nbtName;
    wcout << hostname << endl;
    return 0;
}
The code works fine on my English Windows 7 system, but the output doesn't display properly on my German Windows 7 system (which uses German characters in the hostname). I thought that wstring and wchar_t could handle these special characters. Here's what's displayed on my German Windows 7 system:
COMPUTER-Í─▄▀
Am I overlooking something stupid? Thanks!
Use _setmode(_fileno(stdout), _O_U16TEXT) to show Unicode in the console window:
#include <iostream>
#include <string>
#include <io.h> //for _setmode
#include <fcntl.h> //for _O_U16TEXT
int main()
{
    _setmode(_fileno(stdout), _O_U16TEXT);
    std::wcout << L"ελληνικά\n";
    return 0;
}
Or use MessageBoxW(0, hostname.c_str(), 0, 0) or OutputDebugStringW to see the Unicode text displayed correctly.
This function has two versions: GetComputerNameW for Unicode and GetComputerNameA for ANSI. If UNICODE is defined in your environment, the first one will be called. So either make sure UNICODE is defined, or call GetComputerNameW directly.
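A minimal sketch of calling the wide version explicitly, shown here with MessageBoxW (from the previous answer) since the console rendering was the real problem:
#include <windows.h>

int main()
{
    // Call the W version directly so the result is UTF-16 regardless of
    // whether UNICODE is defined for the project.
    wchar_t name[MAX_COMPUTERNAME_LENGTH + 1];
    DWORD length = MAX_COMPUTERNAME_LENGTH + 1;
    if (GetComputerNameW(name, &length))
        MessageBoxW(NULL, name, L"Computer name", MB_OK);
    return 0;
}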
Windows Console output was the culprit. The Unicode characters display correctly in other non-console output. Thanks everyone!
I'm trying to create a file on Windows using a Chinese character. The entire path is in the variable std::string originalPath; however, I have a charset problem that I simply cannot figure out how to overcome.
I have written the following code:
#include <iostream>
#include <cstdlib>   // for srand
#include <ctime>     // for time
#include <boost/locale.hpp>
#include <boost/filesystem/fstream.hpp>
#include <windows.h>

int main( int argc, char *argv[] )
{
    // Start the rand
    srand( time( NULL ) );

    // Create and install global locale
    std::locale::global( boost::locale::generator().generate( "" ) );

    // Make boost.filesystem use it
    boost::filesystem::path::imbue( std::locale() );

    // Check if set to utf-8
    if( std::use_facet<boost::locale::info>( std::locale() ).encoding() != "utf-8" ){
        std::cerr << "Wrong encoding" << std::endl;
        return -1;
    }

    std::string originalPath = "C:/test/s/一.png";

    // Convert to wstring (**WRONG!**)
    std::wstring newPath( originalPath.begin(), originalPath.end() );

    LPWSTR lp = (LPWSTR)newPath.c_str();

    CreateFileW( lp, GENERIC_READ | GENERIC_WRITE, FILE_SHARE_READ |
                 FILE_SHARE_WRITE, NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL );

    return 0;
}
Running it, however, I get inside the folder "C:\test\s" a file named "¦ᄌタ.png" instead of the "一.png" I want. The only way I found to overcome this is to exchange the lines
std::string originalPath = "C:/test/s/一.png";
// Convert to wstring (**WRONG!**)
std::wstring newPath( originalPath.begin(), originalPath.end() );
to simply
std::wstring newPath = L"C:/test/s/一.png";
In this case the file "一.png" appears perfectly inside the folder "C:\test\s". Nonetheless, I cannot do that, because the software gets its path from a std::string variable. I think the conversion from std::string to std::wstring is being performed the wrong way, but as you can see, I'm having a hard time understanding this logic. I read and researched exhaustively on Google and read many good texts, but all my attempts seem to be useless. I tried the MultiByteToWideChar function and also boost::filesystem, but neither helped; I simply cannot get the right filename when it is written to the folder.
I'm still learning, so I'm very sorry if I'm making a dumb mistake. My IDE is Eclipse and it is set up to UTF-8.
You need to actually convert the UTF-8 string to UTF-16. For that you have to look up how to use boost::locale::conv or (on Windows only) the MultiByteToWideChar function.
std::wstring newPath( originalPath.begin(), originalPath.end() ); won't work; it simply copies the bytes one by one and casts each of them to a wchar_t.
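A minimal sketch of the MultiByteToWideChar route, assuming the std::string really holds UTF-8 (the helper name Utf8ToUtf16 is made up for illustration):
#include <windows.h>
#include <string>

// Hypothetical helper: convert a UTF-8 encoded std::string to UTF-16.
std::wstring Utf8ToUtf16(const std::string& utf8)
{
    if (utf8.empty()) return std::wstring();
    // First call with a null output buffer asks for the required length.
    int needed = MultiByteToWideChar(CP_UTF8, 0, utf8.data(),
                                     (int)utf8.size(), NULL, 0);
    std::wstring utf16(needed, L'\0');
    // Second call performs the actual conversion into the buffer.
    MultiByteToWideChar(CP_UTF8, 0, utf8.data(), (int)utf8.size(),
                        &utf16[0], needed);
    return utf16;
}

// Usage with the code above:
//   std::wstring newPath = Utf8ToUtf16(originalPath);
//   CreateFileW(newPath.c_str(), ...);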
Thank you for your help, roeland. I finally managed to find a solution: I simply used the following library: http://utfcpp.sourceforge.net/. I used the function utf8::utf8to16 to convert my original UTF-8 string to UTF-16, which allows Windows to display the Chinese characters correctly.
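A minimal sketch of how that conversion with utfcpp might look (the include path, and the assumption that wchar_t is 16 bits wide, are Windows-specific):
#include <windows.h>
#include <string>
#include <iterator>
#include "utf8.h" // utfcpp; the exact include path depends on how the library is installed

int main()
{
    std::string originalPath = "C:/test/s/一.png"; // UTF-8, source saved as UTF-8

    // utf8::utf8to16 writes UTF-16 code units through an output iterator;
    // on Windows wchar_t is 16 bits, so a std::wstring can receive them.
    std::wstring newPath;
    utf8::utf8to16(originalPath.begin(), originalPath.end(),
                   std::back_inserter(newPath));

    HANDLE h = CreateFileW(newPath.c_str(), GENERIC_READ | GENERIC_WRITE,
                           FILE_SHARE_READ | FILE_SHARE_WRITE, NULL,
                           CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
    if (h != INVALID_HANDLE_VALUE)
        CloseHandle(h);
    return 0;
}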