I would like to know how to change the console title in C++ using a string as the new parameter.
I know you can use the SetConsoleTitle function of the Win32 API but that does not take a string parameter.
I need this because I am doing a Java native interface project with console effects and commands.
I am using Windows and it only has to be compatible with Windows.
The SetConsoleTitle function does indeed take a string argument. It's just that the kind of string depends on whether UNICODE is defined or not.
You have to use e.g. the _T (or TEXT) macro to make sure the literal string is of the correct format (wide character or single byte):
SetConsoleTitle(_T("Some title"));
If you are using e.g. std::string things get a little more complicated, as you might have to convert between std::string and std::wstring depending on the UNICODE macro.
One way of not having to do that conversion is to always use only std::string if UNICODE is not defined, or only std::wstring if it is defined. This can be done by adding a typedef in the "stdafx.h" header file:
#ifdef UNICODE
typedef std::wstring tstring;
#else
typedef std::string tstring;
#endif
If your problem is that SetConsoleTitle doesn't take a std::string (or std::wstring), it's because it has to be compatible with C programs, which don't have the string classes (or classes at all). In that case you use the c_str() member function of the string classes to get a pointer that can be passed to functions that require old-style C strings:
tstring title = _T("Some title");
SetConsoleTitle(title.c_str());
There's also another solution, and that is to use the explicit narrow-character "ASCII" version of the function, which has an A suffix:
SetConsoleTitleA("Some title");
There's of course also a wide-character variant, with a W suffix:
SetConsoleTitleW(L"Some title");
std::wstring str(L"Console title");
SetConsoleTitleW(str.c_str());
The question is old, but you can also do it with the system() function:
#include <cstdlib>
int main() {
    std::system("title This is a title");
}
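If the title needs to be built at run time, a minimal sketch (the variable name is made up) is to concatenate it into the command string first:
#include <cstdlib>
#include <string>
int main() {
    // Hypothetical runtime value; "title" is cmd.exe's built-in command.
    std::string name = "My Application";
    std::string cmd = "title " + name;
    std::system(cmd.c_str());
}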
I'm working on a cross-platform project in C++ and I have a variable of type std::string that I need to convert to const TCHAR *. What is the proper way to do this? Maybe there are functions in some library?
UPD 1: As I see in the function definition, there are separate Windows and non-Windows implementations:
#if defined _MSC_VER || defined __MINGW32__
#define _tinydir_char_t TCHAR
#else
#define _tinydir_char_t char
#endif
So is there really no way to pass the parameter from a std::string without splitting the implementation?
Proper way to convert from std::string to 'const TCHAR *' cross-platform
TCHAR should not be used in cross-platform programs at all, except of course when interacting with Windows API calls; but those need to be abstracted away from the rest of the program, or else it won't be cross-platform. So you only need to convert between TCHAR strings and char strings in Windows-specific code.
The rest of the program should use char, and preferably assume that it contains UTF-8 encoded strings. If user input or system calls return strings that are in a different encoding, you need to figure out what that encoding is and convert accordingly.
The character encoding conversion functionality of the C++ standard library is rather weak, so it is not of much use here. You can implement the conversion according to the encoding specification, or you can use a third-party implementation, as always.
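For example, a minimal sketch of the Windows-specific side of that conversion (assuming the std::string holds UTF-8; error handling omitted) could use MultiByteToWideChar:
#include <string>
#include <Windows.h>
// Convert a UTF-8 std::string to a UTF-16 std::wstring for Windows API calls.
// MultiByteToWideChar returns 0 on failure; error handling is omitted here.
std::wstring utf8_to_wide(const std::string& s)
{
    if (s.empty()) return std::wstring();
    int len = MultiByteToWideChar(CP_UTF8, 0, s.data(), (int)s.size(), nullptr, 0);
    std::wstring out(len, L'\0');
    MultiByteToWideChar(CP_UTF8, 0, s.data(), (int)s.size(), &out[0], len);
    return out;
}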
Maybe there are functions in some library?
I recommend this.
As I see in the function definition, there are separate Windows and non-Windows implementations
The library that you use doesn't provide a uniform API to different platforms, so it cannot be used in a truly cross-platform way. You can write a wrapper library with uniform function declarations that handles the character encoding conversion on platforms that need it.
Or, you can use another library, which provides a uniform API and converts the encoding transparently.
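As a rough illustration (the wrapper name and the commented-out library calls are hypothetical), such a wrapper could accept UTF-8 everywhere and widen only on Windows, reusing a helper like the utf8_to_wide sketched above:
#include <string>
// Hypothetical uniform wrapper around a TCHAR-based library call.
// Callers always pass UTF-8; widening happens only where Windows needs it.
bool open_directory(const std::string& utf8_path)
{
#if defined(_WIN32) && defined(UNICODE)
    std::wstring native = utf8_to_wide(utf8_path); // helper sketched above
    // ... call the library with native.c_str() (a const TCHAR*) ...
#else
    // ... call the library with utf8_path.c_str() ...
#endif
    return true; // placeholder; real code would return the library's result
}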
TCHAR is a Windows type and it is defined this way:
#ifdef UNICODE
typedef wchar_t TCHAR, *PTCHAR;
#else
typedef char TCHAR, *PTCHAR;
#endif
The UNICODE macro is typically defined in the project settings (when you use a Visual Studio project on Windows).
You can get the const TCHAR* from a std::string (which is ASCII or UTF-8 in most cases) this way:
#include <string>
#include <locale>
#include <codecvt>   // std::wstring_convert, std::codecvt_utf8_utf16
#include <Windows.h> // TCHAR

std::string s("hello world");
const TCHAR* pstring = nullptr;
#ifdef UNICODE
// Widen the UTF-8 string to UTF-16 before taking the pointer.
std::wstring_convert<std::codecvt_utf8_utf16<wchar_t>> converter;
std::wstring wstr = converter.from_bytes(s);
pstring = wstr.data();
#else
pstring = s.data();
#endif
pstring will be the result.
But using TCHAR on other platforms is strongly discouraged. It's better to use UTF-8 strings (char*) inside std::string.
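If the conversion is needed in more than one place, one possible refinement (a sketch, not part of the original answer; note that std::wstring_convert is deprecated since C++17) is a helper that returns an owning std::basic_string<TCHAR> instead of a raw pointer:
#include <string>
#include <locale>
#include <codecvt>
#include <Windows.h>
using tstring = std::basic_string<TCHAR>;
// Returns an owning TCHAR string, so no pointer outlives the wstring it came from.
tstring to_tstring(const std::string& s)
{
#ifdef UNICODE
    std::wstring_convert<std::codecvt_utf8_utf16<wchar_t>> converter;
    return converter.from_bytes(s);
#else
    return s;
#endif
}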
I came across boost.nowide the other day. I think it will do exactly what you want.
http://cppcms.com/files/nowide/html/
As others have pointed out, you should not be using TCHAR except in code that interfaces with the Windows API (or libraries modeled after the Windows API).
Another alternative is to use the character conversion classes/macros defined in atlconv.h. CA2T will convert an 8-bit character string to a TCHAR string. CA2CT will convert to a const TCHAR string (LPCTSTR). Assuming your 8-bit strings are UTF-8, you should specify CP_UTF8 as the code page for the conversion.
If you want to declare a variable containing a TCHAR copy of a std::string:
CA2T tstr(stdstr.c_str(), CP_UTF8);
If you want to call a function that takes an LPCTSTR:
FunctionThatTakesString(CA2CT(stdstr.c_str(), CP_UTF8));
If you want to construct a std::string from a TCHAR string:
std::string mystdstring(CT2CA(tstr, CP_UTF8));
If you want to call a function that takes an LPTSTR then maybe you should not be using these conversion classes. (But you can if you know that the function you are calling does not modify the string outside its current length.)
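Put together, a minimal self-contained sketch (assuming an ATL/MFC project where atlconv.h is available; the function names are placeholders) might look like this:
#include <atlconv.h>
#include <string>
void example(const std::string& stdstr)
{
    CA2T tstr(stdstr.c_str(), CP_UTF8);      // UTF-8 -> writable TCHAR copy
    std::string back(CT2CA(tstr, CP_UTF8));  // TCHAR -> UTF-8 std::string
    // For an LPCTSTR parameter a temporary is enough:
    // FunctionThatTakesString(CA2CT(stdstr.c_str(), CP_UTF8));
}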
I have been playing with writing Unicode code using Windows API and have found one thing particularly frustrating.
I made a simple interface for wrapping "MessageBox" into an "alert", reducing the number of arguments needed to call it from 4 to 1.
The issue is that this one argument MUST be called with L"My string", which I want to avoid.
Here is my interface so far:
UseCase.hpp
#pragma once
#include <string>
#include <iostream>
/**
* The User should not need to know any Windows API ideally.
* They will have to know L"Str" notation though...
*/
namespace UseCase
{
void alert(const wchar_t *message);
};
UseCase.cpp
#include <Windows.h>
#include "UseCase.hpp"
/**
* Kind-of like javascript:alert(message).
* This is a very common usecase, and heavily simplifies call convention.
*/
void UseCase::alert(const wchar_t *message)
{
MessageBox(0, message, L"message box", MB_OK | MB_ICONEXCLAMATION);
}
main.cpp
#include "UseCase.hpp"
using namespace UseCase;
const wchar_t *msg = L"Привет мир";
int wmain(int argc, wchar_t **argv)
{
alert(msg);
return 0;
}
My concern is that no matter how the main tries to call alert, it has to use L"String" notation which is visually annoying to me.
To elaborate why L is annoying to me, it is for two reasons:
It treats rvalue string literals differently from variables containing an ASCII string.
If you try to call it on an ASCII string "hello, world", it will give a confusing error message to an average user.
When it's obvious that the string cannot be stored as ASCII and you are trying to assign it to a Unicode string, there shouldn't be much standing in the way of an automatic conversion.
Is there a way to get rid of the L"String" convention and make the program automatically treat the function input as wide character input?
Current viable solutions are:
Create a macro for calling the message box with a string literal that wraps it in L for you. The issue with this approach is the cryptic error messages on variables passed into the function, as L cannot convert variables from ASCII to Unicode for you.
Others have been proposed, but I haven't fully wrapped my head around implementing them:
Have the user register their string externally and store it as Unicode inside a file/binary cache. I can then provide an interface to load this string as a wide string and call the function using it.
If you want to be able to pass Unicode strings (i.e. use MessageBoxW, which takes wide-character strings), but you want to be able to use plain string literals without the L prefix, you'll need to decide on an encoding to use for the Unicode characters in a narrow string, and then perform a run-time conversion to a wide string according to the encoding.
UTF-8 might be a reasonable start. See this answer for how to convert from/to UTF-8 using standard Windows API.
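For example, one possible sketch (this overload is not part of the original interface; it assumes the narrow strings are UTF-8) is an alert overload that widens the text and forwards it to the existing wide-character version:
#include <string>
#include <Windows.h>
#include "UseCase.hpp"
namespace UseCase
{
    // Hypothetical overload: accept UTF-8, widen it, forward to alert(const wchar_t*).
    void alert(const std::string& utf8)
    {
        int len = MultiByteToWideChar(CP_UTF8, 0, utf8.data(), (int)utf8.size(), nullptr, 0);
        std::wstring wide(len, L'\0');
        MultiByteToWideChar(CP_UTF8, 0, utf8.data(), (int)utf8.size(), &wide[0], len);
        alert(wide.c_str()); // original const wchar_t* version
    }
}
With that in place, a call like alert("some text") works with a plain literal, provided the source file is actually saved as UTF-8 (on MSVC the /utf-8 switch makes this explicit).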
You can get around this by making the actual call to alert a preprocessor macro, and using the _TEXT() macro provided by MS (or figure out how to do token pasting L ## msg yourself correctly.)
#define DMITRY_ALERT(msg) UseCase::alert(_TEXT(msg))
This will "magically" insert the L in any UNICODE build, at the cost of having to use a macro wrapper (that needs to have an UGLY_LONG_NAME to prevent clashes).
Note: That macro can then only be called with string literals, not with a variable.
Whether that is worth the trouble seems doubtful to me.
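For what it's worth, a call site would then look like this (string literals only; a variable argument would not compile):
DMITRY_ALERT("Something happened"); // expands to UseCase::alert(L"Something happened") in a Unicode build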
I have to work with an API that uses Microsoft's TCHAR macros and such, so I was wondering if I could use C++ in a way that simplifies the task. Specifically, is there a way to support implicit conversion, and why doesn't std::string support converting from a smaller char size?
#include <Windows.h>
#include <string>
using String = std::basic_string<TCHAR>; // say TCHAR = wchar_t or equivalent
String someLiteralString = "my simple ansi string"; // Error here obviously
// some polymorphic class...
const TCHAR* MyOverriddenFunction() override { return someLiteralString.c_str(); }
// end some polymorphic class
The reason implicit conversion isn't supported is that conversion can be complicated. The simple case is when the string to convert is pure ASCII as in your example, but there's no way to guarantee that. The creators of the standard wisely stayed away from that problem.
If you don't know whether your strings are wide-character or not, you can use Microsoft's _T() macro around each string literal to generate the proper characters. But you say you don't want to do that.
Modern Windows programming always uses wide characters in the API. Chances are your program is too, otherwise the code you've shown would not cause an error. It's very unlikely that once you've used wide characters you'll switch back to narrow ones. A simple one-character change to your literals will make them wide-character to match the string type:
String someLiteralString = L"my simple ansi string";
Use the (ATL/MFC) CStringT class, it will make your life much easier.
http://msdn.microsoft.com/en-us/library/ms174284(v=vs.80).aspx
Since I got my hands on the new RAD Studio XE4, I thought I'd give it a try.
Unfortunately, I am not very experienced with C++, so I was wondering why code that works perfectly fine in VC++ doesn't work at all in C++ Builder.
Most of the problems are about converting between different variable types.
For example :
std::string Test = " ";
GetFileAttributes(Test.c_str());
works in VC++ but in C++ Builder it won't compile, telling me "E2034 Cannot convert 'const char *' to 'wchar_t *'".
Am I missing something? What is the reason this doesn't work the same on all compilers?
Thanks
Welcome to Windows Unicode/ASCII hell.
The function
GetFileAttributes
is actually a macro defined to either GetFileAttributesA or GetFileAttributesW depending on if you have _UNICODE (or was it UNICODE, or both?) defined when you include the Windows headers. The *A variants take char* and related arguments, the *W functions take wchar_t* and related arguments.
I suggest calling only the wide *W variants directly in new code. This would mean switching to std::wstring for Windows only code and some well-thought out design choices for a cross-platform application.
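A minimal sketch of that approach (the helper name and the example path are made up):
#include <Windows.h>
#include <string>
// Windows-only code: keep paths in std::wstring and call the W variant explicitly.
bool file_exists(const std::wstring& path)
{
    DWORD attrs = GetFileAttributesW(path.c_str());
    return attrs != INVALID_FILE_ATTRIBUTES;
}
// e.g. file_exists(L"C:\\temp\\test.txt");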
Your C++ Builder config is set to use UNICODE character set, which means that Win32 APIs are resolved to their wide character versions. Therefore you need to use wide char strings in your C++ code. If you would set your VS config to use UNICODE, you would get the same error.
You can try this:
// wstring = basic_string<wchar_t>
// _T makes the literal a wide-char literal when _UNICODE is defined (as it is in a Unicode build)
std::wstring Test = _T(" ");
GetFileAttributes(Test.c_str()); // c_str now returns const wchar_t*, not const char*
See more details about _T/_TEXT macros here: http://docwiki.embarcadero.com/RADStudio/XE3/en/TCHAR_Mapping
You have defined _UNICODE and/or UNICODE in Builder and not defined it in VC.
Most Windows APIs come in 2 flavours: the ANSI flavour and the UNICODE flavour.
For example, when you call SetWindowText, there really is no SetWindowText function. Instead there are 2 different functions:
- SetWindowTextA which takes an ANSI string
and
- SetWindowTextW which takes a UNICODE string.
If your program is compiled with /DUNICODE /D_UNICODE, SetWindowText maps to SetWindowTextW, which expects a const wchar_t *.
If your program is compiled without these macros defined, it maps to SetWindowTextA which takes a const char *.
The windows headers typically do something like this to make this happen.
#ifdef UNICODE
#define SetWindowText SetWindowTextW
#else
#define SetWindowText SetWindowTextA
#endif
Likewise, there are 2 GetFileAttributes.
DWORD WINAPI GetFileAttributesA(LPCSTR lpFileName);
DWORD WINAPI GetFileAttributesW(LPCWSTR lpFileName);
In VC, you haven't defined UNICODE/_UNICODE & hence you are able to pass string::c_str() which returns a const char *.
In Builder, you probably have defined UNICODE/_UNICODE & it expects a wchar_t *.
You may not have done this UNICODE/_UNICODE thing explicitly - maybe the IDE is doing it for you - so check the options in the IDE.
You have many ways of fixing this
find the UNICODE/_UNICODE option in the IDE and disable it.
or
use std::wstring - then c_str() will return a const wchar_t *
or
Call GetFileAttributesA directly instead of GetFileAttributes - you will need to do this for every other Windows API that comes with these 2 variants. (The last two options are sketched below.)
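For illustration, the last two options look like this (the file name is just a placeholder):
#include <Windows.h>
#include <string>
void example()
{
    // Option 2: keep wide strings; c_str() returns const wchar_t*.
    std::wstring wide_name = L"test.txt";
    GetFileAttributesW(wide_name.c_str());
    // Option 3: keep narrow strings and call the ANSI variant explicitly.
    std::string narrow_name = "test.txt";
    GetFileAttributesA(narrow_name.c_str());
}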
I want to make the hostname part of this string variable.
Currently, it is fixed to this URL:
_T(" --url=http://www.myurl.com/ --out=c:\\current.png");
I want to make something like this, so the URL is changeable:
_T(" --url=http://www." + myurl + "/ --out=c:\\current.png");
Update: Below is my latest attempt:
CString one = _T(" --url=http://www.");
CString two(url->bstrVal);
CString three = _T("/ --out=c:\\current.png");
CString full = one + two + three;
ShellExecute(0,
_T("open"), // Operation to perform
_T("c:\\IECapt"), // Application name
_T(full),// Additional parameters
0, // Default directory
SW_HIDE);
The error is : Error 1 error C2065: 'Lfull' : undeclared identifier c:\test.cpp
It doesn't work because the _T() macro works only with constant string literals. The definition for _T() looks something like this:
#ifdef UNICODE
#define _T(str) L##str
#else
#define _T(str) str
#endif
Since you're apparently compiling in Unicode mode, _T(full) expands to Lfull, which is obviously not what you want.
In your case, just pass in full without the _T() macro since CString defines a conversion operator to a const wchar_t* in Unicode mode, and const char* in non-Unicode mode.
ShellExecute(0, _T("open"), _T("c:\\IECapt"), full, 0, SW_HIDE);
Note that standard C++ also provides a std::string type and a std::wstring type, which do pretty much what CString does, so MFC isn't actually required for string manipulation. std::string does not provide a conversion operator, but does provide access to the underlying C-style string via c_str().
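For completeness, here is a sketch of the same call using std::wstring instead of CString (assuming a Unicode build; the host parameter stands in for url->bstrVal from the question):
#include <Windows.h>
#include <shellapi.h>
#include <string>
void launch(const wchar_t* host) // e.g. url->bstrVal
{
    std::wstring full = L" --url=http://www." + std::wstring(host)
                      + L"/ --out=c:\\current.png";
    ShellExecuteW(0, L"open", L"c:\\IECapt", full.c_str(), 0, SW_HIDE);
}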
Your latest attempt will work if you lose _T() from around "full" in the call to ShellExecute. The CString will return the correct pointer. Job done.
In addition though... You should just lose the _T() stuff completely. If you're going to do things like reference url->bstrVal directly (which is Unicode, no matter what you're compiling for) then your code will only work as Unicode.
There's very little reason anymore to compile the same project for both Unicode and ANSI these days. The _T() stuff was created to address both modes "easily". But unless you're targeting Win95/98/ME, you can just go Unicode and clean up your code. Unicode is faster too, because the Windows API and the Kernel is Unicode internally. All ANSI APIs start by converting string parameters to Unicode first, and then calling their wide-char counterpart.
So no _T, TCHAR, etc. Use things like this instead:
PCWSTR psz = L"my unicode string";
CString s = L"my other string, yay!";
In addition to what In silico said, CString::Format makes this code a lot more readable:
CString full;
full.Format(_T(" --url=http://www.%s/ --out=c:\\current.png"), url->bstrVal);