The following code
string exePath() {
    string path;
    char buffer[MAX_PATH];
    cout << "reading path\n";
    GetModuleFileName(NULL, buffer, MAX_PATH);
    string::size_type pos = string(buffer).find_last_of("\\/");
    path = string(buffer).substr(0, pos)/*+"\\system.exe"*/;
    return path;
}
gives me an error in Visual Studio at the second parameter (buffer):

the type "char *" argument is incompatible with the type parameter "LPWSTR"

(translated from Italian; I have VS in Italian, hope you understand)
and

cannot convert char to LPWSTR in the second argument

This code compiles fine with Code::Blocks and Dev-C++, but in VS it doesn't.
Because GetModuleFileName is a macro that expands to GetModuleFileNameW when UNICODE is defined, and that function requires a wchar_t*. If UNICODE is not defined, GetModuleFileName resolves to GetModuleFileNameA instead. The simple fix for you is to call GetModuleFileNameA explicitly, since it accepts char*.
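For example, a minimal sketch of your function with the explicit ANSI call (same logic as your original, just with the A suffix spelled out):

#include <windows.h>
#include <iostream>
#include <string>
using namespace std;

string exePath() {
    char buffer[MAX_PATH];
    cout << "reading path\n";
    // Explicit ANSI version: always takes char*, regardless of UNICODE
    ::GetModuleFileNameA(NULL, buffer, MAX_PATH);
    string path(buffer);
    string::size_type pos = path.find_last_of("\\/");
    return path.substr(0, pos);
}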
This code compiles fine with Code::Blocks and Dev-C++, but in VS it doesn't

Most probably because with Code::Blocks you compile without UNICODE defined, while under VS UNICODE is defined by default.
If you are not using Unicode and want to stay with multi-byte strings (the same as in Code::Blocks), you can disable UNICODE in your Visual Studio build by changing, in the project properties on the General tab, Character Set to Use Multi-Byte Character Set.
[edit]
To avoid problems with Unicode characters in the path, it is recommended to use UNICODE. This requires you to use std::wstring, which is the specialization for wchar_t; likewise, instead of char buffer[MAX_PATH]; you should use wchar_t buffer[MAX_PATH];, and cout << "reading path\n"; would have to become wcout << L"reading path\n";. You can also use macros like TCHAR or _T("reading path") to make the code work in both UNICODE and non-UNICODE builds.
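For illustration, a sketch of the same function built on the wide-character types (this is the shape the Unicode route takes):

#include <windows.h>
#include <iostream>
#include <string>

std::wstring exePath() {
    wchar_t buffer[MAX_PATH];
    std::wcout << L"reading path\n";
    // Wide version: takes wchar_t*, matching std::wstring
    ::GetModuleFileNameW(NULL, buffer, MAX_PATH);
    std::wstring path(buffer);
    std::wstring::size_type pos = path.find_last_of(L"\\/");
    return path.substr(0, pos);
}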
I believe this is related: How can I print wchar_t values to the console?. You must also prefix your string literals with L to make them wide-char, like this:

std::wcout << L"reading path" << std::endl;
This will work well if your application is for Windows only; otherwise you should use the TEXT("reading path") macro style, because even though that macro is not there by default on other platforms, you can add it easily.
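As a sketch of that idea (my own convention, not something other platforms provide): when the macro is missing, define it yourself so the same source builds everywhere.

#ifndef _WIN32
#define TEXT(s) s  // off Windows, fall back to plain narrow literals
#endif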
Related
I'm having a hard time converting a string into LPWSTR so I can use the PathStripToRoot() function.

Well, for one, the MSDN documentation says I need an LPTSTR variable (http://msdn.microsoft.com/en-us/library/windows/desktop/bb773757(v=vs.85).aspx), but Visual Studio 2013 says I need LPWSTR.
Here is a code snippet of my function:
fileStat fileCreate(const string& targetFile)
{
    fileStat filez;
    fstream file(targetFile.c_str());
    if (!file)
    {
        cout << "File does not exist" << endl;
    }
    std::ifstream in(targetFile, ios::binary | ios::ate);
    int a = in.tellg();
    cout << "File size(bytes): " << in.tellg() << endl << endl;
    file.close();

    wstring stemp = strChange(targetFile);
    LPCWSTR result = stemp.c_str();
    /* Tried the below code but that did not work
    LPWSTR ws = new wchar_t[targetFile.size() + 1];
    copy(targetFile.begin(), targetFile.end(), ws);
    ws[targetFile.size()] = 0;
    */
    cout << "\n\n" << PathStripToRoot(ws) << "\n\n";
    ...
    filez.fileSize = a;
    return filez;
}
A lot of people have said to use the MultiByteToWideChar() function, but I looked at the MSDN documentation and have no idea how it works. Is there an easier way than using MultiByteToWideChar()?
You may want to use Unicode UTF-16 strings in modern Windows applications when dealing with Win32 APIs: the std::wstring class (based on wchar_t) is OK for that with Visual C++.
Then, you can wrap the Win32 C API PathStripToRoot() in some C++ code, using convenient string classes instead of raw C-like string buffers.
Consider the following commented code as an example:
// Set Unicode mode
#define UNICODE
#define _UNICODE

// Windows SDK Headers
#include <Windows.h>   // Win32 Platform SDK
#include <Shlwapi.h>   // For PathStripToRoot()
#include <Strsafe.h>   // For StringCchCopy()

// Standard C++ Headers
#include <exception>   // For std::exception
#include <iostream>    // For console output
#include <stdexcept>   // For std::invalid_argument, std::runtime_error
#include <string>      // For std::wstring

// For using PathStripToRoot()
#pragma comment(lib, "Shlwapi.lib")

// C++ wrapper around PathStripToRoot() Win32 API
std::wstring RootFromPath(const std::wstring& path)
{
    // Buffer for PathStripToRoot()
    wchar_t pathBuffer[MAX_PATH];

    // Copy the input string into the buffer.
    // Beware of buffer overruns!
    HRESULT hr = ::StringCchCopy(pathBuffer,            // dest
                                 _countof(pathBuffer),  // dest size
                                 path.c_str());         // source
    if (hr == STRSAFE_E_INSUFFICIENT_BUFFER)
    {
        // Copy failed due to insufficient buffer space.
        // May accept this case or throw an exception
        // based on the context...
        // In this case, I just throw here.
        throw std::invalid_argument("RootFromPath() - Path string too long.");
    }
    if (hr != S_OK)
    {
        throw std::runtime_error("RootFromPath() - StringCchCopy failed.");
    }

    // Call the Win32 C API using the raw C buffer
    if (! ::PathStripToRoot(pathBuffer))
    {
        // No valid drive letter was found.
        // Return an empty string
        return std::wstring();
    }

    // Return a std::wstring with the buffer content
    return std::wstring(pathBuffer);
}

// Test
int main()
{
    try
    {
        const std::wstring path = L"C:\\Path1\\Path2";
        const std::wstring root = RootFromPath(path);

        std::wcout << "The content of the path before is:\t" << path << std::endl;
        std::wcout << "RootFromPath() returned: \t" << root << std::endl;
    }
    catch (const std::exception& ex)
    {
        std::cerr << "\n*** ERROR: " << ex.what() << std::endl;
    }
}
Compiled from command line:
C:\Temp\CppTests>cl /EHsc /W4 /nologo TestPathStripToRoot.cpp
Output:
C:\Temp\CppTests>TestPathStripToRoot.exe
The content of the path before is: C:\Path1\Path2
RootFromPath() returned: C:\
On that particular point of your question:
Well, for one, the MSDN documentation says I need an LPTSTR variable, but Visual Studio says I need LPWSTR.
LPTSTR is a typedef equivalent to TCHAR*.
LPWSTR is a typedef equivalent to WCHAR*, i.e. wchar_t*.
TCHAR is a placeholder for a character type that expands to either char or wchar_t, depending on whether you are in ANSI/MBCS or Unicode build mode.
Since VS2005, Visual Studio has been using Unicode builds as default.
So, unless you are maintaining an old legacy app that must use ANSI/MBCS, just use Unicode in modern Win32 applications. In this case, you can directly use wchar_t-based strings with Win32 APIs, without bothering with the old obsolete TCHAR-model.
Note that you can still have std::strings (which are char-based) in your code, e.g. to represent Unicode UTF-8 text. And you can convert between UTF-8 (char/std::string) and UTF-16 (wchar_t/std::wstring) at the Win32 API boundaries.
For that purpose, you can use some convenient RAII wrappers to raw Win32 MultiByteToWideChar() and WideCharToMultiByte() APIs.
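As a sketch of one such helper (the name Utf8ToUtf16 is mine, not a library API; error handling kept minimal):

#include <windows.h>
#include <stdexcept>
#include <string>

// Convert a UTF-8 std::string to a UTF-16 std::wstring.
std::wstring Utf8ToUtf16(const std::string& utf8)
{
    if (utf8.empty())
        return std::wstring();

    // First call: query the required length, in wchar_t units.
    const int len = ::MultiByteToWideChar(CP_UTF8, MB_ERR_INVALID_CHARS,
                                          utf8.data(), static_cast<int>(utf8.size()),
                                          nullptr, 0);
    if (len == 0)
        throw std::runtime_error("MultiByteToWideChar failed.");

    // Second call: do the actual conversion into the result buffer.
    std::wstring utf16(len, L'\0');
    ::MultiByteToWideChar(CP_UTF8, MB_ERR_INVALID_CHARS,
                          utf8.data(), static_cast<int>(utf8.size()),
                          &utf16[0], len);
    return utf16;
}

The reverse direction works the same way with WideCharToMultiByte().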
The right way to think about building a Windows application is to pretend that 8-bit strings do not exist. Otherwise, the encoding of your strings will vary with the user's language settings, and your app will never be "global ready", because there will always be some characters not representable by the user's current settings. 8-bit strings in Win32 are a legacy of the 1990s, and a good Win32 app uses PWSTR everywhere. Notice, for instance, that on Windows CE and WinRT the "A functions" don't even exist; that should give you a hint about how Microsoft feels about the issue.
Now, in practical terms, you may be interacting with non-Windows-specific code that uses 8-bit strings. IMO the best approach there is to say, by convention, that all such strings are UTF-8, and to use MultiByteToWideChar and WideCharToMultiByte to convert to and from PWSTR. Be sure to use CP_UTF8. For Windows-specific code, please do define the UNICODE and _UNICODE macros, forget that TCHAR, TSTR, the *A functions and other such accidents of history exist, and use PWSTR and WCHAR everywhere. Your code will be saner for it.
You can use ATL conversion macros:
cout << "\n\n" << PathStripToRoot(CA2T(targetFile.c_str())) << "\n\n";
If targetFile is an ANSI string, use PathStripToRootA. Then you do not need any conversion; targetFile.c_str() will work.

If targetFile is a UTF-8 string, use MultiByteToWideChar to convert it to wide characters, then use PathStripToRoot.

Otherwise, make targetFile a wstring and pass it to the API without any conversion.
Please go through
Unicode Programming Summary
There are several ways to solve this.
The right approach is to enclose the string literals in _T(). PathStripToRoot is defined as

#ifdef UNICODE
#define PathStripToRoot PathStripToRootW
#else
#define PathStripToRoot PathStripToRootA
#endif
_T and the Windows APIs follow the character-set setting you chose for your project under project settings. If you're taking the file name as a single-byte string (a std::string holding an ANSI string), use PathStripToRootA; that way you avoid converting the file name to a Unicode string in between.
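For illustration, a build-neutral call might look like this sketch (the function name StripExample and the path literal are mine, just for the example; _tcscpy_s comes from <tchar.h>):

#include <windows.h>
#include <shlwapi.h>
#include <tchar.h>
#pragma comment(lib, "Shlwapi.lib")

void StripExample()
{
    TCHAR pathBuf[MAX_PATH];
    // _T() makes the literal narrow or wide to match the build's character set
    _tcscpy_s(pathBuf, _countof(pathBuf), _T("C:\\Path1\\Path2"));
    if (::PathStripToRoot(pathBuf))
    {
        // pathBuf now holds _T("C:\\")
    }
}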
This has probably been asked before but I can't seem to find the solution:
std::string GetPath()
{
    char buffer[MAX_PATH];
    ::GetSystemDirectory(buffer, MAX_PATH);
    strcat(buffer, "\\version.dll");
    return std::string(buffer);
}
This returns an error stating:
argument of type "char *" is incompatible with parameter of type "LPWSTR"
So yeah. Anyone got an answer?
You need to use the ANSI version:

std::string GetPath()
{
    char buffer[MAX_PATH] = {};
    ::GetSystemDirectoryA(buffer, _countof(buffer)); // notice the A
    strcat(buffer, "\\version.dll");
    return std::string(buffer);
}
Or use Unicode:

std::wstring GetPath()
{
    wchar_t buffer[MAX_PATH] = {};
    ::GetSystemDirectoryW(buffer, _countof(buffer)); // notice the W, or drop the W to get it "by default"
    wcscat(buffer, L"\\version.dll");
    return std::wstring(buffer);
}
Rather than calling the A/W versions explicitly, you can drop the A/W and configure the whole project to use ANSI or Unicode instead. All this does is change some #defines so that foo is replaced by fooA or fooW.

Notice that you should also use _countof() to avoid passing incorrect sizes that depend on the buffer's element type.
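To illustrate the difference (a short sketch): sizeof counts bytes while _countof counts elements, and the two disagree as soon as the buffer is wide.

wchar_t buffer[MAX_PATH];
// sizeof(buffer)   == MAX_PATH * sizeof(wchar_t) -> wrong as a character count
// _countof(buffer) == MAX_PATH                   -> correct character count
::GetSystemDirectoryW(buffer, _countof(buffer));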
If you compile your code with multi-byte support it will compile correctly, but when you compile it with the Unicode flag it will give an error, because with Unicode support ::GetSystemDirectory resolves to ::GetSystemDirectoryW. Consider using TCHAR instead of char: TCHAR is defined such that it becomes char under the multi-byte flag and wchar_t under the Unicode flag.
TCHAR buffer[MAX_PATH];
::GetSystemDirectory(buffer, MAX_PATH);
_tcscat(buffer, _T("\\version.dll"));
You can use a typedef for string/wstring so that your code becomes independent of the build type:

#ifdef UNICODE
typedef wstring STRING;
#else
typedef string STRING;
#endif

STRING GetPath()
{
    TCHAR buffer[MAX_PATH];
    ::GetSystemDirectory(buffer, MAX_PATH);
    _tcscat(buffer, _T("\\version.dll"));
    return STRING(buffer);
}
I wrote code in VC++ 6.0 and imported it into VC++ 2005. I now get an ambiguity error with the Unicode insertion:
CString s;
s.Format("%f\r\n", (double)timebTime.time + (double)timebTime.millitm / 1000);
s+="RAMP,";
s+=0x00b5; // <-- Error: VC++(2005): "error C2593: 'operator +=' is ambiguous"
s+="m";
Note that VC++ 6.0's default compilation model is ANSI/MBCS (i.e. TCHAR is char, CString is a sequence of chars, etc.), while VC++ 2005's default compilation model is Unicode (i.e. TCHAR is wchar_t, and CString is actually a CStringW, i.e. a wchar_t string).
I'd just use the Unicode model (don't bother with ANSI/MBCS compatibility and TCHAR, _T("..."), etc.), and adjust your code like this:
static const wchar_t microSign = 0x00B5;

CString s;
s.Format(L"%f\r\n",
         static_cast<double>(timebTime.time) +
         static_cast<double>(timebTime.millitm) / 1000.0);
s += L"RAMP,";
s += microSign;
s += L"m";
Note that the use of a named constant (like microSign) makes the code more readable than a "magic number" like 0x00B5.
Moreover, I'd store the format string (including the "RAMP,µm" part) in the app resources and load it from there instead of building it in the source code.
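A sketch of that resource-based approach (IDS_RAMP_FORMAT is a hypothetical string-table ID, not something from the original code):

// Load the format string, e.g. "%f\r\nRAMP,µm", from the string table
CString fmt;
fmt.LoadString(IDS_RAMP_FORMAT); // hypothetical resource ID

CString s;
s.Format(fmt, static_cast<double>(timebTime.time) +
              static_cast<double>(timebTime.millitm) / 1000.0);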
I have an issue with GetCurrentDirectory(), and I don't really understand why. The thing I don't understand is that it works on XP but not on Seven (or at least not on my computer). Here is my code:
char dir_name[1024]; // a global variable

int get_files() {
    // ...
    DWORD dwRet;
    dwRet = GetCurrentDirectory(MAX_PATH, dir_name);
    printf("%s\n", dir_name);
    printf("%d\n", dwRet);
    // ...
}
This code prints:

printf("%s\n", dir_name); -> "c"
printf("%d\n", dwRet); -> 42 (which is the right length of the string that should be returned)

I don't understand why dir_name only gets the value "c".
I think the result is Unicode on Windows Seven! After each ASCII character returned by this function there is a zero byte, and you are printing the buffer with printf. You should use the wide-char functions in your program, like wprintf.
Try the code below (tested with Visual Studio 2008 + Windows 7):

#include <stdio.h>
#include <windows.h>
#include <wchar.h>

WCHAR dir_name[1024]; // a global variable

int get_files()
{
    // ...
    DWORD dwRet;
    dwRet = GetCurrentDirectory(MAX_PATH, dir_name);
    wprintf(L"%s\n", dir_name);
    printf("%d\n", dwRet);
    // ...
    return 0;
}
I'm not sure, but could it be that GetCurrentDirectory() returns 2-byte chars under Win7? In that case you'll be getting a 0 in every second byte of the returned char array, so you should use a wide-char-aware version of the printf() function, such as wprintf(). I also wonder why the compiler didn't warn you that something was wrong with the types.
What compiler are you using? Under Visual C++ 2005, GetCurrentDirectory is a macro that resolves to GetCurrentDirectoryW if the UNICODE macro is defined and to GetCurrentDirectoryA otherwise. Do you have UNICODE defined by any chance?
#include <windows.h>
#include <iostream>
using namespace std;

int main() {
    char* file = "d:/tester";
    WIN32_FIND_DATA FindFileData;
    HANDLE hFind;
    hFind = FindFirstFile(file, &FindFileData); // error on this line: argument of type char* is incompatible with parameter of type LPCWSTR
}
I can't understand the error. What is it and how can I solve it? I am making a console app and need to check whether files are present in the directory.
The type LPCWSTR is a pointer to a const wide-char string, while the file in char* file = "d:/tester"; is a pointer to ordinary chars. An ordinary char usually takes 1 byte of memory, while a wide char usually takes 2 bytes. What will happen if the file name contains Cyrillic or Japanese letters? You will not be able to open it without specifying the encoding. The Windows API accepts wide chars in the FindFirstFile function because the file name can be Unicode. If you write L"foo_bar", the compiler interprets it as a wide-character string. Therefore you can write const wchar_t* file = L"d:\\tester"; to match the parameter type, and compilation will succeed.
You are calling a function that expects a wide-character string (FindFirstFileW). Either change file to const wchar_t* file = L"d:\\tester"; or use the ANSI version of the function, FindFirstFileA.
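As a rough sketch of the wide-character route (note the added wildcard, since FindFirstFileW enumerates matches rather than listing a directory by its bare name):

#include <windows.h>
#include <iostream>

int main() {
    const wchar_t* pattern = L"d:\\tester\\*"; // wildcard to match the directory's contents
    WIN32_FIND_DATAW findData;
    HANDLE hFind = ::FindFirstFileW(pattern, &findData);
    if (hFind == INVALID_HANDLE_VALUE) {
        std::wcout << L"No files found\n";
        return 1;
    }
    do {
        std::wcout << findData.cFileName << L'\n';
    } while (::FindNextFileW(hFind, &findData));
    ::FindClose(hFind);
    return 0;
}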
You are compiling with UNICODE defined and yet passing an ANSI string as the first parameter. Replace the line that begins with char* with

TCHAR *file = TEXT("d:\\tester");

and things should be fine.

Martyn