Converting string to LPWSTR [duplicate] - c++

This question already has answers here:
How to convert std::string to LPCWSTR in C++ (Unicode)
(6 answers)
Closed 8 years ago.
I'm having a hard time converting a string into LPWSTR so I can use the PathStripToRoot() function.
Well, for one, the MSDN documentation says I need an LPTSTR variable (http://msdn.microsoft.com/en-us/library/windows/desktop/bb773757(v=vs.85).aspx), but Visual Studio 2013 says I need LPWSTR.
Here is a code snippet of my function:
fileStat fileCreate(const string& targetFile)
{
    fileStat filez;
    fstream file(targetFile.c_str());
    if (!file)
    {
        cout << "File does not exist" << endl;
    }
    std::ifstream in(targetFile, ios::binary | ios::ate);
    int a = in.tellg();
    cout << "File size(bytes): " << in.tellg() << endl << endl;
    file.close();
    wstring stemp = strChange(targetFile);
    LPCWSTR result = stemp.c_str();
    /* Tried the below code but that did not work
    LPWSTR ws = new wchar_t[targetFile.size() + 1];
    copy(targetFile.begin(), targetFile.end(), ws);
    ws[targetFile.size()] = 0;
    */
    cout << "\n\n" << PathStripToRoot(ws) << "\n\n";
    ...
    filez.fileSize = a;
    return filez;
}
A lot of people have said to use the MultiByteToWideChar() function, but I looked at the MSDN documentation and have no idea how it works. Is there an easier way than using MultiByteToWideChar()?

You may want to use Unicode UTF-16 strings in modern Windows applications when dealing with Win32 APIs: the std::wstring class (based on wchar_t) is OK for that with Visual C++.
Then, you can wrap the Win32 C API PathStripToRoot() in some C++ code, using convenient string classes instead of raw C-like string buffers.
Consider the following commented code as an example:
// Set Unicode mode
#define UNICODE
#define _UNICODE
// Windows SDK Headers
#include <Windows.h> // Win32 Platform SDK
#include <Shlwapi.h> // For PathStripToRoot()
#include <Strsafe.h> // For StringCchCopy()
// Standard C++ Headers
#include <exception> // For std::exception
#include <iostream> // For console output
#include <stdexcept> // For std::invalid_argument, std::runtime_error
#include <string> // For std::wstring
// For using PathStripToRoot()
#pragma comment(lib, "Shlwapi.lib")
// C++ wrapper around PathStripToRoot() Win32 API
std::wstring RootFromPath(const std::wstring& path)
{
    // Buffer for PathStripToRoot()
    wchar_t pathBuffer[MAX_PATH];

    // Copy the input string into the buffer.
    // Beware of buffer overruns!
    HRESULT hr = ::StringCchCopy(pathBuffer,            // dest
                                 _countof(pathBuffer),  // dest size
                                 path.c_str());         // source
    if (hr == STRSAFE_E_INSUFFICIENT_BUFFER)
    {
        // Copy failed due to insufficient buffer space.
        // May accept this case or throw an exception
        // based on the context...
        // In this case, I just throw here.
        throw std::invalid_argument("RootFromPath() - Path string too long.");
    }
    if (hr != S_OK)
    {
        throw std::runtime_error("RootFromPath() - StringCchCopy failed.");
    }

    // Call the Win32 C API using the raw C buffer
    if (! ::PathStripToRoot(pathBuffer))
    {
        // No valid drive letter was found.
        // Return an empty string.
        return std::wstring();
    }

    // Return a std::wstring with the buffer content
    return std::wstring(pathBuffer);
}

// Test
int main()
{
    try
    {
        const std::wstring path = L"C:\\Path1\\Path2";
        const std::wstring root = RootFromPath(path);

        std::wcout << "The content of the path before is:\t" << path << std::endl;
        std::wcout << "RootFromPath() returned: \t" << root << std::endl;
    }
    catch (const std::exception& ex)
    {
        std::cerr << "\n*** ERROR: " << ex.what() << std::endl;
    }
}
Compiled from command line:
C:\Temp\CppTests>cl /EHsc /W4 /nologo TestPathStripToRoot.cpp
Output:
C:\Temp\CppTests>TestPathStripToRoot.exe
The content of the path before is: C:\Path1\Path2
RootFromPath() returned: C:\
On that particular point of your question:
Well, for one, the MSDN documentation says I need an LPTSTR variable, but
Visual Studio says I need LPWSTR.
LPTSTR is a typedef equivalent to TCHAR*.
LPWSTR is a typedef equivalent to WCHAR*, i.e. wchar_t*.
TCHAR is a placeholder for a character type that expands to char or wchar_t, depending on whether you build in ANSI/MBCS or Unicode mode.
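A simplified sketch of how these typedefs relate (paraphrasing the SDK headers, not quoting them literally):

typedef wchar_t WCHAR;
typedef WCHAR* LPWSTR;          // wide (UTF-16) string
typedef const WCHAR* LPCWSTR;   // const wide string
#ifdef UNICODE
typedef WCHAR TCHAR;            // Unicode build: TCHAR is wchar_t
#else
typedef char TCHAR;             // ANSI/MBCS build: TCHAR is char
#endif
typedef TCHAR* LPTSTR;          // "T" string: follows the build mode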
Since VS2005, Visual Studio has been using Unicode builds as default.
So, unless you are maintaining an old legacy app that must use ANSI/MBCS, just use Unicode in modern Win32 applications. In this case, you can directly use wchar_t-based strings with Win32 APIs, without bothering with the old obsolete TCHAR-model.
Note that you can still have std::strings (which are char-based) in your code, e.g. to represent Unicode UTF-8 text. And you can convert between UTF-8 (char/std::string) and UTF-16 (wchar_t/std::wstring) at the Win32 API boundaries.
For that purpose, you can use some convenient RAII wrappers to raw Win32 MultiByteToWideChar() and WideCharToMultiByte() APIs.
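As a minimal sketch of such helpers (plain functions rather than full RAII classes, with error handling reduced to exceptions; the names Utf8ToUtf16/Utf16ToUtf8 are just illustrative):

#include <Windows.h>
#include <stdexcept>
#include <string>

// Convert a UTF-8 std::string to a UTF-16 std::wstring.
std::wstring Utf8ToUtf16(const std::string& utf8)
{
    if (utf8.empty()) return std::wstring();
    const int len = ::MultiByteToWideChar(CP_UTF8, 0,
        utf8.data(), static_cast<int>(utf8.size()), nullptr, 0);
    if (len == 0) throw std::runtime_error("MultiByteToWideChar failed.");
    std::wstring utf16(len, L'\0');
    ::MultiByteToWideChar(CP_UTF8, 0,
        utf8.data(), static_cast<int>(utf8.size()), &utf16[0], len);
    return utf16;
}

// Convert a UTF-16 std::wstring to a UTF-8 std::string.
std::string Utf16ToUtf8(const std::wstring& utf16)
{
    if (utf16.empty()) return std::string();
    const int len = ::WideCharToMultiByte(CP_UTF8, 0,
        utf16.data(), static_cast<int>(utf16.size()),
        nullptr, 0, nullptr, nullptr);
    if (len == 0) throw std::runtime_error("WideCharToMultiByte failed.");
    std::string utf8(len, '\0');
    ::WideCharToMultiByte(CP_UTF8, 0,
        utf16.data(), static_cast<int>(utf16.size()),
        &utf8[0], len, nullptr, nullptr);
    return utf8;
}

The two-call pattern (first call with a null output buffer to query the required length, second call to do the actual conversion) is the standard way to size the destination string.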

The right way to think about building a Windows application is to pretend that 8-bit strings do not exist. Otherwise, the encoding of your string will vary based on the user's language settings, and your app will not be "global ready", because there will always be some characters not representable by the user's current settings. 8-bit strings in Win32 are a legacy of the 1990s, and a good Win32 app uses PWSTR everywhere. Notice, for instance, that on Windows CE or WinRT the "A functions" don't even exist; that should give you some hint about how Microsoft feels about the issue.
Now, in practical terms, you may be interacting with non-Windows-specific code that uses 8-bit strings. IMO the best approach for that is to say, by convention, that all such strings are UTF-8, and to use MultiByteToWideChar and WideCharToMultiByte to convert to and from PWSTR. Be sure to use CP_UTF8. For Windows-specific code, please do define the UNICODE and _UNICODE macros, forget that TCHAR, TSTR, the *A functions and other such accidents of history exist, and use PWSTR and WCHAR everywhere. Your code will be saner for it.

You can use ATL conversion macros:
cout<<"\n\n"<<PathStripToRoot(CA2T(targetFile.c_str()))<<"\n\n";
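Note that PathStripToRoot() modifies its buffer in place and returns a BOOL, so streaming the call's return value prints 1 or 0 rather than the stripped path. A sketch that prints the buffer instead (assuming an ATL-enabled project; CA2T comes from <atlconv.h>):

#include <atlconv.h> // ATL conversion classes (CA2T, CA2W, ...)
#include <tchar.h>

// Keep the converter as a named object so its internal buffer stays
// alive while PathStripToRoot() modifies it in place.
CA2T widePath(targetFile.c_str());
if (::PathStripToRoot(static_cast<LPTSTR>(widePath)))
{
    _tprintf(_T("%s\n"), static_cast<LPTSTR>(widePath));
}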

If targetFile is an ASCII string, use PathStripToRootA. Here you do not need any conversion; targetFile.c_str() will work.
If targetFile is a UTF-8 string, use MultiByteToWideChar to convert it to wide characters, then use PathStripToRoot.
Otherwise, make targetFile a wstring and pass it to the API without any conversion.

Please go through
Unicode Programming Summary
There are several ways to solve this.
The right approach is to enclose the string definitions with _T(). PathStripToRoot is defined as
#ifdef UNICODE
#define PathStripToRoot PathStripToRootW
#else
#define PathStripToRoot PathStripToRootA
#endif
_T() and the Windows API names follow the character-set setting you chose for your project under project settings. If you take the file name as a single-byte string (a std::string holding an ANSI string), use PathStripToRootA; that way you can avoid converting the file name to a Unicode string in between.
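A hedged sketch of that approach (the helper name RootFromPathA is just illustrative; PathStripToRootA modifies its buffer in place, so the input is copied into a writable char array first):

#include <Shlwapi.h>
#include <string>
#pragma comment(lib, "Shlwapi.lib")

std::string RootFromPathA(const std::string& path)
{
    char buffer[MAX_PATH] = {};
    ::strncpy_s(buffer, path.c_str(), _TRUNCATE); // truncates overly long paths
    if (!::PathStripToRootA(buffer))
        return std::string(); // no drive letter or UNC root found
    return std::string(buffer);
}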

Why does LoadLibrary fail whilst LoadLibraryA succeeds in loading a DLL? [duplicate]

This question already has answers here:
Cast to LPCWSTR?
(4 answers)
Closed 3 years ago.
I'm trying to load a DLL into C++ but was getting error code 126, which I think means the DLL couldn't be found. After some poking around I changed LoadLibrary to LoadLibraryA and suddenly it worked. However, I am at a complete loss as to why. I realise that I haven't provided the DLL for this code to be runnable, but I would be grateful if somebody could provide an explanation as to why this is happening. And perhaps an example of how to get LoadLibrary working.
Broken version
#include <stdio.h>
#include <windows.h>

typedef char* (*gf_getCurrentLibraryVersion) ();

int main() {
    gf_getCurrentLibraryVersion getVersion;
    HINSTANCE hLib = LoadLibrary((LPCWSTR)"libsbnw.dll");
    if (hLib) {
        getVersion = (gf_getCurrentLibraryVersion)GetProcAddress(hLib, "gf_getCurrentLibraryVersion");
        printf("Version = %s\n", getVersion());
    }
    else {
        printf("Error loading dll: %d/n", GetLastError());
    }
    printf("Hit any key to continue\n");
    getchar();
    return 0;
}
Compiles and outputs
Error loading dll: 126/nHit any key to continue
to console
Working version
#include <stdio.h>
#include <windows.h>

typedef char* (*gf_getCurrentLibraryVersion) ();

int main() {
    gf_getCurrentLibraryVersion getVersion;
    HINSTANCE hLib = LoadLibraryA("libsbnw.dll");
    if (hLib) {
        getVersion = (gf_getCurrentLibraryVersion)GetProcAddress(hLib, "gf_getCurrentLibraryVersion");
        printf("Version = %s\n", getVersion());
    }
    else {
        printf("Error loading dll: %d/n", GetLastError());
    }
    printf("Hit any key to continue\n");
    getchar();
    return 0;
}
Compiles and outputs
version is: 1.3.4
The problem with your LoadLibrary((LPCWSTR)"libsbnw.dll") call is that your build environment converts that to a LoadLibraryW call, but the way you are trying to pass a wide-character string is wrong.
As you have it, you are simply casting a const char* pointer to a const wchar_t* pointer, which won't work (for example, it will interpret the initial "li" characters as a single 16-bit character).
What you need to do is specify the string literal as a wide character constant, using the L prefix:
HINSTANCE hLib = LoadLibrary(L"libsbnw.dll");
Or, alternatively, using the TEXT() macro (which will boil down to the same, when using the UNICODE build environment):
HINSTANCE hLib = LoadLibrary(TEXT("libsbnw.dll"));
Feel free to ask for further explanation and/or clarification.
Ordinarily the compiler will try to point out when you're making a mistake. But in this case you've told it not to by adding an explicit cast to the string.
HINSTANCE hLib = LoadLibrary((LPCWSTR)"libsbnw.dll");
//^^^^^^^^^
I'm assuming you've built your app with Unicode enabled, which defines a macro converting LoadLibrary to LoadLibraryW. The parameter must be a wide-character string.
HINSTANCE hLib = LoadLibraryW(L"libsbnw.dll");
There's another macro you can use when you're not sure whether the app will be compiled with Unicode or not: TEXT(), or the shorter form _T(). It's not recommended for modern code, since needing to turn Unicode on or off hasn't been a problem in many years; just use Unicode always.
HINSTANCE hLib = LoadLibrary(TEXT("libsbnw.dll"));

Can't convert Char to LPWSTR [duplicate]

This question already has answers here:
Argument of type "char *" is incompatible with parameter of type "LPWSTR"
(2 answers)
Closed 6 years ago.
The following code
string exePath() {
    string path;
    char buffer[MAX_PATH];
    cout << "reading path\n";
    GetModuleFileName(NULL, buffer, MAX_PATH);
    string::size_type pos = string(buffer).find_last_of("\\/");
    path = string(buffer).substr(0, pos)/*+"\\system.exe"*/;
    return path;
}
gives me an error at the second parameter (buffer) in Visual Studio:
the type "char *" argument is incompatible with the type parameter
"LPWSTR"
(translated from Italian; I have VS in Italian, hope you understand)
and
can't convert Char to LPWSTR in the 2 argument
This code compiles fine with Code::Blocks and Dev-C++, but in VS it doesn't.
Because GetModuleFileName is a macro that resolves to GetModuleFileNameW when UNICODE is defined, and that function requires a wchar_t*. When UNICODE is not defined, GetModuleFileName resolves to GetModuleFileNameA instead. The simple fix for you is to use GetModuleFileNameA explicitly, which accepts a char*.
This code compiles fine with Code::Blocks and Dev-C++, but in VS it doesn't.
Most probably that is because Code::Blocks compiles with UNICODE not defined, while under VS, UNICODE is defined by default.
If you are not using UNICODE and want to stay with multibyte strings (same as in Code::Blocks), you can disable UNICODE in your Visual Studio build by changing, in the project properties on the General tab, Character Set to Use Multi-Byte Character Set.
[edit]
To avoid problems with Unicode characters in the path, it is recommended to use UNICODE. This requires you to use std::wstring, which is the basic_string specialization for wchar_t; likewise, instead of char buffer[MAX_PATH]; you should use wchar_t buffer[MAX_PATH];, and cout << "reading path\n"; would have to become wcout << L"reading path\n";. You can also use macros like TCHAR and _T("reading path") to make the code work for both UNICODE and non-UNICODE builds.
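A minimal sketch of the question's exePath() rewritten along those lines (calling GetModuleFileNameW explicitly to make the wide-character intent obvious):

#include <Windows.h>
#include <iostream>
#include <string>

std::wstring exePath()
{
    wchar_t buffer[MAX_PATH] = {};
    std::wcout << L"reading path\n";
    ::GetModuleFileNameW(NULL, buffer, MAX_PATH);
    std::wstring path(buffer);
    std::wstring::size_type pos = path.find_last_of(L"\\/");
    return path.substr(0, pos);
}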
I believe this is related: How can I print wchar_t values to the console?. You must also prefix your strings with L to make them wide-char, like this:
std::wcout << L"reading path" << std::endl;
This will work well if your application is for Windows only; otherwise you should use the TEXT("reading path") macro style, because even though it's not there by default on other platforms, you can add it easily.

Argument of type "char *" is incompatible with parameter of type "LPWSTR"

This has probably been asked before but I can't seem to find the solution:
std::string GetPath()
{
    char buffer[MAX_PATH];
    ::GetSystemDirectory(buffer, MAX_PATH);
    strcat(buffer, "\\version.dll");
    return std::string(buffer);
}
This returns an error stating:
argument of type "char *" is incompatible with parameter of type "LPWSTR"
So yeah. Anyone got an answer?
You need to use the ANSI version:
std::string GetPath()
{
    char buffer[MAX_PATH] = {};
    ::GetSystemDirectoryA(buffer, _countof(buffer)); // notice the A
    strcat(buffer, "\\version.dll");
    return std::string(buffer);
}
Or use Unicode:
std::wstring GetPath()
{
    wchar_t buffer[MAX_PATH] = {};
    ::GetSystemDirectoryW(buffer, _countof(buffer)); // notice the W, or drop the W to get it "by default"
    wcscat(buffer, L"\\version.dll");
    return std::wstring(buffer);
}
Rather than calling the A/W versions explicitly, you can drop the A/W and configure the whole project to use ANSI or Unicode instead. All this does is change some #defines so that foo is replaced by fooA or fooW.
Notice that you should use _countof() to avoid passing incorrect sizes that depend on the buffer's element type.
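A quick illustration of the difference (a sketch, not project-specific code):

wchar_t buffer[MAX_PATH];
// sizeof(buffer)   == MAX_PATH * sizeof(wchar_t) -- a byte count, wrong here
// _countof(buffer) == MAX_PATH                   -- the element count, correct
::GetSystemDirectoryW(buffer, _countof(buffer));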
If you compile your code with multi-byte support it will compile correctly, but when you compile it with the Unicode flag it will give an error, because with Unicode support ::GetSystemDirectory resolves to ::GetSystemDirectoryW. Consider using TCHAR instead of char: TCHAR is defined such that it becomes char with the multi-byte flag and wchar_t with the Unicode flag.
TCHAR buffer[MAX_PATH];
::GetSystemDirectory(buffer,MAX_PATH);
_tcscat(buffer,_T("\\version.dll"));
You can use a typedef for string/wstring so your code becomes character-set independent:
#ifdef UNICODE
typedef wstring STRING;
#else
typedef string STRING;
#endif

STRING GetPath()
{
    TCHAR buffer[MAX_PATH];
    ::GetSystemDirectory(buffer, MAX_PATH);
    _tcscat(buffer, _T("\\version.dll"));
    return STRING(buffer);
}

GetCurrentDirectory buffer doesn't return the right value

I have an issue with GetCurrentDirectory(), and I don't really understand why. The thing I don't understand is that it works on XP but not on Seven (or at least not on my computer). Here is my code:
char dir_name[1024]; // as a global variable

int get_files() {
    // ...
    DWORD dwRet;
    dwRet = GetCurrentDirectory(MAX_PATH, dir_name);
    printf("%s\n", dir_name);
    printf("%d\n", dwRet);
    //...
}
This code will return:
printf("%s\n",dir_name); -> return "c"
printf("%d\n",dwRet); -> 42 (which is the right length of the string that should be returned)
I don't understand why dir_name only takes the value "c".
I think the result is Unicode on Windows Seven, so after each ASCII character there is a zero byte, and you are printing the buffer with printf. You should use wide-char functions in your program, like wprintf.
Try the code below (tested with Visual Studio 2008 + Windows 7):
#include <stdio.h>
#include <windows.h>
#include <wchar.h>

WCHAR dir_name[1024]; // as a global variable

int get_files()
{
    // ...
    DWORD dwRet;
    dwRet = GetCurrentDirectory(MAX_PATH, dir_name);
    wprintf(L"%s\n", dir_name);
    printf("%d\n", dwRet);
    //...
    return 0;
}
I'm not sure, but could it be that GetCurrentDirectory() returns 2-byte chars under Win7?
In that case you'll be getting a 0 in every second byte of the returned char array.
So you should use a wide-char-aware version of the printf() function, such as wprintf().
Also, I wonder whether the compiler warned you about something being wrong regarding the types.
What compiler are you using? Under Visual C++ 2005, GetCurrentDirectory is a macro that resolves to GetCurrentDirectoryW if the UNICODE macro is defined and to GetCurrentDirectoryA otherwise. Do you have UNICODE defined by any chance?

ReadFile Win32 API

I want to read a file, but when I debug my program it runs, then a pop-up appears saying "System Programming has stopped working", and the console says "Press enter to close the program". My code is:
// System Programming.cpp : Defines the entry point for the console application.
//
#include "stdafx.h"
#include "iostream"
using namespace std;

int _tmain(int argc, _TCHAR* argv[])
{
    HANDLE hin;
    HANDLE hout;
    TCHAR buff[20] = {'q','2','3'};
    TCHAR buff2[20] = {'a','v'};

    hin = CreateFile(_T("Abid.txt"), GENERIC_WRITE, 0, NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
    if (hin == INVALID_HANDLE_VALUE)
    {
        cout << "error";
    }
    WriteFile(hin, buff, 40, 0, NULL);
    CloseHandle(hin);

    hout = CreateFile(_T("Abid.txt"), GENERIC_READ, 0, NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
    if (hout == INVALID_HANDLE_VALUE)
    {
        cout << "error";
    }
    ReadFile(hout, buff2, 40, 0, NULL);
    CloseHandle(hout);

    return 0;
}
According to MSDN, the lpNumberOfBytesWritten parameter can be NULL only when the lpOverlapped parameter is not NULL, so the calls should be:
DWORD nWritten;
WriteFile(hin, buff, 40, &nWritten, NULL);
and
DWORD nRead;
ReadFile(hout, buff2, 40, &nRead, NULL);
Also, rename hin and hout: you are using hin for the handle you write to and hout for the handle you read from, which is backwards.
Others have already answered your question. This is about the code.
// Your code:
// System Programming.cpp : Defines the entry point for the console application.
//
Just remove that comment. It isn't true. :-) The entry point for your program is where the machine code starts executing, and with the Microsoft toolchain it's specified by the /entry linker option.
Note that Microsoft's documentation is generally confused about entry points, e.g. it has always, one way or other, documented incorrect signature for entry point.
It's one of the most infamous Microsoft documentation errors, and, given that it's persisted, in various forms, for 15 years, I think it says something (not sure exactly what, though).
// Your code:
#include "stdafx.h"
You don't need this automatically generated header. Instead use <windows.h>. A minimal way to include <windows.h> for your program would be
#undef UNICODE
#define UNICODE
#include <windows.h>
For C++ in general you'll also want to make sure that STRICT and NOMINMAX are defined before including <windows.h>. With modern tools at least STRICT is defined by default, but it doesn't hurt to make sure. Without it, some of the declarations won't compile with a C++ compiler, at least not without reinterpret casts, e.g. dialog procedures.
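For instance, one reasonable prologue (a sketch; the exact macro arrangement is a matter of taste):

// Unicode on, strict handle types, and no min/max macros
// colliding with std::min/std::max.
#define STRICT      // distinct C++ types for HWND, HDC, etc.
#define NOMINMAX    // keep <windows.h> from defining min()/max() macros
#undef  UNICODE
#define UNICODE
#include <windows.h>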
// Your code:
#include "iostream"
using namespace std;
Almost OK.
Do this:
#include <iostream>
using namespace std;
The difference is where the compiler searches for headers. With quoted name it searches in some additional places first (and that's all that the standard has to say about it). With most compilers those additional places include the directory of the including file.
// Your code:
int _tmain(int argc, _TCHAR* argv[])
Oh no! Don't do this. It's a Microsoft "feature" that helps support Windows 9.x. And it's only relevant when you're using MFC linked dynamically and you're targeting Windows 9.x; without MFC in the picture you'd just use the Microsoft Unicode layer.
Are you really targeting Windows 9.x with an app using dynamically linked MFC?
Instead, do ...
int main()
... which is standard, or use the Microsoft language extension ...
int wmain( int argc, wchar_t* argv[] )
... if you want to handle command line arguments the "easy" way.
// Your code:
{
    HANDLE hin;
    HANDLE hout;
    TCHAR buff[20]= {'q','2','3'};
    TCHAR buff2[20]={'a','v'};
The TCHAR stuff is just more of that MFC in Windows 9.x support stuff.
Apart from being totally unnecessary (presumably, you're not really targeting Windows 9.x, are you?), it hides your intention and hurts the eyes.
Did you mean ...
char buff[20] = {'q', '2', '3'};
... perhaps?
// Your code:
hin = CreateFile(_T("Abid.txt"), GENERIC_WRITE, 0, NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
if (hin == INVALID_HANDLE_VALUE)
{
    cout << "error";
}
As others have mentioned, OPEN_EXISTING isn't logical when you're creating the file, and the count pointer argument can't be 0 for your usage.
When using <windows.h>, with UNICODE defined as it should be, the filename argument should be specified as L"Abid.txt".
Cheers & hth.,
The problem is that you're passing a NULL pointer in for the lpNumberOfBytesWritten/lpNumberOfBytesRead parameter. While this is an optional parameter, there's a condition:
This parameter can be NULL only when the lpOverlapped parameter is not NULL
Also, you may have the size of your buffers wrong:
WriteFile(hin,buff,40,0,NULL); // says that buff has 40 bytes
ReadFile(hout,buff2,40,0,NULL); // says that buff2 has 40 bytes
But if you're compiling for ANSI instead of UNICODE, these will only be 20 bytes in size.
You should probably use sizeof(buff) and sizeof(buff2) instead.
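Putting those two points together, the fixed calls might look like this (a sketch reusing the question's variable names):

DWORD nWritten = 0;
WriteFile(hin, buff, sizeof(buff), &nWritten, NULL);  // byte count taken from the array

DWORD nRead = 0;
ReadFile(hout, buff2, sizeof(buff2), &nRead, NULL);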
Assuming your initial code attempts to create the file as a new file, you cannot use OPEN_EXISTING; you have to use OPEN_ALWAYS (or some other creation disposition) on that call.
The OPEN_EXISTING usage for the read-back will be OK.
By the way, once this is fixed, the WriteFile call causes an access violation, as you are trying to write more bytes than your array contains.
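Putting all of the fixes above together, a minimal corrected sketch of the program (assuming a Unicode build and keeping the original file name) could look like this:

#include <windows.h>
#include <iostream>

int main()
{
    wchar_t buff[20] = L"q23";
    wchar_t buff2[20] = {};

    // Create (or overwrite) the file for writing.
    HANDLE hOut = ::CreateFileW(L"Abid.txt", GENERIC_WRITE, 0, NULL,
                                CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, 0);
    if (hOut == INVALID_HANDLE_VALUE) { std::cout << "error"; return 1; }
    DWORD nWritten = 0;
    ::WriteFile(hOut, buff, sizeof(buff), &nWritten, NULL);
    ::CloseHandle(hOut);

    // Re-open the existing file for reading.
    HANDLE hIn = ::CreateFileW(L"Abid.txt", GENERIC_READ, 0, NULL,
                               OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
    if (hIn == INVALID_HANDLE_VALUE) { std::cout << "error"; return 1; }
    DWORD nRead = 0;
    ::ReadFile(hIn, buff2, sizeof(buff2), &nRead, NULL);
    ::CloseHandle(hIn);

    return 0;
}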