I am trying to use the PasswordFilter function, and I need to get the value of the Password parameter, which is a PUNICODE_STRING, then use regex_match to check it against a password policy.
The problem is that regex_match cannot accept the PUNICODE_STRING directly.
What can I do?
Strings stored as an LSA_UNICODE_STRING (or one of its typedefs) might not be null-terminated, so passing the Buffer pointer to a function that expects a null-terminated string (or a std::wstring) is not guaranteed to be safe.
Instead, convert it to a std::wstring using the Length field to specify the length of the string:
PUNICODE_STRING pStringIn; // this comes from somewhere
// Length is in bytes, not characters, so divide by sizeof(wchar_t)
std::wstring strOut(pStringIn->Buffer, pStringIn->Length / sizeof(wchar_t));
You can then use strOut.c_str() or pass it directly to functions that accept a std::wstring.
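For example, a minimal sketch of a password-filter check; the regex here is a hypothetical placeholder policy, not one from the question:

#include <windows.h>
#include <subauth.h>   // UNICODE_STRING used by password filters
#include <regex>
#include <string>

BOOLEAN matchesPolicy(PUNICODE_STRING Password)
{
    // Length is in bytes, so divide by sizeof(wchar_t) to get the
    // character count; the buffer may not be null-terminated.
    std::wstring pwd(Password->Buffer, Password->Length / sizeof(wchar_t));

    // Hypothetical policy: at least 8 characters including one digit.
    static const std::wregex policy(L"(?=.*\\d).{8,}");
    return std::regex_match(pwd, policy) ? TRUE : FALSE;
}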
ifstream inFile;
inFile.open("C:/FilePathThatWorks");
The above seems to work in C++, but if I try to take a std::string, or CString, or anything else, and plug it in as inFile.open(ExampleString), it fails at compile time (error at the bottom of this question). The question is not about my code, but about how to make C++ accept a variable for inFile.open.
If it only ever accepts char* variables, then is there perhaps a method that takes input from the user in the form of a char*, with the null terminator and everything?
error: no matching function for call to 'std::basic_ifstream<char>::open(std::__cxx11::string&)'
Before C++11, the open method of fstream does not accept std::string; you need to pass a const char*.
So just call c_str() on the string. (Since C++11 there is also an overload that takes a const std::string& directly.)
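For example, assuming ExampleString is a std::string:

#include <fstream>
#include <string>

std::string ExampleString = "C:/FilePathThatWorks";
std::ifstream inFile;
inFile.open(ExampleString.c_str()); // const char* is accepted by every standard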
ifstream requires a const char* parameter for the filename (see http://www.cplusplus.com/reference/fstream/ifstream/ifstream/).
Most string classes have some way to convert to this kind of string (e.g. c_str()).
If you want to use CString:
CString path = "c:\\FilePathThatWorks";
ifstream inFile;
inFile.open(static_cast<const char*>(path));
The cast is not needed here.
CString can represent two different string types depending on the project settings: if it stores char, the code should compile; if the character type is wchar_t, compilation will fail.
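In a Unicode build, one workaround is an explicit narrowing conversion. A minimal sketch using the ATL/MFC conversion macro CT2A, assuming an MFC/ATL project where atlconv.h is available:

#include <fstream>
#include <atlconv.h>

CString path = _T("c:\\FilePathThatWorks");
std::ifstream inFile;
inFile.open(CT2A(path)); // converts the TCHAR string to char* for open()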
I have moved my strings to resources, and luckily CString has a constructor taking an LPCTSTR, so I can instantiate strings conveniently like:
CString str( (LPCSTR) IDS_MY_STRING);
Now I want to do a similar cast with MessageBox() so it loads the string from resources as well, so I tried this:
MessageBox( hWnd, (LPCTSTR) IDS_MY_STRING ,"Error", MB_RETRYCANCEL);
But this doesn't work: it compiles but crashes at run time. The following, however, does work:
MessageBox( hWnd, (CString) (LPCTSTR) IDS_MY_STRING ,"Error", MB_RETRYCANCEL);
My question is: MessageBox() takes LPCTSTR as its 2nd parameter anyway, so why do we have to cast additionally from LPCTSTR to CString to make this work?
Your IDS_MY_STRING isn't really a pointer to a string. It's an integer. (If it were a string pointer, you'd never need the LPCTSTR cast in the first place.) CString knows how to load resource strings from integral resource IDs.
MessageBox doesn't; it requires a real character pointer, which CString provides implicitly.
The real question (or at least the interesting part of the answer) is less about how the second fails, and more about how the first works.
The first works because CString's constructor that takes an LPCTSTR actually looks at the value to figure out whether it's really a pointer to a string or the identifier of a string resource. In the latter case, it automatically loads the string resource and creates a CString with the same content. In other words, you're getting an implicit conversion from string identifier to CString.
CString also supports an implicit conversion to LPCTSTR (that is, LPCSTR or LPCWSTR, depending on the build).
C++, however, will only do one user-defined implicit conversion to get from whatever type is passed to whatever type is needed for an expression. In this case, to get from a string ID to a LPCTSTR, you'd need two -- one from string ID to CString, and another from CString to LPCTSTR. The compiler won't do that for you automatically.
Therefore, to get from a string ID to an LPCTSTR, you need to make the first conversion explicit: you cast the string ID to LPCTSTR and construct a CString from that, using CString's constructor that takes an LPCTSTR. Then the compiler will automatically convert the CString to a (real) LPCTSTR for you.
Others have explained the details of type casts, etc.
Moreover, to simplify your code, you may want to #define a convenient macro like this:
#define _S(id) (CString(LPCTSTR(id)))
and then use it with MessageBox (or for other LPCTSTR parameters as well):
MessageBox( hWnd, _S(IDS_MY_STRING), _S(IDS_TITLE), MB_RETRYCANCEL );
MessageBox does not have an overload taking a resource ID but you can use AfxMessageBox instead.
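For example, AfxMessageBox has an overload that takes the string resource ID directly:

AfxMessageBox(IDS_MY_STRING, MB_RETRYCANCEL); // loads the prompt from the string table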
I am trying to finalize my logging class. I have written it from scratch and do not wish to use an alternative library of any kind. My problem lies in the fact that my logger has trouble outputting std::strings and only works when I call the string's c_str() function.
Here is my logfile output function:
void Log::writeSuccess(const char* text, ...)
{
// Grab the variable arguments and format them into a local buffer
va_list ap;
va_start(ap, text);
char buff[BUFFER_SIZE];
vsnprintf(buff, sizeof(buff), text, ap);
va_end(ap); // always pair va_start with va_end
// Output to the log
logfile << "<-!-> " << buff << endl;
}
Here is a sample call to my log class object (ignore the uselessness of the call):
string test("This is a test string!");
errorLog.writeSuccess("Output: %s", test);
I end up with random characters and garbled output.
However, when I append .c_str() to the string test, it outputs the text correctly.
The whole reason I am trying to avoid C strings is that I understand they are not cross-platform, and I am developing my client to support all the major operating systems.
To summarize:
What is wrong with my log output function? Do you see any way it could be improved?
Should I generally avoid c_strings?
You're getting random garbage when passing a std::string to vsnprintf because the format specifier "%s" is for a C string - a char*.
std::string is not of type char*, but std::string::c_str() returns a const char*. vsnprintf will read the chars pointed to by the address that it presumes is the start of a C string, up until the NUL character '\0'.
A std::string pushed onto the stack and passed as an argument to vsnprintf is not a pointer to char; vsnprintf will simply treat those bytes as an address and start reading chars from it, causing undefined behaviour.
The printf family of functions is not typesafe, since it relies on a format string and a variable argument list, which is why your code compiles but you get unexpected results.
The bottom line is that the printf family of functions expects a char* when you use the format specifier "%s".
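The immediate fix, as you found, is to pass what "%s" actually expects:

errorLog.writeSuccess("Output: %s", test.c_str()); // const char*, matches "%s"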
I also think you're confusing C style strings (char[]) with the Microsoft-specific CString class. C style strings won't cause you problems on different platforms at all; the literal "This is a test string!" is a C style string (const char[]).
When calling a function with variable parameters, you must use simple types. For std::string you must use c_str(). There's no workaround. MFC's CString is designed so that you can get away with using it directly, but that was Microsoft's decision, and part of their design.
EDIT: As I said, when calling a function with variable parameters you must use string::c_str(). However, instead of C-like functions with variable parameters you can use something like boost::format() and its parameter-feeding operator %. This also gives you more control over the ordering of parameters, which is very handy for i18n.
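A minimal sketch of the boost::format alternative, assuming Boost is available:

#include <boost/format.hpp>
#include <iostream>
#include <string>

int main()
{
    std::string test("This is a test string!");
    // boost::format is typesafe: it accepts std::string directly, and the
    // positional markers (%1%) let translations reorder the arguments.
    std::cout << boost::format("Output: %1%") % test << std::endl;
    return 0;
}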
I have this code in my VB6 app:
Private Declare Function FileGetParentFolder Lib "Z-FileIO.dll" _
(ByVal path As String) As String
Output.AddItem FileGetParentFolder(FileText.Text)
Output is a list, FileText is a text field containing a file path. My C++ DLL contains this function:
extern "C" BSTR ZFILEIO_API FileGetParentFolder(Path p)
{
try {
return SysAllocString(boost::filesystem::path(p).parent_path().c_str());
} catch (...) {
return SysAllocString(L"");
}
}
where Path is typedef'd as LPCSTR. The argument comes into my DLL perfectly, but whatever I try to pass back, the VB6 app shows only garbage. I tried several different approaches with SysAllocStringByteLen, casting the SysAllocString argument to LPCWSTR, and other variants. Either I see only the first letter of the string, or I see only Y's with dots, just not the real string. Does anyone know the correct way to create and pass valid BSTRs from C++ to VB6?
Hopefully this will point you in the right direction. From memory...
VB6 uses COM BSTRs (2-byte wide character strings) internally, but when communicating with external DLLs through a Declare statement it converts strings to ANSI (the system code page). Your Path typedef to LPCSTR is an ANSI string, and that's why you receive it correctly. The return value you generate is a wide-character string, but VB is expecting an ANSI one. You'll need to use WideCharToMultiByte to convert your return value before returning it.
It seems a little odd that VB does this implicit conversion, but that's the way it is.
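A minimal sketch of that conversion (the helper name is illustrative, not from the question):

#include <windows.h>
#include <string>

// Convert a wide string to an ANSI BSTR of the kind a VB6 Declare expects.
BSTR WideToAnsiBstr(const std::wstring& wide)
{
    int bytes = WideCharToMultiByte(CP_ACP, 0, wide.c_str(), (int)wide.size(),
                                    NULL, 0, NULL, NULL);
    BSTR result = SysAllocStringByteLen(NULL, bytes); // allocates a byte-sized BSTR
    WideCharToMultiByte(CP_ACP, 0, wide.c_str(), (int)wide.size(),
                        (LPSTR)result, bytes, NULL, NULL);
    return result;
}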
If you insist on keeping that function signature, then you have to prepare a custom typelib for VB6 that includes something like this:
[dllname("Z-FileIO.dll")]
module ZFileIO
{
[entry("FileGetParentFolder")]
BSTR FileGetParentFolder ([in] LPWSTR path);
};
In Declare statements, parameters typed As String are automagically converted to ANSI strings, i.e. LPSTR. The only way to pass or receive a Unicode string (LPWSTR or BSTR) is through a typelib API function declaration like the one above.
Other than that, you can always use As Long params in the declaration and expect LPWSTRs, but then the consumer will have to wrap strings in StrPtr on every call to the API function.
I am using the ICU library in C++ on OS X. All of my strings are UnicodeStrings, but I need to use system calls like fopen, fread and so forth. These functions take const char* or char* as arguments. I have read that OS X supports UTF-8 internally, so that all I need to do is convert my UnicodeString to UTF-8, but I don't know how to do that.
UnicodeString has a toUTF8() member function, but it writes to a ByteSink. I've also found these examples: http://source.icu-project.org/repos/icu/icu/trunk/source/samples/ucnv/convsamp.cpp and read about using a converter, but I'm still confused. Any help would be much appreciated.
Call UnicodeString::extract(...) to extract into a char*; pass NULL for the converter to get the default converter (which uses the charset your OS is configured with).
The ICU User Guide section on UTF-8 describes ways of doing that:
The simplest way to use UTF-8 strings in UTF-16 APIs is via the C++ icu::UnicodeString methods fromUTF8(const StringPiece &utf8) and toUTF8String(StringClass &result). There is also toUTF8(ByteSink &sink).
And extract() is no longer the preferred approach.
Note: icu::UnicodeString has constructors, setTo() and extract() methods which take either a converter object or a charset name. These can be used for UTF-8, but are not as efficient or convenient as the fromUTF8()/toUTF8()/toUTF8String() methods mentioned above.
This will work:
std::string utf8;
uStr.toUTF8String(utf8); // appends the UTF-8 form of uStr to utf8
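Putting it together with the fopen use case from the question (a minimal sketch; the file name is a placeholder):

#include <unicode/unistr.h>
#include <cstdio>
#include <string>

int main()
{
    icu::UnicodeString uStr = icu::UnicodeString::fromUTF8("résumé.txt");
    std::string utf8;
    uStr.toUTF8String(utf8);                  // convert back to UTF-8
    FILE* f = std::fopen(utf8.c_str(), "rb"); // OS X paths are UTF-8
    if (f) std::fclose(f);
    return 0;
}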