I am making use of Matei David's handy C++ wrapper for zlib, but I get an error when compiling on macOS (clang-1100.0.33):
include/strict_fstream.hpp:39:37: error: cannot initialize a parameter of type 'const char *' with an lvalue of type 'int'
The problem is here:
/// Overload of error-reporting function, to enable use with VS.
/// Ref: http://stackoverflow.com/a/901316/717706
static std::string strerror()
{
    std::string buff(80, '\0');
    // options for other environments omitted, where error message is set;
    // if not Win32 or _POSIX_C_SOURCE >= 200112L, error message is left empty.
    auto p = strerror_r(errno, &buff[0], buff.size());
    // error next line
    std::string tmp(p, std::strlen(p));
    std::swap(buff, tmp);
    buff.resize(buff.find('\0'));
    return buff;
}
(Which IIUC has nothing to do with zlib; it is just trying to report errors in a thread-safe manner.)
If I change to this:
static std::string strerror()
{
    std::string buff(80, '\0');
    auto p = strerror_r(errno, &buff[0], buff.size());
    // "fix" below
    size_t length = buff.size();
    std::string tmp(p, length);
    std::swap(buff, tmp);
    buff.resize(buff.find('\0'));
    return buff;
}
My program compiles and runs fine.
I have two questions:
Why does clang not like the constructor std::string tmp(p, std::strlen(p));?
The buffer was declared at the beginning of the function as length 80. Why are we even bothering to look up the length?
The answer to 2 may answer this, but is there something wrong with my version?
Thanks.
If you use int strerror_r(int errnum, char *buf, size_t buflen);, then there is no appropriate string constructor and the program is ill-formed.
If you use char *strerror_r(int errnum, char *buf, size_t buflen);, then the program is well-formed.
Which function you get is determined by the standard C/POSIX library implementation; the compiler is only involved insofar as it influences which system library is used by default.
The former function is an extension to POSIX specified in XSI (essentially an optional part of POSIX), and the latter is a GNU extension.
If you use glibc (I don't know whether that is an option on macOS), you can control which version you get with feature-test macros, although the XSI-compliant version is not available in older releases. Its documentation says:
The XSI-compliant version of strerror_r() is provided if:
(_POSIX_C_SOURCE >= 200112L || _XOPEN_SOURCE >= 600) && ! _GNU_SOURCE
The buffer was declared at the beginning of the function as length 80. Why are we even bothering to look up the length?
In the construction std::string tmp(p, std::strlen(p));, strlen seems entirely unnecessary to me. std::string tmp(p); is equivalent.
If you don't need thread safety, then the most portable solution is to use std::strerror which is in standard C++:
return std::strerror(errno); // yes, it's this simple
If you do need thread safety, then you could wrap this in a critical section using a mutex.
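A minimal sketch of that mutex idea (the wrapper name is illustrative, not part of any library):

#include <cstring>
#include <mutex>
#include <string>

// The message is copied into the returned std::string while the lock is held,
// so the static buffer used by std::strerror is never read concurrently
// through this wrapper.
std::string safe_strerror(int errnum)
{
    static std::mutex mtx;
    std::lock_guard<std::mutex> lock(mtx);
    return std::strerror(errnum);
}

Note that this only serializes callers that go through the wrapper; code that calls strerror directly elsewhere can still race with it.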
Note that strerror, the name of your function, is reserved to the language implementation in the global namespace when the standard library is used. The function should be in a namespace, or be renamed.
There are two different versions of the strerror_r that you'll commonly see:
A POSIX-compliant version that always stores the error message in the provided buffer (if it succeeds) and returns an int (0 for success, non-zero for error)
A GNU version that may or may not store the error message in the provided buffer. It returns a char* pointing to the error message, which may point to the user-supplied buffer or to some other global static storage.
That strerror function is clearly written to work with the GNU version of strerror_r.
As for your second question, you need the strlen. buff is 80 characters long, but the actual error message may be shorter and only partially fill the buffer. That strlen is being used to trim off any extra nul characters from the end.
Related
The source code itself isn't really in question. It's the compiler's reaction to it. Here's the troublesome snippet:
int XtnUtil::IsExtension(const char * filename, const char * xtn)
{
    char* fx = FindExtension(filename);  // Get a pointer to the filename extension
    if (!fx) return 0;                   // Bail out if it's not there
    if (xtn[0] == '.') xtn++;            // Make sure we're looking at the alpha part
    return (stricmp(fx, xtn) ? 0 : 1);   // TRUE if they're equal
}
I've also used _stricmp instead of stricmp. In either case, the compiler gives me a particularly uninformative message:
It seems to say "Don't use _stricmp, use _stricmp instead." I tried it with and without the underscore and also tried the POSIX equivalent, strcasecmp() but Visual Studio doesn't seem to know that function at all.
For the moment, I've moved past this by simply writing my own function named mystricmp() which is kind of distasteful but seems to work. Right now I'm mostly interested in why the compiler gave me such a funky message, and what would I be able to do about it if the function I had to hand-write weren't trivial?
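For reference, a hand-rolled replacement along the lines of that mystricmp might look like the sketch below (the name and the strcmp-style return convention are just illustrative):

#include <cctype>

// Case-insensitive comparison of two NUL-terminated strings, returning <0, 0
// or >0 like strcmp. Characters go through unsigned char to avoid undefined
// behaviour when tolower is given a negative char value.
int mystricmp(const char *a, const char *b)
{
    for (;; ++a, ++b)
    {
        int ca = std::tolower(static_cast<unsigned char>(*a));
        int cb = std::tolower(static_cast<unsigned char>(*b));
        if (ca != cb || ca == '\0')
            return ca - cb;
    }
}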
The title is pretty clear: is it safe to use basename (man 3 basename) with __FILE__?
It compiles and seems to work fine, but basename's argument is char* (not const char*) and the man-page says:
Both dirname() and basename() may modify the contents of path, so it may be desirable to pass a copy when calling one of these functions.
So, this makes me worry.
Maybe the question should be more like: what is the type of __FILE__? Isn't it a string literal / const char*? But if it is, why is there no compile-time error (const char* to char*)?
Read carefully basename(3) and notice:
Warning: there are two different functions basename() - see below.
and pay attention to the NOTES, which say
There are two different versions of basename() - the POSIX version described above, and the GNU version, which one gets after
#define _GNU_SOURCE /* See feature_test_macros(7) */
#include <string.h>
The GNU version never modifies its argument, and returns the empty string when path has a trailing slash
(emphasis is mine)
Because the GNU version is documented not to modify its argument, using it with __FILE__ is safe.
BTW, you could consider customizing your GCC (e.g. with MELT) to define some __builtin_basename which would compute the basename at compile time if given a string literal like __FILE__ or else invoke basename at runtime.
Notice that libgen.h has #define basename __xpg_basename
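For illustration, here is a minimal sketch assuming glibc and g++ (which predefines _GNU_SOURCE); <libgen.h> is deliberately not included, since it would swap in the POSIX (__xpg_basename) version:

#include <cstring>   // declares the GNU basename() under _GNU_SOURCE
#include <iostream>

int main()
{
    // The GNU basename never modifies its argument, so the string literal
    // produced by __FILE__ is safe to pass.
    std::cout << basename(__FILE__) << '\n';
    return 0;
}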
what is the type of __FILE__? Isn't it a string literal / const char*?
Yes. It's a string literal.
But if it is, why there's no a compile-time error (const char* to char*)?
Possibly because the implementation you use (glibc's) may be returning a pointer within the string literal you pass (i.e. it doesn't modify its input).
In any case, you can't rely on that, for the reasons stated below.
The C standard (C11, § 6.10.8.1) says __FILE__ is a string literal:
__FILE__ The presumed name of the current source file (a character string literal).
POSIX says:
The basename() function may modify the string pointed to by path,
and may return a pointer to internal storage. The returned pointer
might be invalidated or the storage might be overwritten by a
subsequent call to basename().
(emphasis mine).
So, no, it's not safe to call basename() with __FILE__. You can simply take a copy of __FILE__ and do basename() on it:
char *filename = strdup(__FILE__);
if (filename == NULL) {
    /* handle the allocation failure */
}
char *file_basename = basename(filename);
/* use file_basename before freeing filename: it may point into that copy */
free(filename);
Since __FILE__ is a string literal, using an array is another option:
char filename[] = __FILE__;
char *file_basename = basename(filename);
Is it possible to somehow adapt a c-style string/buffer (char* or wchar_t*) to work with the Boost String Algorithms Library?
That is, for example, its trim algorithm has the following declaration:
template<typename SequenceT>
void trim(SequenceT &, const std::locale & = std::locale());
and the implementation (look for trim_left_if) requires that the sequence type has a member function erase.
How could I use that with a raw character pointer / c string buffer?
char* pStr = getSomeCString(); // example, could also be something like wchar_t buf[256];
...
boost::trim(pStr); // HOW?
Ideally, the algorithms would work directly on the supplied buffer. (As far as possible; it obviously can't work if an algorithm needs to allocate additional space in the "string".)
@Vitaly asks: why can't you create a std::string from the char buffer and then use it in the algorithms?
The reason I have char* at all is that I'd like to use a few algorithms on our existing codebase. Refactoring all the char buffers to std::string would be more work than it's worth, and when changing or adapting something it would be nice to just be able to apply a given algorithm to any C-style string that happens to live in the current code.
Using a string would mean (a) copying the char* into a string, (b) applying the algorithm to the string and (c) copying the string back into the char buffer.
For the SequenceT-type operations, you probably have to use std::string. If you wanted to implement that by yourself, you'd have to fulfill many more requirements for creation, destruction, value semantics etc. You'd basically end up with your own implementation of std::string.
The RangeT-type operations, however, might be usable on char*s via iterator_range from the Boost.Range library. I didn't try it, though; a sketch of the idea follows.
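A minimal sketch (untested against your code base): trim itself still needs a SequenceT, but a mutating RangeT algorithm such as to_upper can run directly on the existing buffer through an iterator_range:

#include <cstring>
#include <iostream>
#include <boost/range/iterator_range.hpp>
#include <boost/algorithm/string/case_conv.hpp>

int main()
{
    char buf[] = "hello, boost";

    // Build a mutable range over the existing buffer; no copy is made.
    boost::iterator_range<char*> r =
        boost::make_iterator_range(buf, buf + std::strlen(buf));

    // to_upper is a RangeT-style algorithm, so it modifies buf in place.
    boost::algorithm::to_upper(r);

    std::cout << buf << std::endl; // prints "HELLO, BOOST"
    return 0;
}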
There exists some code which implements a std::string-like string with a fixed buffer. With some tinkering you can modify this code to create a string type which uses an external buffer:
char buffer[100];
strcpy(buffer, " HELLO ");
xstr::xstring<xstr::fixed_char_buf<char> >
    str(buffer, strlen(buffer), sizeof(buffer));
boost::algorithm::trim(str);
buffer[str.size()] = 0;
std::cout << buffer << std::endl; // prints "HELLO"
For this I added a constructor to xstr::xstring and xstr::fixed_char_buf that takes the buffer, the size of the buffer currently in use and the maximum size of the buffer. Further, I replaced the SIZE template argument with a member variable and changed the internal char array into a char pointer.
The xstr code is a bit old and will not compile without trouble on newer compilers, but it only needs some minor changes. Further, I only added the things needed in this case. If you want to use this for real, you need to make some more changes to make sure it cannot use uninitialized memory.
Anyway, it might be a good start for writing your own string adapter.
I don't know what platform you're targeting, but on most modern computers (including mobile ones like ARM) memory copy is so fast you shouldn't even waste your time optimizing memory copies. I say - wrap char* in std::string and check whether the performance suits your needs. Don't waste time on premature optimization.
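As a minimal sketch of that wrap-and-copy-back approach (the helper name is made up), something like this keeps the change local to one small function:

#include <cstring>
#include <string>
#include <boost/algorithm/string/trim.hpp>

// trim only ever shortens the string, so the trimmed result (plus its NUL
// terminator) always fits back into the original buffer.
void trim_in_place(char *buf)
{
    std::string s(buf);                         // (a) copy the C string into a std::string
    boost::algorithm::trim(s);                  // (b) run the SequenceT algorithm on the copy
    std::memcpy(buf, s.c_str(), s.size() + 1);  // (c) copy the result back, including the NUL
}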
I know there is a similarly titled question already on SO but I want to know my options for this specific case.
MSVC compiler gives a warning about strcpy:
1>c:\something\mycontrol.cpp(65): warning C4996: 'strcpy': This function or
variable may be unsafe. Consider using strcpy_s instead. To disable
deprecation, use _CRT_SECURE_NO_WARNINGS. See online help for details.
Here's my code:
void MyControl::SetFontFace(const char *faceName)
{
    LOGFONT lf;
    CFont *currentFont = GetFont();
    currentFont->GetLogFont(&lf);
    strcpy(lf.lfFaceName, faceName); // <--- offending line
    font_.DeleteObject();
    // Create the font.
    font_.CreateFontIndirect(&lf);
    // Use the font to paint a control.
    SetFont(&font_);
}
Note font_ is an instance variable. LOGFONT is a windows structure where lfFaceName is defined as TCHAR lfFaceName[LF_FACESIZE].
What I'm wondering is can I do something like the following (and if not why not):
void MyControl::SetFontFace(const std::string& faceName)
...
lf.lfFaceName = faceName.c_str();
...
Or if there is a different alternative altogether then let me know.
The reason you're getting the security warning is, your faceName argument could point to a string that is longer than LF_FACESIZE characters, and then strcpy would blindly overwrite whatever comes after lfFaceName in the LOGFONT structure. You do have a bug.
You should not blindly fix the bug by changing strcpy to strcpy_s, because:
The *_s functions are unportable Microsoft inventions almost all of which duplicate the functionality of other C library functions that are portable. They should never be used, even in a program not intended to be portable (as this appears to be).
Blind changes tend to not actually fix this class of bug. For instance, the "safe" variants of strcpy (strncpy, strlcpy, strcpy_s) simply truncate the string if it's too long, which in this case would make you try to load the wrong font. Worse, strncpy omits the NUL terminator when it does that, so you'd probably just move the crash inside CreateFontIndirect if you used that one. The correct fix is to check the length up front and fail the entire operation if it's too long. At which point strcpy becomes safe (because you know it's not too long), although I prefer memcpy because it makes it obvious to future readers of the code that I've thought about this.
TCHAR and char are not the same thing; copying either a C-style const char * string or a C++ std::string into an array of TCHAR without a proper encoding conversion may produce complete nonsense. (Using TCHAR is, in my experience, always a mistake, and the biggest problem with it is that code like this will appear to work correctly in an ASCII build, and will still compile in UNICODE mode, but will then fail catastrophically at runtime.)
You certainly can use std::string to help with this problem, but it won't get you out of needing to check the length and manually copy the string. I'd probably do it like this. Note that I am using LOGFONTW and CreateFontIndirectW and an explicit conversion from UTF-8 in the std::string. Note also that chunks of this were cargo-culted out of MSDN and none of it has been tested. Sorry.
void MyControl::SetFontFace(const std::string& faceName)
{
    LOGFONTW lf;
    this->font_.GetLogFontW(&lf);
    int count = MultiByteToWideChar(CP_UTF8, MB_ERR_INVALID_CHARS,
                                    faceName.data(), static_cast<int>(faceName.length()),
                                    lf.lfFaceName, LF_FACESIZE - 1);
    if (count <= 0)
        throw GetLastError(); // FIXME: use a real exception
    lf.lfFaceName[count] = L'\0'; // MultiByteToWideChar does not NUL-terminate.
    this->font_.DeleteObject();
    if (!this->font_.CreateFontIndirectW(&lf))
        throw GetLastError(); // FIXME: use a real exception
    // ...
}
lf.lfFaceName = faceName.c_str();
No, you shouldn't do that, because you are making a local copy of the pointer to the data held inside the std::string. If the C++ string changes or is destroyed, the pointer is no longer valid, and if anything writes through lfFaceName it will almost certainly corrupt the std::string.
Since you need to copy a C string, you need a C function, and strcpy_s (or its equivalent) is the safe alternative.
Have you tried? Given the information in your post, the assignment should generate a compiler error because you're trying to assign a pointer to an array, which does not work in C(++).
#include <cstdio>
#include <string>

using namespace std;

struct LOGFONT {
    char lfFaceName[3];
};

int main() {
    struct LOGFONT f;
    string foo = "bar";
    f.lfFaceName = foo.c_str();
    return 0;
}
leads to
x.c:13: error: incompatible types in assignment of `const char*' to `char[3]'
I'd recommend using a secure strcpy alternative like the warning says, given that you know the size of the destination space anyway.
#include <algorithm>
#include <iostream>
#include <string>

enum { LF_FACESIZE = 256 }; // = 3 // test too-long input

struct LOGFONT
{
    char lfFaceName[LF_FACESIZE];
};

int main()
{
    LOGFONT f;
    std::string foo("Sans-Serif");
    std::copy_n(foo.c_str(), foo.size() + 1 > LF_FACESIZE ? LF_FACESIZE : foo.size() + 1,
                f.lfFaceName);
    f.lfFaceName[LF_FACESIZE - 1] = '\0'; // guarantee NUL termination if the input was truncated
    std::cout << f.lfFaceName << std::endl;
    return 0;
}
lf.lfFaceName = faceName.c_str(); won't work, for two reasons (assuming you change faceName to a std::string):
The lifetime of the pointer returned by c_str() is temporary. It's only valid as long as the faceName object doesn't change and is alive.
The line won't compile: .c_str() returns a pointer to char, and lfFaceName is a character array, which can't be assigned to. You need to actually fill in the bytes at lfFaceName, and pointer assignment doesn't do that.
There isn't anything C++ that can help here, since lfFaceName is a C "string". You need to use a C string function, like strcpy or strcpy_s. You can change your code to:
strcpy_s(lf.lfFaceName, LF_FACESIZE, faceName.c_str());
During testing I have a mock object which sets errno = ETIMEDOUT. The object I'm testing sees the error and calls strerror_r to get back an error string:
if (ret) {
    if (ret == EAI_SYSTEM) {
        char err[128];
        strerror_r(errno, err, 128);
        err_string.assign(err);
    } else {
        err_string.assign(gai_strerror(ret));
    }
    return ret;
}
I don't understand why strerror_r is returning trash. I even tried calling
strerror_r(ETIMEDOUT, err, 128)
directly and still got trash. I must be missing something. It seems I'm getting the GNU version of the function, not the POSIX one, but that shouldn't make any difference in this case.
Edit
I'm on Ubuntu 8.04. glibc version looks like 2.7 in features.h.
According to this page, http://linux.die.net/man/3/strerror_r, if you're using the GNU version of strerror_r(), the function might decide to not store anything at all into the buffer you provide; you'd need to use the string returned from the function (seems like a rather bizarre interface):
The GNU-specific strerror_r() returns a pointer to a string containing the error message. This may be either a pointer to a string that the function stores in buf, or a pointer to some (immutable) static string (in which case buf is unused). If the function stores a string in buf, then at most buflen bytes are stored (the string may be truncated if buflen is too small) and the string always includes a terminating null byte.
So if you happen to be using the GNU version you can't rely on your err buffer to have anything useful put into it, unless strerror_r() returns the address of err.
The problem is that Linux has two strerror_r functions. If you read the manual,
int strerror_r(int errnum, char *buf, size_t buflen);
/* XSI-compliant */
char *strerror_r(int errnum, char *buf, size_t buflen);
/* GNU-specific */
The XSI-compliant version of strerror_r() is provided if:
(_POSIX_C_SOURCE >= 200112L || _XOPEN_SOURCE >= 600) && ! _GNU_SOURCE
The g++ compiler automatically defines _GNU_SOURCE, so you are using the GNU version of the function, which doesn't necessarily store a message in the buffer you provide.
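For completeness, one way to cope with either signature is to let overload resolution pick the right handling of the return value. This is only a sketch, and the helper names are made up:

#include <string.h>
#include <string>

// Chosen when strerror_r returns char* (GNU version): use the returned pointer.
static std::string message_from(char *ret, const char * /*buf*/)
{
    return ret ? std::string(ret) : std::string("unknown error");
}

// Chosen when strerror_r returns int (XSI version): the message is in buf.
static std::string message_from(int ret, const char *buf)
{
    return ret == 0 ? std::string(buf) : std::string("unknown error");
}

std::string describe_errno(int errnum)
{
    char buf[128] = "";
    return message_from(strerror_r(errnum, buf, sizeof buf), buf);
}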