Can't get std::from_chars to auto detect base - c++

I'm trying to use std::from_chars to convert a std::string to an integer, specifically like this:
//selection_msg is read from a file and is a std::string
int selection_data;
auto result = std::from_chars(selection_msg.data(),
selection_msg.data() + selection_msg.size(),
selection_data);
Now this works for a decimal: with selection_msg = "1234" I get selection_data = 1234 and the result has ec = 0.
But if selection_msg = "0xABC", then selection_data = 0 and ec = 0, but importantly ptr points at "xABC", indicating the x isn't part of a recognised pattern.
Note: IF I add the base:
auto result = std::from_chars(selection_msg.data(),
selection_msg.data() + selection_msg.size(),
selection_data,
16);
selection_msg = "ABC" parses just fine, but obviously I can't decode a parse a decimal.
The spec https://en.cppreference.com/w/cpp/utility/from_chars seems to suggest "0x" should be a valid pattern am reading it wrong?
I was hoping to use the auto base detection to make the file input a bit more flexible.
Can anyone see what I'm doing wrong?
P.S. all tested in VS2019.

There is no auto base detection for from_chars. It's a very low-level interface designed for performance, not flexibility.
If you want auto-detection, use stoi and friends.
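For example, a minimal sketch of that here, assuming the input is still in selection_msg (std::stoi with base 0 applies the usual strtol prefix rules, so "1234" is read as decimal and "0xABC" as hex, and it throws std::invalid_argument or std::out_of_range on failure):
#include <string>

// base 0 = auto-detect: leading "0x"/"0X" means hex, a leading "0" means octal, otherwise decimal
int selection_data = std::stoi(selection_msg, nullptr, 0);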

QString::number result is wrong

I was practicing converting numbers in Qt and ran into a problem. I have a variable of type int - result, its value is, for example, 11111 (I get it in a certain way). I want this number to be considered binary in my program. To do this, I'm trying to translate it into a string in variable res and add "0b" in front of my value, like
QString res = "0b" + QString:number(result);
My variable res is "0b11111" and that's right, but I want to get a variable of type int, so I'm trying to cast a string to it:
result = res.toInt();
and in the end I get 0. I understand that this is most likely due to the second character "b", but is there really no way to convert the number to the binary system as it is?
Or did I make a mistake somewhere?
Thank you for all answers that could help me understand what's wrong!
With Qt you can specify the base; if you look carefully at the QString documentation, you could simply write:
res = QString::number(11111);
result = res.toInt(nullptr, 2); // Base 2
Or even better:
bool success; // Can be used to check if the conversion succeeded
res = QString::number(11111);
result = res.toInt(&success, 2);
In order for the QString::toInt() function to use the "C convention" and determine which base to use according to known prefixes (the "0b" in your case), you need to explicitly specify a base of zero in the call:
QString res = "0b" + QString::number(result);
result = res.toInt(nullptr, 0); // If "base" is zero, parse/use the "0b"
Otherwise, if no "base" argument is given, it defaults to 10 (i.e. decimal) and, in that case, the conversion fails and returns 0, because the 'b' character is not a valid decimal digit.
Alternatively, you can skip the leading "0b" characters and explicitly tell the function to use base 2:
QString res = QString::number(result); // No added "0b" prefix
result = res.toInt(nullptr, 2); // Force use of binary
So you have an integer variable with the value eleven thousand one hundred and eleven, but you want it to be treated as binary and therefore have a value of thirty-one? I can't help thinking that you are solving the wrong problem.
But anyway, convert to a std::string using std::to_string and then use the std::stoi function to convert back to an integer. std::stoi allows you to specify a binary conversion.
int result = 11111;
int res = std::stoi(std::to_string(result), nullptr, 2);
std::cout << res << '\n';

using libfmt to format to a string

using libfmt to print to a file is very convenient:
auto file = fmt::output_file(filename);
file.print(...);
But how can I format to a memory buffer, ultimately converting to a string? I would imagine something like
auto buf = some_buffer_object{};
buf.print(...);
std::string s = buf.get_string();
But I can find no such buffer type in the documentation (fmt::memory_buffer seems related, but does not work like this).
Important: I need multiple calls to print, so auto s = fmt::format(...) is not an option.
how can I format to a memory buffer, ultimately converting to a string?
Use format_to and print to an iterator that appends to a string.
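For example, a small sketch of both variants (the identifiers s, buf and the format strings are just illustrative):
#include <fmt/format.h>
#include <iterator>
#include <string>

// append directly to a std::string across multiple calls
std::string s;
fmt::format_to(std::back_inserter(s), "x = {}\n", 1);
fmt::format_to(std::back_inserter(s), "y = {}\n", 2);

// or accumulate in a fmt::memory_buffer and convert at the end
fmt::memory_buffer buf;
fmt::format_to(std::back_inserter(buf), "x = {}\n", 1);
fmt::format_to(std::back_inserter(buf), "y = {}\n", 2);
std::string s2 = fmt::to_string(buf);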

Usage of strtol on std::string

I recently migrated from C to C++, and there's a little confusion about strings. Strings just aren't what they used to be any more, as in, not just char arrays with a terminating '\0'.
I haven't found a real answer to this question, so how far can you treat the std::string class like C-Strings?
For example: If I know there's a number somewhere in a string, let the string be ireallylike314, in C I could use strtol(string + 10, NULL, 10) to just get that number.
And, if this doesn't work, is there a way to use std::string like C-strings?
Use c_str().
strtol(string.c_str() + 10, NULL, 10);
If you want to get a C-style string from a std::string then, as mentioned, use the c_str() method. But another solution to this specific problem would be to just use stol instead of strtol.
While stol doesn't (in itself) support what you want, I think I'd use it in conjunction with substr to get the required result:
std::string in = "ireallylike314";
// extract number and print it out multiplied by 2 to show we got a number
std::cout << 2 * stol(in.substr(11));
Result:
628
This has both good and bad points though. On the bad side, it creates a whole new string object to hold the digits out of the input string. On the good side, it gives a little more control over the number of digits to convert, so if (for example) you only wanted to convert the first two digits from the string (even if, as in this case, they're followed by more digits) you can do that pretty easily too:
std::cout << 2 * stol(in.substr(11, 2));
Result:
62
In quite a few cases, the degree to which this is likely to be practical for you will depend heavily upon whether your implementation includes the short string optimization. If it does, creating a (small) string is often cheap enough to make this perfectly reasonable. If it doesn't, the heap allocation to create the temporary string object as the return value from substr may be a higher price than you want to pay.
The C-like way:
long n = std::strtol( string.c_str() + offset, nullptr, 10 );
// sets errno on error and saturates the result (LONG_MAX / LONG_MIN) on overflow
The Java-ish way:
long n = std::stol( string.substr( offset, std::string::npos ) );
// throws an exception (std::invalid_argument or std::out_of_range) on error, so there is no return value in that case
The streams way:
long n = 0;
std::istringstream( string ).ignore( offset ) >> n;
// on error the stream's failbit is set and n stays 0
The locales way:
long n = 0;
std::istringstream fmt;  // supplies the default formatting flags: base-10 only
std::ios::iostate err = std::ios::goodbit;
// std::num_get's destructor is protected, so derive a facet that can be constructed directly
struct numget : std::num_get< char, std::string::const_iterator > {};
numget().get( string.cbegin() + offset, string.cend(), fmt, err, n );
// err has std::ios::failbit set on error
This is maybe beyond the scope of the question but since you are migrating to C++ and you seem confused about std::string, you'll likely find the following useful.
The point of having std::string is not to use it like C-strings (of course you can, as the previous answers showed). You can take much more advantage of std::string's capabilities: for example, it is a C++ container, and there are functions to get substrings, compare strings, etc.
String manipulations are generally a lot easier with std::string than with C-strings.
See for example http://www.cplusplus.com/reference/string/string/ for its capabilities.
Strings just aren't what they used to be any more, as in, not just char arrays with a terminating '\0'.
You are wrong. In C++, strings are defined the same way. In both languages a string is defined as
A string is a contiguous sequence of characters terminated by and including the first null character.
You are confusing strings with the class std::string (or std::basic_string); they are not the same thing.
For example: If I know there's a number somewhere in a string, let the string be ireallylike314, in C I could use strtol(string[10], NULL, 10) to just get that number
You are mistaken. The valid function call will look like
strtol( &string[11], NULL, 10)
or
strtol( string + 11, NULL, 10)
You can call the same function for an object of class std::string by using the member function c_str() or (since C++11) data().
For example
std::string s( "ireallylike314" );
auto x = std::strtol( s.c_str() + 11, NULL, 10 );
or
auto x = std::strtol( s.data() + 11, NULL, 10 );

Clean Way to Convert Python 3 Unicode to std::string

I wrap a lot of C++ using the Python 2 API (I can't use things like swig or boost.python for various technical reasons). When I have to pass a string (usually a path, always ASCII) into C/C++, I use something like this:
std::string file_name = PyString_AsString(py_file_name);
if (PyErr_Occurred()) return NULL;
Now I'm considering updating to Python 3, where PyString_* methods don't exist. I found one solution that says I should do something like this:
PyObject* bytes = PyUnicode_AsUTF8String(py_file_name);
std::string file_name = PyBytes_AsString(bytes);
if (PyErr_Occurred()) return NULL;
Py_DECREF(bytes);
However this is twice as many lines and seems a bit ugly (not to mention that it could introduce a memory leak if I forget the last line).
The other option is to redefine the python functions to operate on bytes objects, and to call them like this
def some_function(path_name):
    _some_function(path_name.encode('utf8'))
This isn't terrible, but it does require a python-side wrapper for each function.
Is there some cleaner way to deal with this?
Looks like the solution exists in python 3.3, with char* PyUnicode_AsUTF8(PyObject* unicode). This should be exactly the same behavior as the PyString_AsString() function from python 2.
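A minimal sketch of how that slots into the original pattern (assuming py_file_name is the same PyObject* as above; the returned buffer is UTF-8, null-terminated and owned by the unicode object):
const char* utf8 = PyUnicode_AsUTF8(py_file_name);
if (!utf8) return NULL;           // a Python exception has been set
std::string file_name = utf8;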
If you know (and of course, you could check with an assert or similar) that it's all ASCII, then you could simply create it like this:
std::string py_string_to_std_string(PyObject* py_file_name)
{
    // assumes every code point in py_file_name is ASCII
    Py_ssize_t len = PyUnicode_GetLength(py_file_name);
    std::string str;
    str.reserve(len);
    for (Py_ssize_t i = 0; i < len; i++)
        str += static_cast<char>(PyUnicode_ReadChar(py_file_name, i));
    return str;
}
To provide an improved version of the accepted answer: instead of using PyUnicode_AsUTF8(...), it is better to use PyUnicode_AsUTF8AndSize(...).
Because the string may contain a null character (code point 0) somewhere in the middle, your resulting std::string will contain a truncated version of the full string if you use PyUnicode_AsUTF8(...).
Py_ssize_t size = 0;
char const * pc = PyUnicode_AsUTF8AndSize(obj, &size);
std::string s;
if (pc)
    s = std::string(pc, size);
else
{
    // Error: a Python exception has been set, handle it!
}

Char to Int in C++? [duplicate]

Possible Duplicate:
How to convert a single char into an int
Well, I'm doing a basic program which handles some input like:
2+2
So, I need to add 2 + 2.
I did something like:
string mys "2+2";
fir = mys[0];
sec = mys[2];
But now I want to add "fir" to "sec", so I need to convert them to int.
I tried "int(fir)" but it didn't work.
There are multiple ways of converting a string to an int.
Solution 1: Using Legacy C functionality
#include <stdio.h>
#include <stdlib.h>

int main()
{
    //char hello[5];
    //hello = "12345"; // ---> this won't compile
    char hello[] = "12345";
    printf("My number is: %d", atoi(hello));
    return 0;
}
Solution 2: Using boost::lexical_cast (most appropriate & simplest)
#include <boost/lexical_cast.hpp>
int x = boost::lexical_cast<int>("12345");
Solution 3: Using C++ Streams
std::string hello("123");
std::stringstream str(hello);
int x;
str >> x;
if (!str)
{
// The conversion failed.
}
Alright, so first a little background on why what you attempted didn't work. In your example, fir is declared as a string. When you attempted int(fir), which is the same as (int)fir, you attempted a C-style cast from a string to an integer. Essentially you will get garbage, because a C-style cast in C++ will run through all of the available casts and take the first one that works. At best you're going to get the memory value that represents the character 2, which depends upon the character encoding you're using (UTF-8, ASCII, etc.). For instance, if fir contained "2", then you might possibly get 0x32 as your integer value (assuming ASCII).
You should really never use C-style casts; the only place where it's really safe to use them is conversions between numeric types.
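To make that last point concrete, a small illustration (assuming an ASCII-compatible execution character set):
char fir = '2';
int raw = int(fir);   // raw is 50 (0x32), the character code, not the numeric value 2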
If you're given a string like the one in your example, first you should separate the string into the relevant sequences of characters (tokens) using a function like strtok. In this simple example that would be "2", "+" and "2". Once you've done that you can simply call a function such as atoi on the strings you want converted to integers.
Example:
string str = "2";
int i = atoi(str.c_str()); //value of 2
However, this will get slightly more complicated if you want to be able to handle non-integer numbers as well. In that case, your best bet is to split on the operator (+ - / * etc.), and then do a find on the numeric strings for a decimal point. If you find one you can treat the value as a double and use the function atof instead of atoi, and if you don't, just stick with atoi.
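A rough sketch of that idea (parse_number is a name invented here for illustration, and the token is assumed to already be split out of the expression):
#include <cstdlib>
#include <string>

double parse_number(const std::string& tok)
{
    if (tok.find('.') != std::string::npos)
        return std::atof(tok.c_str());   // decimal point present: treat as floating point
    return std::atoi(tok.c_str());       // otherwise treat as an integer
}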
Have you tried atoi or boost lexical cast?
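For instance, a minimal sketch with boost::lexical_cast (assuming fir holds a digit character such as '2'; a char source is read as a one-character string):
#include <boost/lexical_cast.hpp>

char fir = '2';
int value = boost::lexical_cast<int>(fir);   // yields 2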