boost::lexical_cast throws an exception when converting a string to int8_t, but converting to int32_t works fine.
What can be wrong with int8_t?
#include <iostream>
#include <cstdint>
#include <boost/lexical_cast.hpp>
int main()
{
try
{
const auto a = boost::lexical_cast<int8_t>("22");
std::cout << a << std::endl;
}
catch( std::exception &e )
{
std::cout << "e=" << e.what() << std::endl;
}
}
For boost::lexical_cast, the character type of the underlying stream is assumed to be char unless either the Source or the Target requires wide-character streaming, in which case the underlying stream uses wchar_t. Types that require it can also use char16_t or char32_t for wide-character streaming. Since int8_t is typically an alias for signed char, lexical_cast treats it as a character type rather than as a number.
Boost Lexical Cast
So, after making the changes below in your code:
#include <iostream>
#include <cstdint>
#include <boost/lexical_cast.hpp>
int main()
{
try
{
const auto a = boost::lexical_cast<int8_t>("2");
const auto b = boost::lexical_cast<int16_t>("22");
std::cout << a << " and "<< b << std::endl;
}
catch( std::exception &e )
{
std::cout << "e=" << e.what() << std::endl;
}
return 0;
}
It gives the output below:
2 and 22
So each character is treated as a char: because int8_t is (typically) an alias for signed char, lexical_cast performs a character conversion and reads exactly one character into it.
For const auto a = boost::lexical_cast<int8_t>("2");, the single character 2 fits.
For const auto b = boost::lexical_cast<int16_t>("22");, int16_t is a genuine integer type, so "22" is parsed numerically as 22. That is also why the original cast of "22" to int8_t throws: two characters cannot be read into a single char-like target.
I hope it helps!
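If an int8_t holding the numeric value is what you actually need, one possible workaround (a sketch, not the only option) is to parse into a wider integer first and then narrow with a range check, for example via boost::numeric_cast:
#include <boost/lexical_cast.hpp>
#include <boost/numeric/conversion/cast.hpp>
#include <cstdint>
#include <iostream>
int main()
{
    // Parse into a real integer type first, then narrow with a range check.
    const auto wide = boost::lexical_cast<int>("22");
    const auto narrow = boost::numeric_cast<std::int8_t>(wide); // throws if out of range

    // Promote before printing, otherwise the int8_t is printed as a character.
    std::cout << static_cast<int>(narrow) << std::endl;
}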
Related
#include <iostream>
#include <cstdint>
int main() {
std::uint8_t i{5}; // direct initialization
std::cout << i;
return 0;
}
I am not able to get the value 5; I get something else instead.
Why does this code give me some other ASCII value rather than the value 5?
Use
std::cout << static_cast<int>( i );
The type std::uint8_t is, in practice, defined as an alias for unsigned char, so operator<< inserts it as a character rather than as a number.
Here is a demonstrative program.
#include <iostream>
#include <cstdint>
int main()
{
std::uint8_t i { 65 };
std::cout << i << '\n';
std::cout << static_cast<int>( i ) << '\n';
}
Its output is
A
65
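As a side note, a small alternative to the cast is the unary plus operator, which promotes the uint8_t to int before streaming:
#include <cstdint>
#include <iostream>
int main()
{
    std::uint8_t i{ 5 };
    std::cout << +i << '\n'; // unary plus promotes i to int, so 5 is printed
}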
I have this test snippet
#include <boost/any.hpp>
#include <iostream>
#include <vector>
#include <bitset>
#include <string>
class wrapper {
int value;
char character;
std::string str;
public:
wrapper(int i, char c, std::string s) {
value = i;
character = c;
str = s;
}
void get_data(){
std::cout << "Value = " << value << std::endl;
std::cout << "Character = " << character << std::endl;
std::cout << "String= " << str << std::endl;
}
};
int main(){
std::vector<boost::any> container;
container.push_back(10);
container.push_back(1.4);
container.push_back("Mayukh");
container.push_back('A');
container.push_back(std::bitset<16>(255) );
wrapper wrap(20, 'M', "Alisha");
container.push_back(wrap);
std::cout << boost::any_cast<int>(container[0]) << std::endl;
std::cout << boost::any_cast<double>(container[1]) << std::endl;
std::cout << boost::any_cast<std::string>(container[2]);
std::cout << boost::any_cast<char>(container[3]) << std::endl;
std::cout << boost::any_cast<std::bitset<16>>(container[4]);
auto p = boost::any_cast<wrapper>(container[5]);
p.get_data();
return 0;
}
Here boost::any_cast throws a bad_any_cast exception for std::string. It means that for some reason it is not able to cast the boost::any back to std::string, while other types like bitset or my own user-defined class work. Can you please tell me why, and a way out of this?
"Mayukh" is not a std::string, it is a const array of 7 characters {'M', 'a', 'y', 'u', 'k', 'h', '\0'}. In C++14, "Mayukh"s is a std::string after using namespace std::literals::string_literals;.
In C++11, std::string("Mayukh") is a std::string as well.
boost::any only supports converting back to the exact same type (well, up to some decay/const/etc). It does not support conversions between the types. See boost any documentation:
Discriminated types that contain values of different types but do not attempt conversion between them, i.e. 5 is held strictly as an int and is not implicitly convertible either to "5" or to 5.0. Their indifference to interpretation but awareness of type effectively makes them safe, generic containers of single values, with no scope for surprises from ambiguous conversions.
Augmenting any with extra smart conversions can be done: for example, a pseudo-any that takes an incoming type and auto-converts it before storing it (so it never stores shorts: it converts all signed integral types to int64_t and unsigned ones to uint64_t, converts "hello" to std::string("hello"), and so on).
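A very small sketch of that idea (this store helper is hypothetical, not part of boost::any): normalize C strings to std::string before wrapping them, so the type you get back is predictable. The integral widening mentioned above could be added the same way with extra overloads.
#include <boost/any.hpp>
#include <iostream>
#include <string>

// Hypothetical helper: C strings (including string literals) are stored
// as std::string instead of const char*.
inline boost::any store(const char* s) { return boost::any(std::string(s)); }

// Everything else is stored as its own type.
template <class T>
boost::any store(const T& v) { return boost::any(v); }

int main()
{
    boost::any a = store("Mayukh"); // held as std::string
    boost::any b = store(42);       // held as int
    std::cout << boost::any_cast<std::string>(a) << ' '
              << boost::any_cast<int>(b) << '\n';
}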
That's because "Mayukh" is not a std::string. It's a const char[7], which would decay into const char*:
boost::any a = "Mayukh";
std::cout << a.type().name() << '\n'; // prints PKc, pointer to const char
if (boost::any_cast<const char*>(&a)) {
std::cout << "yay\n"; // prints yay
}
If you want to be able to use any_cast<std::string>, you'd need to put it in as a std::string:
container.push_back(std::string("Mayukh"));
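Or, using the C++14 literal mentioned above (a minimal sketch, assuming a C++14 compiler):
#include <boost/any.hpp>
#include <iostream>
#include <string>

int main()
{
    using namespace std::literals::string_literals;

    boost::any a = "Mayukh"s; // stored as std::string, not const char*
    std::cout << boost::any_cast<std::string>(a) << '\n';
}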
This is not an answer to the question body but rather to the title, to help others who also come here from Google:
bool is_char_ptr(const boost::any & operand)
{
try {
boost::any_cast<char *>(operand);
return true;
}
catch (const boost::bad_any_cast &) {
return false;
}
}
std::string any2string(boost::any anything)
{
if (anything.type() == typeid(int)) {
return std::to_string( boost::any_cast<int>(anything) );
}
if (anything.type() == typeid(double)) {
return std::to_string(boost::any_cast<double>(anything));
}
if (is_char_ptr(anything)) {
return std::string(boost::any_cast<char *>(anything));
}
if (boost::any_cast<std::string>(&anything)) {
return boost::any_cast<std::string>(anything);
}
// no known conversion for the held type
throw boost::bad_any_cast();
}
The last if looks weird, but it works because any_cast is overloaded: passing a pointer (&anything) selects the overload that returns a null pointer instead of throwing when the held type does not match.
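A quick usage sketch of the helper above (it assumes the any2string and is_char_ptr definitions from this answer are in scope):
#include <boost/any.hpp>
#include <iostream>
#include <string>

int main()
{
    std::cout << any2string(boost::any(42)) << '\n';                   // "42"
    std::cout << any2string(boost::any(3.5)) << '\n';                  // "3.500000"
    std::cout << any2string(boost::any(std::string("hello"))) << '\n'; // "hello"
}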
I have a template function that takes an argument of integral type and copies it to a character array on stack with std::snprintf:
static const size_t size = 256;
char buffer[size];
template <class T, typename std::enable_if<std::is_integral<T>::value, int>::type = 0>
bool to_array(T integer) {
auto ret = std::snprintf(buffer, size, "%lld", integer);
return ret > 0;
}
The problem is that if this function is used with int, for example, the compiler warns that the "%lld" format specifier requires a long long int argument.
To fix it, I used boost::fusion::map:
namespace bf = boost::fusion;
using integral_masks = bf::map<
bf::pair<char, const char*>,
bf::pair<short, const char*>,
....
bf::pair<unsigned long long, const char*>
>;
integral_masks masks(
bf::make_pair<char>("%c"),
bf::make_pair<int>("%d"),
....
bf::make_pair<unsigned long>("%lu"),
bf::make_pair<unsigned long long>("%llu")
);
auto ret = std::snprintf(buffer, size, bf::at_key<T>(masks), integer);
This works; however, it looks a bit heavy, and the boost::fusion headers increase compile times dramatically. Is there a better and easier way to do it?
Since you're "trying to avoid allocation" and you're using Boost anyway: use Boost Iostreams custom devices.
PS In case it's not obvious: by using streams you get all the goodness:
combine with Boost Format if you want printf style or positional argument format strings
combine with Boost Locale for localized messages (gettext) and formatting (ordinals, dates, numerics, ...)
Live On Coliru
#include <array>
#include <boost/iostreams/device/array.hpp>
#include <boost/iostreams/stream.hpp>
#include <cstdint>
#include <iostream>
#include <limits>
namespace io = boost::iostreams;
int main()
{
std::array<char, 128> buf;
auto b = buf.begin(), e = buf.end();
io::array_sink as(b, e);
io::stream<io::array_sink> os(as);
os << '1' << uint16_t(42) << uint32_t(42) << std::showbase << std::hex << int64_t(-1) << "\n"
<< std::boolalpha << false << "\n"
<< std::numeric_limits<double>::infinity();
std::cout << "result '" << std::string(b, os.tellp()) << "'\n";
}
This will just stop writing output after buf has been filled.
Realistically, you might just want the back_inserter. That way you get the best of both worlds: control over allocations while not restricting yourself to an arbitrary limit.
See also std::string::reserve for further optimizations. You can reuse the string as often as you wish without incurring more allocations.
Live On Coliru
#include <boost/iostreams/device/back_inserter.hpp>
#include <boost/iostreams/stream.hpp>
#include <cstdint>
#include <iostream>
#include <limits>
#include <string>
namespace io = boost::iostreams;
int main()
{
std::string buf;
io::stream<io::back_insert_device<std::string> > os(io::back_inserter(buf));
os << '1' << uint16_t(42) << uint32_t(42) << std::showbase << std::hex << int64_t(-1) << "\n"
<< std::boolalpha << false << "\n"
<< std::numeric_limits<double>::infinity();
os.flush();
std::cout << "result '" << buf << "'\n";
}
Both of the examples above use Boost Iostreams in header-only mode (no runtime dependency on Boost shared libraries).
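For the point about reusing the string, here is a minimal sketch (same back_insert_device setup as above) that keeps one reserved buffer and clears it between messages:
#include <boost/iostreams/device/back_inserter.hpp>
#include <boost/iostreams/stream.hpp>
#include <iostream>
#include <string>

namespace io = boost::iostreams;

int main()
{
    std::string buf;
    buf.reserve(256); // one up-front allocation, reused for every message

    io::stream<io::back_insert_device<std::string> > os(io::back_inserter(buf));

    for (int i = 0; i < 3; ++i) {
        buf.clear();            // drop the previous contents, keep the capacity
        os << "message #" << i;
        os.flush();             // make sure everything has reached buf
        std::cout << buf << "\n";
    }
}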
You may use overloaded constexpr functions:
constexpr const char* format_of(char) { return "%c"; }
constexpr const char* format_of(int) { return "%d"; }
constexpr const char* format_of(unsigned long) { return "%lu"; }
constexpr const char* format_of(unsigned long long) { return "%llu"; }
Live example
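For reference, a self-contained sketch of how those overloads might plug into the question's to_array (the overloads and the question's buffer are repeated here so it compiles on its own; main is only a hypothetical driver):
#include <cstdio>
#include <type_traits>

static const std::size_t size = 256;
static char buffer[size];

constexpr const char* format_of(char)               { return "%c";   }
constexpr const char* format_of(int)                { return "%d";   }
constexpr const char* format_of(unsigned long)      { return "%lu";  }
constexpr const char* format_of(unsigned long long) { return "%llu"; }

template <class T,
          typename std::enable_if<std::is_integral<T>::value, int>::type = 0>
bool to_array(T integer)
{
    // The overload picked for format_of matches the type of `integer`,
    // so the format specifier and the argument always agree.
    auto ret = std::snprintf(buffer, size, format_of(integer), integer);
    return ret > 0;
}

int main()
{
    return to_array(42) ? 0 : 1; // writes "42" into buffer
}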
lexical_cast throws an exception in the following case. Is there a way to use lexical_cast and convert the string to an integer?
#include <iostream>
#include "boost/lexical_cast.hpp"
#include <string>
int main()
{
std::string src = "124is";
int iNumber = boost::lexical_cast<int>(src);
std::cout << "After conversion " << iNumber << std::endl;
}
I understand that I can use atoi instead of boost::lexical_cast.
If I'm understanding your requirements correctly it seems as though removing the non-numeric elements from the string first before the lexical_cast will solve your problem. The approach I outline here makes use of the isdigit function which will return true if the given char is a digit from 0 to 9.
#include <iostream>
#include "boost/lexical_cast.hpp"
#include <string>
#include <algorithm>
#include <cctype> //for isdigit
struct is_not_digit{
bool operator()(char a) { return !isdigit(a); }
};
int main()
{
std::string src = "124is";
src.erase(std::remove_if(src.begin(),src.end(),is_not_digit()),src.end());
int iNumber = boost::lexical_cast<int>(src);
std::cout << "After conversion " << iNumber << std::endl;
}
boost::lexical_cast uses a stringstream to convert from string to other types, so you must make sure the whole string can be converted; otherwise it throws a bad_lexical_cast exception. Here is an example:
#include <boost/lexical_cast.hpp>
#include <iostream>
#include <string>
#define ERROR_LEXICAL_CAST 1
int main()
{
using boost::lexical_cast;
int a = 0;
double b = 0.0;
std::string s = "";
int e = 0;
try
{
// ----- string --> int
a = lexical_cast<int>("123");//good
b = lexical_cast<double>("123.12");//good
// ----- string --> std::string good
s = lexical_cast<std::string>("123456.7");
// ----- bad
e = lexical_cast<int>("abc");
}
catch(boost::bad_lexical_cast& e)
{
// bad lexical cast: source type value could not be interpreted as target
std::cout << e.what() << std::endl;
return ERROR_LEXICAL_CAST;
}
std::cout << a << std::endl; // cout:123
std::cout << b << std::endl; //cout:123.12
std::cout << s << std::endl; //cout:123456.7
return 0;
}
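If the goal is to read the leading number and simply ignore the trailing characters (rather than treating them as an error), a sketch using std::strtol avoids both the exception and the erase step:
#include <cstdlib>
#include <iostream>
#include <string>

int main()
{
    std::string src = "124is";
    char* end = nullptr;
    // strtol parses the leading integer and sets `end` to the first
    // unparsed character ("is" in this case).
    long value = std::strtol(src.c_str(), &end, 10);
    std::cout << "After conversion " << value
              << " (unparsed: \"" << end << "\")" << std::endl;
}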
The following code converts a std::string to int, and the problem is that it cannot discern whether the input is a true integer or just a random string. Is there a systematic method for dealing with such a problem?
#include <string>
#include <iostream>
#include <sstream>
int main()
{
std::string str = "H";
int int_value;
std::istringstream ss(str);
ss >> int_value;
std::cout<<int_value<<std::endl;
return 0;
}
EDIT: This is the solution that I liked because it is very minimal and elegant! I only needed positive numbers, but the same check works for negative ones too.
#include <string>
#include <iostream>
#include <sstream>
int main()
{
std::string str = "2147483647";
int int_value;
std::istringstream ss(str);
if (ss >> int_value)
std::cout << "Hooray!" << std::endl;
std::cout<<int_value<<std::endl;
str = "-2147483648";
std::istringstream negative_ss(str);
if (negative_ss >> int_value)
std::cout << "Hooray!" << std::endl;
std::cout<<int_value<<std::endl;
return 0;
}
You may try to use Boost lexical_cast; it will throw an exception if the cast fails.
int number;
try
{
number = boost::lexical_cast<int>(str);
}
catch(boost::bad_lexical_cast& e)
{
std::cout << str << " isn't an integer number" << std::endl;
}
EDIT
According to @chris, you may also try to use std::stoi since C++11. It will throw a std::invalid_argument exception if no conversion could be performed. You can find more information here: std::stoi
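A small sketch of the std::stoi route, using its pos parameter to also detect leftover characters:
#include <cstddef>
#include <iostream>
#include <stdexcept>
#include <string>

int main()
{
    for (const std::string str : {"100", "100x", "H"}) {
        try {
            std::size_t pos = 0;
            int number = std::stoi(str, &pos); // parses the leading digits
            if (pos != str.size())
                std::cout << str << " has leftover characters after " << number << '\n';
            else
                std::cout << str << " -> " << number << '\n';
        }
        catch (const std::invalid_argument&) {
            std::cout << str << " isn't an integer number\n";
        }
        catch (const std::out_of_range&) {
            std::cout << str << " is out of range for int\n";
        }
    }
}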
WhozCraig's approach is much nicer, and I wanted to expand on it using the approach that the C++ FAQ uses, which is as follows:
#include <iostream>
#include <sstream>
#include <string>
#include <stdexcept>
class BadConversion : public std::runtime_error {
public:
BadConversion(std::string const& s)
: std::runtime_error(s)
{ }
};
inline int convertToInt(std::string const& s,
bool failIfLeftoverChars = true)
{
std::istringstream i(s);
int x;
char c;
if (!(i >> x) || (failIfLeftoverChars && i.get(c)))
throw BadConversion("convertToInt(\"" + s + "\")");
return x;
}
int main()
{
std::cout << convertToInt( "100" ) << std::endl ;
std::cout << convertToInt( "-100" ) << std::endl ;
std::cout << convertToInt( " -100" ) << std::endl ;
std::cout << convertToInt( " -100 ", false ) << std::endl ;
// The next two will fail
std::cout << convertToInt( " -100 ", true ) << std::endl ;
std::cout << convertToInt( "H" ) << std::endl ;
}
This is robust and will detect when the conversion fails; you can also optionally choose to fail on leftover characters.
/* isdigit example */
#include <stdio.h>
#include <stdlib.h>
#include <ctype.h>
int main ()
{
char str[]="1776ad";
int year;
if (isdigit(str[0]))
{
year = atoi (str);
printf ("The year that followed %d was %d.\n",year,year+1);
}
return 0;
}
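To tie this back to the original question, here is a short sketch of my own (the is_all_digits helper is not from the answers above) that uses the same isdigit idea to validate the whole string up front, so "H" or "1776ad" is rejected before any conversion:
#include <algorithm>
#include <cctype>
#include <iostream>
#include <string>

// Returns true only if the string is non-empty and every character is a digit.
bool is_all_digits(const std::string& s)
{
    return !s.empty() &&
           std::all_of(s.begin(), s.end(),
                       [](unsigned char c) { return std::isdigit(c); });
}

int main()
{
    for (const std::string s : {"1776", "1776ad", "H"}) {
        if (is_all_digits(s))
            std::cout << s << " is a number\n";
        else
            std::cout << s << " is not a (non-negative) number\n";
    }
}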