May I know how I can convert a std::string to the MSVC-specific __int64?
You can use the CRT conversion functions _atoi64, _atoi64_l, _wtoi64, or _wtoi64_l:
std::string str = "1234";
__int64 v = _atoi64(str.c_str());
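The wide-character variants work the same way on a std::wstring (a small sketch, not from the original answer):
std::wstring wstr = L"1234";
__int64 wv = _wtoi64(wstr.c_str());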
See also this link (although it is for Linux/Unix): Why doesn't C++ reimplement C standard functions with C++ elements/style?
Here's one way:
#include <iostream>
#include <string>
#include <sstream>
using namespace std;
int main() {
    string s("1234567890987654321");
    stringstream strm(s);
    __int64 x;
    strm >> x;
    cout << x;
}
__int64, while an extension, is still just a numeric type. Use whichever method you would typically use.
Boost lexical_cast is my favorite. It pretty much wraps up Michael's answer in an easy-to-use form:
__int64 x = boost::lexical_cast<__int64>("3473472936");
If you can't use Boost, you can still do a pretty good job of making a simple version. Here's an implementation I wrote for another answer:
#include <sstream>
#include <string>
#include <typeinfo> // for std::bad_cast

template <typename R>
const R lexical_cast(const std::string& s)
{
    std::stringstream ss(s);
    R result;
    // fail if extraction fails, or if anything but whitespace is left over
    if ((ss >> result).fail() || !(ss >> std::ws).eof())
    {
        throw std::bad_cast();
    }
    return result;
}
It does some extras, like checking for trailing characters ("123125asd" would fail). If the cast cannot be made, std::bad_cast is thrown (similar to Boost).
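For example, a usage sketch (assuming the template above and <iostream> are in scope):
try {
    __int64 x = lexical_cast<__int64>("3473472936");
    std::cout << x << '\n';
}
catch (const std::bad_cast&) {
    std::cerr << "not a valid number\n";   // thrown on malformed or trailing input
}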
Also, if you have access to Boost, you can avoid the MSVC-specific __int64 extension with:
#include <boost/cstdint.hpp>
typedef boost::int64_t int64;
to get int64 on any platform that provides it, without changing your code.
I'm a complete noob at C++, and the first problem I am encountering is the following:
no operator >> matches these operands
#include "pch.h"
#include <iostream>
#include <string>
using namespace std;
int main()
{
    cout << "hello world!";
    cin >> "hello world!"; // error: no operator >> matches these operands
}
std::cin needs to write to a variable, but you are passing it a const char[13] string literal instead.
You need to pass it something like a std::string instead:
std::string str;
std::cin >> str;
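Putting it together, a corrected version of the program might look like this (a sketch; the added output line is only illustrative):
#include <iostream>
#include <string>

int main()
{
    std::cout << "hello world!";
    std::string str;
    std::cin >> str;                          // reads one whitespace-delimited word
    std::cout << "you typed: " << str << '\n';
}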
P.S. This is a good time to a) read compiler messages, b) avoid using namespace std; globally, c) get a good C++ book.
I am trying to generate boost::uuids::uuid from boost::compute::detail::sha1 in this way:
#include <iostream>
#include "boost/uuid/uuid.hpp"
#include "boost/uuid/uuid_io.hpp"
#include "boost/uuid/string_generator.hpp"
#include "boost/compute/detail/sha1.hpp"
int main(int argc, char* argv[])
{
    try
    {
        boost::compute::detail::sha1 sha1("b888e35f9edf3794760392e1066d69-f43d-452e-8475-a09bae9a2e8500000000-0000-0000-0000-000000000000");
        std::string str = sha1;
        boost::uuids::uuid uuid = boost::uuids::string_generator()(str); // ERROR HERE!!
    }
    catch (std::exception& e)
    {
        std::cerr << "Error occurred: " << e.what() << std::endl;
    }
    return 0;
}
But this code fails with the error Error occurred: invalid uuid string (see above).
I am using Visual Studio 2017 and Boost 1.67.
Where is my mistake? How do I generate a boost::uuids::uuid from boost::compute::detail::sha1?
PS: This code worked on previous Boost versions.
The proper, supported approach to getting a UUID from the SHA1 hash of an arbitrary string is as follows:
#include <string_view>
#include <boost/uuid/uuid.hpp>
#include <boost/uuid/name_generator_sha1.hpp>
boost::uuids::uuid uuid_from_string(std::string_view const input) {
    static constexpr boost::uuids::uuid ns_id{}; // †null root, change as necessary
    return boost::uuids::name_generator_sha1{ns_id}(input.data(), input.size());
}
Online Demo
(std::string_view is used for exposition; use std::string or char const* or whatever as appropriate if this isn't ideal for you.)
While this is the correct approach, there are two important notes:
As per RFC 4122, you need to provide a namespace for your UUID; basically this is salt for your SHA1 hash. There are predefined namespaces for DNS names, URLs, ISO OIDs, and X.500 distinguished names, but as your input doesn't appear to match any of those you need to define your own; as indicated on the line marked †, the null namespace is used here for exposition. A more detailed explanation of UUID namespaces can be found in this SO answer: Generating v5 UUID. What is name and namespace?
The output from this code will differ entirely from the output of the code in your question; if you need the output to match the old code, you can do the following:
#include <cstring>
#include <string_view>
#include <boost/endian/conversion.hpp>
#include <boost/uuid/uuid.hpp>
#include <boost/uuid/detail/sha1.hpp>
boost::uuids::uuid uuid_from_string_old(std::string_view const input) {
    boost::uuids::detail::sha1::digest_type digest;
    {
        boost::uuids::detail::sha1 h;
        h.process_bytes(input.data(), input.size());
        h.get_digest(digest);
    }
    boost::uuids::uuid ret;
    auto p = ret.begin();
    for (std::size_t i{}; i != 4; p += 4, ++i) {
        auto const d = boost::endian::native_to_big(digest[i]);
        std::memcpy(p, &d, 4);
    }
    return ret;
}
Online Demo
This produces the same output, as can be seen from this demo using the old code with Boost 1.65.1. This goes against my usual policy of never directly using someone else's detail namespace, but unless you can find the magic namespace id (if one exists), this is the only approach using Boost that I'm aware of. N.B. this fixes a bug in the old Boost code for big-endian machines; if you need to retain that bug, change the call to boost::endian::native_to_big to invoke boost::endian::endian_reverse instead.
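For completeness, a small usage sketch of the first (supported) approach; the input string and the printing are illustrative only:
#include <iostream>
#include <boost/uuid/uuid_io.hpp>   // stream operator<< for uuid

int main() {
    boost::uuids::uuid const id = uuid_from_string("some input string");
    std::cout << id << '\n';        // prints the canonical 8-4-4-4-12 hex form
}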
I was attempting to follow the example of Finite State Filters in the Boost.Iostreams documentation. However, when I went to use the filter I got an error stating that ::imbue was not accessible because 'boost::iostreams::detail::finite_state_filter_impl' uses 'protected' to inherit from 'my_fsm'.
Frustrated, I copied my code into the tests used in the Boost examples. The tests compile and pass. My conclusion is that I am probably misusing the dual-use filter defined by:
typedef io::finite_state_filter<my_fsm> my_fsm_filter;
I feel that just pushing it onto a filtering_stream may not be proper, but I could not find the missing step. I am sure the filter must need to be wrapped somehow, but I can find no example (though I am sure that if I dug deep enough into the code used to test Boost it has to be there somewhere).
Here is a bit of example code:
#include <boost/mpl/vector.hpp>
#include <libs/iostreams/example/finite_state_filter.hpp>
namespace io = boost::iostreams;
struct my_fsm : io::finite_state_machine<my_fsm> {
    BOOST_IOSTREAMS_FSM(my_fsm) // define skip and push.
    typedef my_fsm self;
    static const int beginline = 0;
    static const int skipline = 1;
    static const int dataline = 2;
    typedef boost::mpl::vector <
        row<beginline, is<'C'>, skipline, &self::skip>,
        row<beginline, is_any, dataline, &self::push>,
        row<skipline, is<'\n'>, beginline, &self::skip>,
        row<skipline, is_any, skipline, &self::skip>,
        row<dataline, is<'\n'>, beginline, &self::push>,
        row<dataline, is_any, dataline, &self::push>
    > transition_table;
};
typedef io::finite_state_filter<my_fsm> my_fsm_filter;
#include <iostream>
#include <string>
#include <boost/iostreams/device/file.hpp>
#include <boost/iostreams/filtering_stream.hpp>
#include <boost/iostreams/stream.hpp>
namespace io = boost::iostreams;
int main() {
    io::stream<io::file_sink> out(io::file_sink("outputfile.txt"));
    io::filtering_istream in;
    my_fsm_filter infsm;
    in.push(my_fsm_filter());
    in.push(io::file_source("inputdata.txt"));
    while (in) {
        std::string line;
        if (std::getline(in, line)) {
            //std::cout << line << std::endl;
            out << line << std::endl;
        }
    }
    return 0;
}
I personally feel that there is a bug in the sample header with respect to this imbue call.
However, you can work around it by changing the typedef to
struct my_fsm_filter : io::finite_state_filter<my_fsm> {
    using io::finite_state_filter<my_fsm>::imbue;
};
This explicitly exposes the imbue method as public on the derived type. I haven't looked at the sample program that you reported to be working (because you didn't link to it). But it's possible they used a similar hack.
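With that struct in place, the usage from the question compiles unchanged (a sketch, assuming the rest of the program stays as posted):
io::filtering_istream in;
in.push(my_fsm_filter());                  // imbue is now publicly accessible
in.push(io::file_source("inputdata.txt"));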
In my tests, a similar edit to finite_state_filter.hpp (line 278) to add
using base_type::imbue;
to class finite_state_filter has the same effect.
I've written my own specialisation of each virtual member function of std::ctype<char16_t>, so that this now works:
#include <string>
#include <locale>
#include "char16_facets.h" // Header containing my ctype specialisation
#include <sstream>
#include <iostream>
// Implemented elsewhere using iconv
std::string Convert(std::basic_string<char16_t>);
int main() {
    std::basic_string<char16_t> s(u"Hello, world.");
    std::basic_stringstream<char16_t> ss(s);
    ss.imbue(std::locale(ss.getloc(), new std::ctype<char16_t>()));
    std::basic_string<char16_t> t;
    ss >> t;
    std::cout << Convert(t) << " ";
    ss >> t;
    std::cout << Convert(t) << std::endl;
}
Is there a way to make streams use the new ctype specialisation by default, so I don't have to imbue each stream with a new locale?
I haven't written a new class, just provided
template<>
inline bool std::ctype<char16_t>::do_is (std::ctype_base::mask m, char16_t c) const {
etc. I'd sort of hoped it would be picked up automatically, as long as it was declared before I #include <sstream>, but it isn't.
Most of the work for the above was done using G++ and libstdc++ 4.8, but I get the same result with them built from SVN trunk.
Edit - Update: This question originally asked how to get number extraction working. However, given a stream imbued with correct ctype and numpunct implementations, no specialisation of num_get is necessary; simply
ss.imbue(std::locale(ss.getloc(), new std::num_get<char16_t>()));
and it will work, with either gcc version.
Again, is there some way to get the streams to pick this up automatically, rather than having to imbue every stream with it?
Use std::locale::global():
std::locale::global(std::locale(std::locale(), new std::ctype<char16_t>()));
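A minimal sketch of the effect, assuming your ctype<char16_t> specialisation is in scope: streams constructed after the call pick up the global locale, so no per-stream imbue is needed (streams created before it keep whatever locale they already had).
std::locale::global(std::locale(std::locale(), new std::ctype<char16_t>()));

std::basic_stringstream<char16_t> ss(u"Hello, world.");  // uses the new global locale
std::basic_string<char16_t> t;
ss >> t;  // finds the ctype<char16_t> facet without an explicit imbue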
I am working in C++. I have a string that contains the following number:
std::string s = "8133522648";
I want to convert this number into
long long int nr;
I did nr = atoll(s.c_str()); and the result is -456410944. How can I fix this? Thanks
Edit:
In fact I have:
const char* str = "8133523648";
I have to convert it into long long int nr = 8133523648.
Thanks for the help! Appreciated!
Use int64_t instead of long long; it is defined in <stdint.h> (<cstdint> in C++).
If you rely on Boost, you can use:
#include <boost/lexical_cast.hpp>

std::string s = "8133522648";
int64_t nr = boost::lexical_cast<int64_t, std::string>(s);
It can be done in a better way as follows:
#include <sstream>

std::stringstream sstr;
sstr << "8133522648";
long long nr;
sstr >> nr;
Don't rely on atoll(): it was not part of the C++ standard before C++11, so some compilers implement it while others don't. Also,
std::string s = 8133522648;
doesn't mean
std::string s = "8133522648";
which was probably what you wanted.
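If your compiler supports C++11, std::stoll from <string> is a portable alternative (a sketch, not part of the original answers):
#include <iostream>
#include <string>

int main() {
    std::string s = "8133522648";
    long long nr = std::stoll(s);   // throws std::invalid_argument or std::out_of_range on bad input
    std::cout << nr;
}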
The code below works fine:
#include <iostream>
#include <cstdio>
#include <cstdlib>
#include <string>
using namespace std;

int main() {
    std::string s = "8133522648";
    long long int nr = atoll(s.c_str());
    cout << nr;
}