weird gcc behavior with unsigned ints - c++

#include <iostream>
#include <cstdint>
#include <cstdio>
using namespace std;
int main()
{
    uint16_t ii;
    std::cin >> ii;
    printf("%d\n", ii); // ii is promoted to int in the varargs call, so %d is fine
}
When I give the input 5, the output is also 5. But when I change the type of ii to uint8_t, I do not get 5 but 53, which seems to be the ASCII value of '5'. Is this expected?

uint8_t is allowed (but not required) to be a typedef for char (if char happens to be unsigned) or unsigned char, and stream input for those types is done as characters, not numbers. So this is valid, but not required, behaviour.
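If you want the numeric reading behaviour with a uint8_t, one workaround is to read into a wider type first and then narrow; a minimal sketch:
#include <cstdint>
#include <iostream>
int main()
{
    unsigned int tmp;
    std::cin >> tmp;                        // reads "5" as the number 5
    uint8_t ii = static_cast<uint8_t>(tmp); // narrows; values > 255 wrap
    std::cout << static_cast<int>(ii) << '\n';
}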

Filling a vector of bytes with random bytes

I wanted to fill a std::vector<BYTE> with random or pseudo-random bytes of data. I found the following source code on Stack Overflow, but it doesn't compile in my Visual Studio.
#include <Windows.h>
#include <vector>
#include <random>
#include <climits>
#include <algorithm>
#include <functional>
using random_bytes_engine = std::independent_bits_engine<std::default_random_engine, CHAR_BIT, BYTE>;
int main()
{
    random_bytes_engine rbe;
    std::vector<BYTE> data(1000);
    std::generate(data.begin(), data.end(), std::ref(rbe));
}
When I try to compile the above code, Visual Studio gives me the following errors:
Error C2338 note: char, signed char, unsigned char, char8_t, int8_t, and uint8_t are not allowed
Error C2338 invalid template argument for independent_bits_engine: N4659 29.6.1.1 [rand.req.genl]/1f requires one of unsigned short, unsigned int, unsigned long, or unsigned long long
The BYTE type, which is just an alias for unsigned char, is not an allowed type for the UIntType parameter of
template<class Engine, std::size_t W, class UIntType>
class independent_bits_engine;
The standard, [rand.req.genl]/1.f, reads:
Throughout this subclause [rand], the effect of instantiating a template:
...
that has a template type parameter named UIntType is undefined unless the corresponding template argument is cv-unqualified and is one of unsigned short, unsigned int, unsigned long, or unsigned long long.
The answer from Evg is correct.
If you really want random bytes only, I would use a custom generator function that produces values in [CHAR_MIN, CHAR_MAX] (typically [-128, 127]) or any desired range.
For instance:
#include <iostream>
#include <Windows.h>
#include <vector>
#include <random>
#include <algorithm>
#include <climits> // CHAR_MIN, CHAR_MAX
int main()
{
    std::random_device r;
    std::default_random_engine randomEngine(r());
    std::uniform_int_distribution<int> uniformDist(CHAR_MIN, CHAR_MAX);
    std::vector<BYTE> data(1000);
    std::generate(data.begin(), data.end(), [&uniformDist, &randomEngine]() {
        return (BYTE)uniformDist(randomEngine);
    });
    for (auto i : data) {
        std::cout << int(i) << std::endl;
    }
    return 0;
}
References:
https://en.cppreference.com/w/cpp/numeric/random
https://en.cppreference.com/w/cpp/algorithm/generate
Just do this instead:
using random_bytes_engine = std::independent_bits_engine<std::default_random_engine, 32, uint32_t>;
This turns the engine into a 32-bit random number generator, but using it to fill a vector of BYTEs works just fine; each 32-bit value is simply truncated to its low eight bits on assignment.
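For reference, a minimal sketch of that approach in full, assuming BYTE is the usual Windows alias for unsigned char:
#include <vector>
#include <random>
#include <algorithm>
#include <functional>
#include <cstdint>
using BYTE = unsigned char; // stand-in for the Windows typedef
using random_bytes_engine = std::independent_bits_engine<std::default_random_engine, 32, uint32_t>;
int main()
{
    random_bytes_engine rbe;
    std::vector<BYTE> data(1000);
    // Each 32-bit result is implicitly truncated to its low 8 bits
    // when assigned to a BYTE element.
    std::generate(data.begin(), data.end(), std::ref(rbe));
}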

How do I set a bitset for a variable with an integer name

I want to set the bitset of the char '0' equal to 0101010101, but when I try I get the error "expected an identifier".
#include <iostream>
#include <string>
#include <bitset>
using namespace std;
int main() {
    bitset<8> '0' = 0101010101;
}
I've also tried
bitset <8> 0(string("0101010101"));
but I get the same error
You can use an unordered_map to set up a one-to-one mapping between an int and a bitset. The length of the sample 0101010101 is 10, so the size of the bitset will be 10, and binary 0101010101 = 341 in decimal.
#include <iostream>
#include <unordered_map>
#include <bitset>
std::unordered_map<int, std::bitset<10>> M {
    {0, 341},
    {1, ...},
    ...
};
int main()
{
    std::cout << M[0] << std::endl;
}
0 is an int literal and '0' is a char literal; neither is a valid variable name.
You can use _0 as a variable name, or even better, use a name that describes what the variable is used for.
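For instance, a minimal sketch of a valid version, using the hypothetical name bits0 instead of '0':
#include <bitset>
#include <iostream>
#include <string>
int main()
{
    // Identifiers cannot start with a digit, so pick a descriptive name.
    std::bitset<10> bits0(std::string("0101010101"));
    std::cout << bits0 << '\n';            // prints 0101010101
    std::cout << bits0.to_ulong() << '\n'; // prints 341
}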

Strange Number Conversion C++

So I have the following code:
#include <iostream>
#include <string>
#include <array>
using namespace std;
int main()
{
    array<long, 3> test_vars = { 121, 319225, 15241383936 };
    for (long test_var : test_vars) {
        cout << test_var << endl;
    }
}
In Visual Studio I get this output:
121
319225
-1938485248
The same code executed on the website cpp.sh gave the following output:
121
319225
15241383936
I expect the output to be like the one from cpp.sh; I don't understand the output from Visual Studio. It's probably something simple, but I'd appreciate it nonetheless if someone could tell me what's wrong. It has become a real source of annoyance to me.
MSVC uses a 4-byte long. The C++ standard only guarantees long to be at least as large as int, so with a 32-bit long the maximum number representable by a signed long is 2,147,483,647. Your input 15241383936 is too large for that, and you will have to use a larger data type with at least 64 bits.
The other compiler used a 64-bit-wide long, which is why it worked there.
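You can check which case you are in with a trivial sketch:
#include <iostream>
int main()
{
    // 4 on MSVC (LLP64); typically 8 on 64-bit Linux with gcc (LP64)
    std::cout << sizeof(long) << '\n';
}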
You could use int64_t, which is defined in the <cstdint> header and guarantees a 64-bit signed integer.
Your program would read:
#include <cstdint>
#include <iostream>
#include <array>
using namespace std;
int main()
{
    array<int64_t, 3> test_vars = { 121, 319225, 15241383936 };
    for (int64_t test_var : test_vars) {
        cout << test_var << endl;
    }
}

strtoull was not declared in this scope while converting?

I am working with C++ in Eclipse CDT and I am trying to convert a string to uint64_t by using strtoull, but every time I get the error message below:
..\src\HelloTest.cpp:39:42: error: strtoull was not declared in this scope
Below is my C++ example
#include <iostream>
#include <cstring>
#include <string>
using namespace std;
int main() {
string str = "1234567";
uint64_t hashing = strtoull(str, 0, 0);
cout << hashing << endl;
}
return 0;
}
Is there anything I am doing wrong?
Why your solution doesn't work has already been pointed out by others. But there hasn't been a good alternative suggested yet.
Try this for C++03 strtoull usage instead:
#include <string>
#include <cstdlib>
int main()
{
    std::string str = "1234";
    // Using NULL for the second parameter makes the call easier,
    // but reduces your chances to recover from error. Check
    // the docs for details.
    unsigned long long ul = std::strtoull( str.c_str(), NULL, 0 );
}
Or, since C++11, do it directly from std::string via stoull (which is just a wrapper for the above, but saves on one include and one function call in your code):
#include <string>
int main()
{
    std::string str = "1234";
    // See comment above.
    unsigned long long ul = std::stoull( str, nullptr, 0 );
}
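Note that std::stoull reports failure via exceptions rather than strtoull's errno/end-pointer protocol; a minimal sketch of handling that, assuming malformed input is possible:
#include <iostream>
#include <stdexcept>
#include <string>
int main()
{
    try {
        unsigned long long ul = std::stoull("not a number", nullptr, 0);
        std::cout << ul << '\n';
    } catch (const std::invalid_argument&) {
        std::cerr << "no conversion could be performed\n";
    } catch (const std::out_of_range&) {
        std::cerr << "value out of range for unsigned long long\n";
    }
}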
Never use char[] or pointers if you have a working alternative. The dark side of C++, they are. Quicker, easier, more seductive. If once you start down the dark path, forever will it dominate your destiny, consume you it will. ;-)
The signature of strtoull is strtoull(const char *, char **, int).
You have given it a std::string, as pointed out by @juanchopanza.
This is the solution I came up with:
#include <iostream>
#include <cstring>
#include <string>
#include <cstdlib>
using namespace std;
int main() {
    char str[] = "1234567";
    unsigned long long ul;
    char* new_pos = 0; // receives a pointer past the last parsed character
    ul = strtoull(str, &new_pos, 0);
    cout << ul << endl;
    return 0;
}
The output I got was: 1234567
Straight from the Eclipse console.
Also, at the end of your program you have a return 0 out of scope, with an extra curly brace.

c++ long long int is not enough?? errors

I am working in C++. I have a string that contains the following number:
std::string s= "8133522648";
I want to convert this number in
long long int nr;
I did nr = atoll(s.c_str()). The result is -456410944. How do I solve this error? Thanks
Edit:
In fact I have:
const char* str="8133523648";
I have to convert it into long long int nr=8133523648
Thanks for the help, I appreciate it!
Use int64_t instead of long long; it is defined in stdint.h (<cstdint> in C++).
If you rely on Boost, you can use:
std::string s= "8133522648";
int64_t nr = boost::lexical_cast<int64_t, std::string>(s);
It can be done in a better way as follows:
#include <iostream>
#include <sstream>
using namespace std;
int main() {
    stringstream sstr;
    sstr << "8133522648";
    long long nr;
    sstr >> nr;
    cout << nr << endl;
}
Don't use atoll(), as it was not defined by the C++ standard before C++11; some compilers implement it while others don't. Also,
std::string s = 8133522648;
doesn't mean
std::string s = "8133522648";
which was probably what you wanted.
The code below works fine:
#include <iostream>
#include <string>
#include <cstdlib>
using namespace std;
int main() {
    std::string s = "8133522648";
    long long int nr = atoll(s.c_str());
    cout << nr;
}
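For completeness: since C++11 the same conversion can be done without atoll() via std::stoll, which throws on bad input instead of silently failing; a minimal sketch:
#include <iostream>
#include <string>
int main()
{
    std::string s = "8133522648";
    long long nr = std::stoll(s); // throws std::invalid_argument / std::out_of_range on bad input
    std::cout << nr << '\n';      // prints 8133522648
}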