boost asio with little endian - c++

I am integrating a library that sends a length encoded in little endian, followed by a custom serialized object; the little-endian length tells me how many bytes of the serialized object to read. How do I convert the 4-byte char buffer into an int?
So if I receive "\x00\x00\x00H\x00", I would like to be able to get the decimal value out.
My receiving code looks like:
char buffer_size[size_desc];
m_socket->receive(boost::asio::buffer(buffer_size, size_desc));
int converted_int = some_function(buffer_size); // <-- not sure what to do here
char buffer_obj[converted_int];
m_socket->receive(boost::asio::buffer(buffer_obj, converted_int));

For a simple solution you could use a couple of tricks.
Reverse with a cast:
// #include <stdafx.h>
#include <algorithm>
#include <iostream>

int main()
{
    char buff[4] = {3, 2, 1, 0};

    // Interpret the four raw bytes as an int in the order they arrived.
    std::cout << (*reinterpret_cast<int*>(&buff[0])) << "\n";

    // Reverse the byte order in place, then reinterpret again.
    std::reverse(buff, buff + 4);
    std::cout << (*reinterpret_cast<int*>(&buff[0]));
    return 0;
}
Boost also comes with an endianness library:
https://www.boost.org/doc/libs/1_74_0/libs/endian/doc/html/endian.html#buffers
You can use the built-in types, such as:
big_int32_t
little_int16_t
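For example, here is a minimal sketch (not from the original answer, with made-up wire bytes) of decoding a 4-byte little-endian length with one of those types:

#include <boost/endian/arithmetic.hpp>
#include <cstring>
#include <iostream>

int main()
{
    // Pretend these four bytes came off the socket: 72 in little-endian order.
    char wire[4] = {'\x48', '\x00', '\x00', '\x00'};

    // little_uint32_t stores its bytes in little-endian order but behaves
    // like a normal unsigned integer when read.
    boost::endian::little_uint32_t len;
    std::memcpy(&len, wire, sizeof len);

    std::cout << len << "\n";   // prints 72
}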

Related

Force UTF-8 handling for std::string in fmt

In my C++17 project, I have a std::string which is known to contain UTF-8 encoded data. Is there any way to force fmt to treat its data as UTF-8 such that this works as expected?
fmt::print("{:-^11}", "あいう");
// should print "----あいう----", currently prints "-あいう-"
UTF-8 handling in {fmt} was recently improved and your example now works with the master branch:
#include <fmt/core.h>

int main() {
    fmt::print("{:-^11}", "あいう");
}
prints
----あいう----
Alternatively, pass the field width as the next argument and calculate it yourself; here the 8 visible padding characters are added to the UTF-8 byte length of the string:
#include <fmt/format.h>
#include <cstring>

int main() {
    fmt::print("{:-^{}}", "あいう", 8 + std::strlen("あいう"));
}

Strange Number Conversion C++

So I have the following code:
#include <iostream>
#include <string>
#include <array>
using namespace std;

int main()
{
    array<long, 3> test_vars = { 121, 319225, 15241383936 };
    for (long test_var : test_vars) {
        cout << test_var << endl;
    }
}
In Visual Studio I get this output:
121
319225
-1938485248
The same code executed on the website cpp.sh gave the following output:
121
319225
15241383936
I expect the output to be like the one from cpp.sh, and I don't understand the output from Visual Studio. It's probably something simple, but I'd appreciate it nonetheless if someone could tell me what's wrong. It has become a real source of annoyance to me.
MSVC uses a 4-byte long; the C++ standard only guarantees that long is at least as large as int. The maximum value a signed 32-bit long can represent is therefore 2,147,483,647. The value you are storing is too large for that long, so you have to use a larger type of at least 64 bits.
The other compiler uses a 64-bit long, which is why it worked there.
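A quick check (not part of the original answer) that you can compile with both toolchains to see the difference:

#include <iostream>
#include <limits>

int main()
{
    // MSVC typically reports 4 bytes and 2147483647, while 64-bit GCC/Clang
    // on Linux report 8 bytes and 9223372036854775807.
    std::cout << "sizeof(long) = " << sizeof(long) << " bytes\n";
    std::cout << "max long     = " << std::numeric_limits<long>::max() << "\n";
}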
You could use int64_t, which is defined in the <cstdint> header and guarantees a 64-bit signed integer.
Your program would read:
#include <cstdint>
#include <iostream>
#include <array>
using namespace std;

int main()
{
    array<int64_t, 3> test_vars = { 121, 319225, 15241383936 };
    for (int64_t test_var : test_vars) {
        cout << test_var << endl;
    }
}

Crypto++ 5.6.3rc5 GenerateBlock Not Implemented

I am trying to derive a key from a password and want to generate the salt randomly with AutoSeededRandomPool (I don't know what size the salt should be for SHA-256, and whether it matters the way the IV does for AES-256, where it must be 128 bits; a hint would be appreciated). However, an exception is caught:
RandomNumberGenerator:GenerateBlock Not Implemented
I am using Crypto++ 5.6.3rc5 with Qt 5.5.1 in /MD release mode. This may be a bug, or someone's unfinished work.
#include <QCoreApplication>

#include <iostream>
using std::cout;
using std::cerr;
using std::endl;

#include <string>
using std::string;

#include <cstring>
#include <cstdio>
#include <cstdlib>
using std::exit;

#include <ctime>

#include <cryptlib.h>
using CryptoPP::Exception;

#include <osrng.h>
using CryptoPP::AutoSeededRandomPool;

#include <sha.h>
#include <base64.h>
#include <pwdbased.h>

#include <hex.h>
using CryptoPP::HexEncoder;
using CryptoPP::HexDecoder;

#include <filters.h>
using CryptoPP::StringSink;
int main(int argc, char *argv[])
{
    QCoreApplication a(argc, argv);

    try
    {
        AutoSeededRandomPool rng;

        byte salt[16*8];
        rng.GenerateBlock(salt, 16*8);

        byte password[] = "password";
        size_t plen = strlen((const char*)password);
        size_t slen = strlen((const char*)salt);
        int c = 1;

        byte derived[32];
        CryptoPP::PKCS5_PBKDF2_HMAC<CryptoPP::SHA256> pbkdf2;
        pbkdf2.DeriveKey(derived, sizeof(derived), 0, password, plen, salt, slen, c);

        string result;
        HexEncoder encoder(new StringSink(result));
        encoder.Put(derived, sizeof(derived));
        encoder.MessageEnd();

        cout << "Derived: " << result << endl;
    }
    catch (const Exception& ex) {
        cerr << ex.what() << endl;
    }

    return a.exec();
}
Crypto++ 5.6.3rc5 GenerateBlock Not Implemented
...
You can read the history on the change at Crash in RandomNumberGenerator::GenerateWord32 due to stack recursion. The change was eventually backed out.
It was fixed in RC6, but RC6 has not been announced yet. There's a quasi-pre-RC6 at Crypto++ 5.6.3 Files. As soon as it is announced, it's set in stone and will not be changed.
Right now, RC6 is undergoing minor changes due to Cygwin, MinGW and C++11 on Debian Unstable. The changes are not too bad, but testing them is painful. Some of the scripts take half a day to run under emulated platforms, like S/390x.
If you want to sidestep the issue and avoid downloading pre-RC6, then you can use one of the following generators instead; they call GenerateIntoBufferedTransformation (a brief sketch follows below):
AutoSeededX917RNG< AES >
X917RNG
RandomPool
Or, you can use OS_GenerateRandomBlock to draw directly from the OS's pool.
Or, you can remove the code that throws. Open cryptlib.h, find RandomNumberGenerator, remove the #if 0/#endif guarding the old code and delete the throw.
Also see RandomNumberGenerator on the Crypto++ wiki.
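As a rough sketch (not taken from the answer; the 16-byte salt size is arbitrary), generating the salt with the first listed generator or with OS_GenerateRandomBlock might look like this:

#include <osrng.h>   // AutoSeededX917RNG, OS_GenerateRandomBlock
#include <aes.h>     // the block cipher used to key the X9.17 generator

int main()
{
    unsigned char salt[16];   // byte is a typedef for unsigned char

    // Alternative 1: an X9.17 generator keyed with AES instead of AutoSeededRandomPool.
    CryptoPP::AutoSeededX917RNG<CryptoPP::AES> rng;
    rng.GenerateBlock(salt, sizeof(salt));

    // Alternative 2: draw directly from the OS's pool.
    CryptoPP::OS_GenerateRandomBlock(false /* non-blocking */, salt, sizeof(salt));

    return 0;
}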

strtoull was not declared in this scope while converting?

I am working with C++ in Eclipse CDT and I am trying to convert a string to uint64_t by using strtoull, but every time I get the error message below:
..\src\HelloTest.cpp:39:42: error: strtoull was not declared in this scope
Below is my C++ example
#include <iostream>
#include <cstring>
#include <string>
using namespace std;

int main() {
    string str = "1234567";
    uint64_t hashing = strtoull(str, 0, 0);
    cout << hashing << endl;
}
    return 0;
}
Is there anything I am doing wrong?
Why your solution doesn't work has already been pointed out by others. But there hasn't been a good alternative suggested yet.
Try this for C++03 strtoull usage instead:
#include <string>
#include <cstdlib>

int main()
{
    std::string str = "1234";

    // Using NULL for the second parameter makes the call easier,
    // but reduces your chances to recover from error. Check
    // the docs for details.
    unsigned long long ul = std::strtoull( str.c_str(), NULL, 0 );
}
Or, since C++11, do it directly from std::string via stoull (which is just a wrapper for the above, but saves on one include and one function call in your code):
#include <string>

int main()
{
    std::string str = "1234";

    // See the comment above.
    unsigned long long ul = std::stoull( str, nullptr, 0 );
}
Never use char[] or pointers if you have a working alternative. The dark side of C++, they are. Quicker, easier, more seductive. If once you start down the dark path, forever will it dominate your destiny, consume you it will. ;-)
The signature of strtoull is strtoull(const char*, char**, int). You have given it a std::string, as pointed out by @juanchopanza.
This is the solution I came up with:
#include <iostream>
#include <cstring>
#include <string>
#include <cstdlib>
using namespace std;

int main() {
    char str[] = "1234567";
    unsigned long long ul;
    char* new_pos = 0;   // receives a pointer to the first unconverted character

    ul = strtoull(str, &new_pos, 0);
    cout << ul << endl;

    return 0;
}
The output I got was: 1234567
Straight from the Eclipse console.
Also, at the end of your program you have a return 0 out of scope, followed by an extra curly brace.

Formatting the string YYYYMMDD as YYYY.MM.DD using Boost

I have a std::string such as 20040531, and I want to format it as 2004.05.31.
Apart from the straightforward way of calling std::string::insert at the respective locations, is there a better way to do this using Boost?
PS. I cannot use other Boost calls to get the date/time, as this string is returned via a custom API. So this question is reduced to basic string formatting, which may not sound exciting, but I am trying to learn Boost.
You could use boost::format...
#include <string>
#include "boost/format.hpp"
#include <iostream>

int main()
{
    std::string a("20040531");
    std::cout << boost::format("%1%.%2%.%3%")
                     % a.substr(0, 4) % a.substr(4, 2) % a.substr(6, 2);
}
You specifically asked about doing this using Boost, but if you wanted to do this in C++ without introducing a dependency on Boost then you could just use a stream to achieve the same thing:
#include <sstream>
#include <string>
#include <iostream>

int main()
{
    std::stringstream s;
    std::string a("20040531");
    s << a.substr(0, 4) << '.' << a.substr(4, 2) << '.' << a.substr(6, 2);
    std::cout << s.str();
}
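For completeness, here is a sketch (not part of either answer) of the "straightforward way" the question mentions, using std::string::insert:

#include <iostream>
#include <string>

int main()
{
    std::string a("20040531");
    a.insert(6, 1, '.');      // insert before the day first, so the
    a.insert(4, 1, '.');      // earlier index is still valid afterwards
    std::cout << a << "\n";   // prints 2004.05.31
}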