Passing an int via Winsock send - C++

How can I send an int (without using third-party libraries) via the Windows Sockets send() function? It requires a (const char *) parameter.
My attempt was to serialize the int like this:
unsigned char * serialize_int(unsigned char *buffer, int value)
{
    /* Write big-endian int value into buffer; assumes 32-bit int and 8-bit char. */
    buffer[0] = value >> 24;
    buffer[1] = value >> 16;
    buffer[2] = value >> 8;
    buffer[3] = value;
    return buffer + 4;
}
but send() wants a (const char *). I'm stuck...

Ah, this is easily fixed. The compiler wants a char* (ignore the const for now), but you're passing an unsigned char*, and the only real difference between the two is how the compiler interprets each byte when it is manipulated. You can therefore simply cast the pointer from unsigned char* to char*, like this:
(char*)serialize_int(...)

const int networkOrder = htonl(value);
const int result = send(socket, reinterpret_cast<const char *>(&networkOrder), sizeof(networkOrder), 0);
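Putting the cast and htonl() together, here is a minimal sketch of sending and receiving one int in network byte order. It assumes an already-connected SOCKET named sock, and sendInt/recvInt are hypothetical helper names; short-read handling and error reporting are omitted for brevity.

#include <winsock2.h>   // link against Ws2_32.lib

// Hypothetical helper: convert to network byte order, then send the raw bytes.
bool sendInt(SOCKET sock, int value)
{
    const u_long networkOrder = htonl(static_cast<u_long>(value));
    const int sent = send(sock, reinterpret_cast<const char*>(&networkOrder),
                          sizeof(networkOrder), 0);
    return sent == (int)sizeof(networkOrder);
}

// Hypothetical helper: read the raw bytes, then convert back to host byte order.
bool recvInt(SOCKET sock, int& value)
{
    u_long networkOrder = 0;
    const int received = recv(sock, reinterpret_cast<char*>(&networkOrder),
                              sizeof(networkOrder), 0);
    if (received != (int)sizeof(networkOrder))
        return false;   // a real implementation would loop on partial reads
    value = static_cast<int>(ntohl(networkOrder));
    return true;
}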

AES-CBC and SHA-512 hash Encryption with C++ produces odd output

EDIT
This question has been half answered through the comments. I managed to get both the AES encryption and the SHA hashing to work. The problem with SHA was simple - I was hashing in Java with uppercase hex and in C++ with lowercase. AES worked after changing the type from string to unsigned char and using memcpy instead of strcpy. I'm still interested in understanding why, after encryption, the result contained the original message in plaintext alongside the binary data - regardless of the type that I was using.
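For what it's worth, the strcpy-vs-memcpy difference mentioned in this edit is easy to reproduce: strcpy stops at the first zero byte, which AES output and hex-decoded keys routinely contain, while memcpy copies a fixed length. A minimal sketch, independent of OpenSSL (all names here are made up for the demonstration):

#include <cstring>
#include <cstdio>

int main()
{
    // Binary data with an embedded zero byte, like typical AES output.
    const unsigned char cipher[8] = { 0x48, 0x00, 0x65, 0x6C, 0x6C, 0x6F, 0x21, 0x7F };

    char viaStrcpy[8] = {};
    char viaMemcpy[8] = {};

    std::strcpy(viaStrcpy, reinterpret_cast<const char*>(cipher)); // stops at the embedded 0x00
    std::memcpy(viaMemcpy, cipher, sizeof(cipher));                // copies all 8 bytes

    std::printf("strcpy copied %zu byte(s) before the terminator; memcpy copied %zu byte(s)\n",
                std::strlen(viaStrcpy), sizeof(viaMemcpy));
    return 0;
}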
I am currently working on a project in C++ that requires encryption. Normally I would use Java for this task, but due to software requirements I have chosen C++. After creating an Encryption class with the OpenSSL library, I ran a simple test with AES-CBC 256. The test encrypted a Hello World message with a hex-string key and IV, then decrypted the result. The output below shows the results.
After encryption, the binary data contains the original string in plain text as well as the hex value present in the encrypted hex string. After decryption, the original hex value for the message is shown in the output as if the process worked.
I am also having problems with creating a SHA-512 hash. The hash created in Java differs from the one created in C++. A SHA-256 HMAC, however, produces the same output in both languages.
Below is the C++ code I am using in the encryption class.
std::string Encryption::AES::cbc256(const char* data, ssize_t len, const char* key, const char* iv, bool encrypt) {
    std::string keyStr = key;
    std::string ivStr = iv;
    std::string dataStr = data;
    std::string _keyStr = Encryption::Utils::fromHex(keyStr.c_str(), 64);
    std::string _ivStr = Encryption::Utils::fromHex(ivStr.c_str(), 32);
    std::string _dataStr = Encryption::Utils::fromHex(dataStr.c_str(), dataStr.size());
    size_t inputLength = len;
    char aes_input[_dataStr.size()];
    char aes_key[32];
    memset(aes_input, 0, _dataStr.size());
    memset(aes_key, 0, sizeof(aes_key));
    strcpy(aes_input, _dataStr.c_str());
    strcpy(aes_key, _keyStr.c_str());
    char aes_iv[16];
    memset(aes_iv, 0x00, AES_BLOCK_SIZE);
    strcpy(aes_iv, _ivStr.c_str());
    const size_t encLength = ((inputLength + AES_BLOCK_SIZE) / AES_BLOCK_SIZE);
    if(encrypt) {
        char res[inputLength];
        AES_KEY enc_key;
        AES_set_encrypt_key((unsigned char*) aes_key, 256, &enc_key);
        AES_cbc_encrypt((unsigned char*) aes_input, (unsigned char *) res, inputLength, &enc_key, (unsigned char *) aes_iv, AES_ENCRYPT);
        return Encryption::Utils::toHex((unsigned char *) res, strlen(res));
    } else {
        char res[inputLength];
        AES_KEY enc_key;
        AES_set_decrypt_key((unsigned char*) aes_key, 256, &enc_key);
        AES_cbc_encrypt((unsigned char*) aes_input, (unsigned char *) res, inputLength, &enc_key, (unsigned char *) aes_iv, AES_DECRYPT);
        return Encryption::Utils::toHex((unsigned char *) res, strlen(res));
    }
}
std::string Encryption::SHA::hash512(const char *source) {
    std::string input = source;
    unsigned char hash[64];
    SHA512_CTX sha512;
    SHA512_Init(&sha512);
    SHA512_Update(&sha512, input.c_str(), input.size());
    SHA512_Final(hash, &sha512);
    std::stringstream ss;
    for(int i=0; i<sizeof(hash); i++) {
        ss << std::hex << std::setw(2) << std::setfill('0') << (int) hash[i];
    }
    return ss.str();
}
std::string Encryption::Utils::fromHex(const char* source, ssize_t size) {
    int _size = size / 2;
    char* dest = new char[_size];
    std::string input = source;
    int x=0;
    int i;
    for(i=0;i<_size; i++) {
        std::string ret = "";
        for(int y=0; y<2; y++) {
            ret += input.at(x);
            x++;
        }
        std::stringstream ss;
        ss << std::hex << ret;
        unsigned int j;
        ss >> j;
        dest[i] = (char) static_cast<int>(j);
    }
    return std::string(dest);
}
Can anyone explain to me, or offer their help, as to why I am getting the output I am getting?

Not able to print the char array as thought

#include <fstream>
#include <iostream>
#include <cstring>
using namespace std;

class Address {
public:
    char addr[6];
    Address() {}
    Address(string address) {
        size_t pos = address.find(":");
        int id = stoi(address.substr(0, pos));
        short port = (short)stoi(address.substr(pos + 1, address.size()-pos-1));
        memcpy(addr, &id, sizeof(int));
        memcpy(&addr[4], &port, sizeof(short));
    }
};

enum MsgTypes{
    JOINREQ,
    JOINREPLY,
    DUMMYLASTMSGTYPE,
    HEARTBEAT
};

/**
 * STRUCT NAME: MessageHdr
 *
 * DESCRIPTION: Header and content of a message
 */
typedef struct MessageHdr {
    enum MsgTypes msgType;
}MessageHdr;

typedef struct en_msg {
    // Number of bytes after the class
    int size;
    // Source node
    Address from;
    // Destination node
    Address to;
}en_msg;

//class Testing{
void send(Address *myaddr, Address *toaddr, char *data, int size);

int main()
{
    MessageHdr *msg=new MessageHdr();
    size_t msgsize = sizeof(MessageHdr) + sizeof(Address) + sizeof(long) + 1;
    msg=(MessageHdr *)malloc(msgsize*sizeof(char));
    int id=233;
    short port =22;
    long heartbeat=1;
    msg=(MessageHdr *)malloc(msgsize*sizeof(char));
    string s=to_string(id)+":"+to_string(port);
    string s1=to_string(id+1)+":"+to_string(port+1);
    cout<<s<<'\n';
    cout<<s1<<'\n';
    Address *addr= new Address(s);
    for (int i = 0; i < 6; i++)
        cout << addr->addr[i];
    Address *toaddr= new Address(s1);
    msg->msgType = JOINREQ;
    //cout<<(char *)msg->msgType;
    memcpy((char *)(msg+1), addr, sizeof(addr));
    memcpy((char *)(msg+1) + 1 + sizeof(addr), &heartbeat, sizeof(long));
    send(addr, toaddr, (char *)msg, msgsize);
    return 0;
}

void send(Address *myaddr, Address *toaddr, char *data, int size) {
    cout<<"inside send"<<'\n';
    en_msg *em;
    //static char temp[2048];
    em = (en_msg *)malloc(sizeof(en_msg) + size);
    em->size = size;
    memcpy(&(em->from), &(myaddr), sizeof(em->from));
    memcpy(&(em->to), &(toaddr), sizeof(em->from));
    memcpy(em + 1, data, size);
    cout<<(char *)(em+1);
}
This is my program. Partway through it I am trying to check the address that is being stored in my char array, but upon printing the array it gives some strange output: two strange symbols after printing the values of s and s1.
I am trying to store the id:port in the char array of the Address class, but apparently without success. Please help.
The printing code I am referring to is in the main function, roughly ten lines down.
Say my id is 233 and my port is 22, so the address is 233:22. I want to retrieve 233:22 back and print it. How do I do that here?
Thanks in advance :)
The problem is in this line:
cout << addr->addr[i];
Since addr->addr is an array of char, each element will be printed as the character it represents. If you'd rather print the integer value of each, simply cast it to int first.
cout << static_cast<int>(addr->addr[i]); // or old-fashioned: (int)addr->addr[i];
Given the following code:
for (int i = 0; i < 6; i++)
    cout << addr->addr[i];
And given Address's constructor:
size_t pos = address.find(":");
int id = stoi(address.substr(0, pos));
short port = (short)stoi(address.substr(pos + 1, address.size()-pos-1));
memcpy(addr, &id, sizeof(int));
memcpy(&addr[4], &port, sizeof(short));
It's clear that you are printing the bytes that make up a number.
addr->addr is a char array which contains two integer variables, one of four bytes (the int) and the other of two bytes (the short).
So, if the number is, let's say, 436, you are printing:
0xB4 0x01 0x00 0x00
<crazy char> SOH NULL NULL
You must understand what you are printing, or what you want to print, in order to print it properly.
Note: The most popular setup is assumed here, which means:
Little-endian architecture
4-byte int
2-byte short
Update
How to get address and port back:
int address;
unsigned short port;
memcpy(&address, addr->addr, 4);
memcpy(&port, addr->addr + 4, 2);
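To get the original "id:port" text back (which is what the question asks for), here is a small sketch under the same assumptions (4-byte int, 2-byte short, reading on the same machine that wrote the bytes). addressToString is a hypothetical helper, not part of the original code:

#include <cstring>
#include <string>
#include <iostream>

// Hypothetical helper: rebuilds "id:port" from the 6-byte addr buffer.
std::string addressToString(const char addr[6]) {
    int id = 0;
    short port = 0;
    std::memcpy(&id, addr, sizeof(int));          // first 4 bytes
    std::memcpy(&port, addr + 4, sizeof(short));  // last 2 bytes
    return std::to_string(id) + ":" + std::to_string(port);
}

// Usage with the question's example values:
// Address addr("233:22");
// std::cout << addressToString(addr.addr) << '\n';   // prints 233:22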

QT socket read losing bytes

I'm trying to work with length-prefixed TCP messages using Qt. I have the following method:
QByteArray con::read()
{
    QByteArray s;
    s = _pSocket->read(4);
    if (s.length() == 4) {
        int size = char_to_int32(s);
        s = _pSocket->read(size);
    }
    return s;
}
Well, it does not work. It looks like I lose all the data after reading the first 4 bytes: the first read works fine, but read(size) returns nothing. Is there a way to solve this?
The char_to_int32 is:
int char_to_int32(QByteArray s)
{
    int size = 0;
    size |= (s.at(0) << 24);
    size |= (s.at(1) << 16);
    size |= (s.at(2) << 8);
    size |= (s.at(3));
    return size;
}
EDIT :
The sending function (plain C):
int send(int connfd, const unsigned char* message, unsigned int size) {
    int c;
    unsigned char* bytes = (unsigned char*) malloc(4 + size);
    int32_to_char(size, bytes); // converts message size to 4 bytes
    memcpy(bytes + 4, message, size);
    c = write(connfd, bytes, 4 + size);
    free(bytes);
    if (c <= 0)
        return -1;
    else
        return 0;
}
By the way, when I call _pSocket->readAll(), the entire packet is read, including the 4-byte size and the message itself.
EDIT :
void int32_to_char(uint32_t in, char* bytes) {
    bytes[0] = (in >> 24) & 0xFF;
    bytes[1] = (in >> 16) & 0xFF;
    bytes[2] = (in >> 8) & 0xFF;
    bytes[3] = in & 0xFF;
    return;
}
As you are using the QByteArray QIODevice::read(qint64 maxSize) function, you may not be detecting errors correctly:
This function has no way of reporting errors; returning an empty QByteArray() can mean either that no data was currently available for reading, or that an error occurred.
Some things to try:
Use the qint64 QIODevice::read(char* data, qint64 maxSize) which reports errors:
If an error occurs ... this function returns -1.
Call QIODevice::errorString and QAbstractSocket::error to find out what is going wrong.
For bonus points, listen to the QAbstractSocket::error error signal.
If this is a new protocol you are creating, try using QDataStream for serialization (see the sketch after this list); it automatically handles length prefixes and is platform-independent. Your char_to_int32 will break if you mix platforms with different endianness, and may break between different OSs or compilers, as int is not guaranteed to be 32 bits (it is defined as at least 16 bits).
If you can't use QDataStream, at least use the htons, ntohs ... functions.
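For illustration, a minimal sketch of a QDataStream-based read(), assuming Qt 5.7+ (for read transactions) and the same con class and _pSocket member as in the question. The default QDataStream byte order is big-endian, which matches the 4-byte prefix that the plain-C sender above already writes:

QByteArray con::read()
{
    QDataStream in(_pSocket);
    in.setVersion(QDataStream::Qt_5_7);

    in.startTransaction();            // roll back if the full message hasn't arrived yet
    quint32 size = 0;
    in >> size;                       // 4-byte big-endian length prefix

    QByteArray payload;
    payload.resize(size);
    if (in.readRawData(payload.data(), size) != int(size)) {
        in.rollbackTransaction();     // only part of the body has arrived
        return QByteArray();
    }
    if (!in.commitTransaction())      // the size itself wasn't fully available
        return QByteArray();
    return payload;
}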
Edit
Here is some example code showing hton/ntoh usage. Note that uint32_t and not int is used as it's guaranteed to be 32 bits. I've also used memcpy rather than pointer casts in the encode/decode to prevent aliasing and alignment problems (I've just done a cast in the test function for brevity).
#include <stdio.h>
#include <string.h>
#include <arpa/inet.h>

void encode(uint32_t in, char* out)
{
    /* Host to Network long (32 bits) */
    const uint32_t t = htonl(in);
    memcpy(out, &t, sizeof(t));
}

uint32_t decode(char* in)
{
    uint32_t t;
    memcpy(&t, in, sizeof(t));
    /* Network to Host long (32 bits) */
    return ntohl(t);
}

void test(uint32_t v)
{
    char buffer[4];
    printf("Host Input: %08x\n", v);
    encode(v, buffer);
    printf("Network: %08x\n", *((uint32_t*)buffer));
    printf("Host Output: %08x\n\n", decode(buffer));
}

int main(int argc, char** argv)
{
    test(0);
    test(1);
    test(0x55);
    test(0x55000000);
    return 0;
}

simulate ulltoa() with a radix/base of 36

I need to convert an unsigned 64-bit integer into a string in base 36, i.e. characters 0-Z. ulltoa does not exist in the Linux man pages, but sprintf DOES. How do I use sprintf to achieve the desired result, i.e. what formatting % stuff do I need?
Or, if snprintf does not work, then how do I do this?
You can always just write your own conversion function. The following idea is heavily inspired by this fine answer:
char * int2base36(unsigned long long n, char * buf, size_t buflen)
{
    static const char digits[] = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ";
    if (buflen < 1) return NULL; // buffer too small!
    char * b = buf + buflen;
    *--b = 0;
    do {
        if (b == buf) return NULL; // buffer too small!
        *--b = digits[n % 36];
        n /= 36;
    } while(n);
    return b;
}
This will return a pointer to a null-terminated string containing the base-36 representation of n, placed in a buffer that you provide. Usage:
char buf[100];
std::cout << int2base36(37, buf, 100);
If you want, and you're single-threaded, you can also make the char buffer static -- I guess you can figure out a suitable maximal length:
char * int2base36_not_threadsafe(unsigned long long n)
{
    static char buf[128];
    static const size_t buflen = 128;
    // rest as above
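And a hypothetical usage example with a full 64-bit value, assuming the buffer-based int2base36 sketch above:

#include <climits>
#include <iostream>

int main()
{
    char buf[64];   // plenty of room: 2^64-1 needs 13 base-36 digits plus the terminator
    // Print the largest unsigned 64-bit value in base 36.
    std::cout << int2base36(ULLONG_MAX, buf, sizeof buf) << '\n';
    return 0;
}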

Send other data types in winsock2

The send function in winsock2 accepts only char pointers.
How do I send integers or objects through it too?
The const char *buf that you pass to send() as an argument is just a pointer to an array of bytes. You need to convert integers to bytes:
const int MAX_BUF_SIZE = 1024;
int int_data = 4;
const char *str_data = "test";
char *buf = (char*) malloc(MAX_BUF_SIZE);
char *p = buf;
memcpy(p, &int_data, sizeof(int_data));   /* copy the int's bytes into the buffer */
p += sizeof(int_data);
strcpy(p, str_data);                      /* then the string, including its terminator */
p += strlen(str_data) + 1;
send(sock, buf, p - buf, 0);
free(buf);
and the reading code:
const int MAX_BUF_SIZE = 1024;
int int_data = 0;
char *str_data = NULL;
char *buf = (char*) malloc(MAX_BUF_SIZE);
char *p = buf;
recv(sock, buf, MAX_BUF_SIZE, 0);
memcpy(&int_data, p, sizeof(int_data));   /* copy the int's bytes back out of the buffer */
p += sizeof(int_data);
str_data = (char*) malloc(strlen(p) + 1);
strcpy(str_data, p);
p += strlen(p) + 1;
free(buf);
Complex objects need to be serialized into a stream of bytes.
Note 1: The code sample is valid only if both server and client use the same platform (x32 / x64 / ...), which means int has the same number of bytes and the byte order is the same.
Note 2: The writing code should check at each step that there is no buffer (MAX_BUF_SIZE) overflow.
Just store the value into a variable and then type-cast the variable to char*. The send() and recv() functions operate on binary data, despite taking char* parameters.
Sending:
int int_data = 4;
send(sock, (char*) &int_data, sizeof(int), 0);
Reading:
int int_data;
recv(sock, (char*) &int_data, sizeof(int), 0);
Generally, the easiest way is to print the integer or object to a string, and send that string. Textual representations are more portable, and also easier to debug.
std::stringstream may be a useful class both to create the string and parse it on the other end.
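For example, a small sketch of the text-based approach. sendRecord and parseRecord are hypothetical helpers, and a real protocol would still need a delimiter or length prefix so the receiver knows where each message ends (here a newline is used):

#include <sstream>
#include <string>
#include <winsock2.h>   // link against Ws2_32.lib

// Hypothetical helper: serialize an id and a name into one line of text and send it.
bool sendRecord(SOCKET sock, int id, const std::string& name)
{
    std::ostringstream out;
    out << id << ' ' << name << '\n';   // '\n' marks the end of the record
    const std::string text = out.str();
    return send(sock, text.c_str(), static_cast<int>(text.size()), 0)
           == static_cast<int>(text.size());
}

// Hypothetical helper: parse the same fields back out of received text.
bool parseRecord(const std::string& text, int& id, std::string& name)
{
    std::istringstream in(text);
    return static_cast<bool>(in >> id >> name);
}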