Sending a user-defined struct through Apache Qpid - C++

I'm trying to send a user-defined struct, which contains sub-structs, over AMQP from one node to another. I am using the Apache Qpid library at the moment.
(I'm currently still testing my code on the PC before I rebuild it for my other devices.)
My current method consists of converting the struct to a byte string, sending that over AMQP, and converting it back on the other side.
I do the following:
//user defined struct
enum Quality
{
/// <summary>Value is reliable.</summary>
QUALITY_GOOD,
/// <summary>Value not reliable.</summary>
/// <remarks>
/// A variable may be declared unreliable if a sensor is not calibrated or
/// if the last query failed but older samples are still usable.
/// </remarks>
QUALITY_UNCERTAIN,
/// <summary>Value is invalid.</summary>
/// <remarks>
/// A variable may be declared bad if the measured value is out of range or
/// if a timeout occurred.
/// </remarks>
QUALITY_BAD
};
struct Payload
{
/// <summary>Identifier that uniquely points to a single instance.</summary>
DdsInterface::Id id = DdsInterface::Id();
/// <summary>Human readable short name.</summary>
std::string name = "default";
/// <summary>Actual value.</summary>
long long value;
/// <summary>Quality of the Value.</summary>
Quality quality = QUALITY_GOOD;
/// <summary>Detailed quality of the variable.</summary>
QualityDetail qualityDetail = 0;
/// <summary>Unit of measure.</summary>
PhysicalQuantity quantity = 0;
Payload();
Payload(const DdsInterface::Id id, const std::string topic, const uint64_t counter);
};
//sender function
void QpidAMQP::AMQPPublish(const Payload& payload, bool durability, bool sync)
{
// Publish to the AMQP broker
qpid::messaging::Message message;
message.setDurable(durability);
char b[sizeof (payload)];
memcpy(b, &payload, sizeof(payload));
//create stream of bytes to send over the line
message.setContent(b);
//message.setContent("testIfSend");
std::string temp = message.getContent();
print_bytes(temp.c_str(), sizeof (temp));// used to check the byte data
this->sender.send(message);
this->session.sync(sync);
}
//receiver functions
void *check_for_incoming_messages(QpidAMQP* amqp_instance) //called via pthread
{
qpid::messaging::Message message;
std::cout << "check for incoming messages" << std::endl;
while(amqp_instance->getReceiver()->fetch(message, qpid::messaging::Duration::FOREVER))
{
amqp_instance->on_message(&message);
}
return nullptr;
}
void QpidAMQP::on_message(qpid::messaging::Message *message)
{
/// make sure message topic and payload are copied!!!
if (this->handler)
{
std::string temp = message->getContent();
print_bytes(temp.c_str(), sizeof (temp)); // used to check the byte data
Payload payload; //Re-make the struct
memcpy(&payload, message->getContent().c_str(), message->getContentSize());
handler->ReceivedIntegerValue(payload.id.variableId, payload.value);
}
}
I checked the byte data and they were vastly different.
sender:
[ 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 63 00 00 00 01 00 00 00 00 00 00 00 00 00 00 00 60 32 bf 74 ff 7f 00 00 05 00 00 00 00 00 00 00 74 6f 70 69 63 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ]
receiver:
[ 74 65 73 74 49 66 53 65 6e 64 00 00 00 7f 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 a0 ed ff 43 57 7f 00 00 07 00 00 00 00 00 00 00 64 65 66 61 75 6c 74 00 6d 05 77 4b 57 7f 00 00 ff ff ff ff ff ff ff ff 00 00 00 00 00 00 00 00 ]
I used the following code to print this out
void print_bytes(const void *object, size_t size)
{
// This is for C++; in C just drop the static_cast<>() and assign.
const unsigned char * const bytes = static_cast<const unsigned char *>(object);
size_t i;
printf("[ ");
for(i = 0; i < size; i++)
{
printf("%02x ", bytes[i]);
}
printf("]\n");
}
When I send only a string instead of the payload, it arrives on the other end, but for some reason a user-defined struct doesn't work.
I want to avoid remapping everything onto a Qpid map, because I would lose the depth of my Payload.id.
If someone has any suggestions to overcome this I would appreciate it.
Thanks in advance,
Nick

I solved the issue.
The problem was that instead of the string contents, a pointer into the string was being copied. By changing std::string name = "default"; into char name[20] = "default"; the actual character data is copied.
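A minimal sketch of the adjusted struct (only the name field changes; constructors omitted, and this assumes DdsInterface::Id, QualityDetail and PhysicalQuantity are trivially copyable types, which the memcpy approach requires):
struct Payload
{
DdsInterface::Id id = DdsInterface::Id();
char name[20] = "default"; // fixed-size array instead of std::string, so the characters live inside the struct
long long value;
Quality quality = QUALITY_GOOD;
QualityDetail qualityDetail = 0;
PhysicalQuantity quantity = 0;
};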
This is how the publisher and subscriber encode and decode the message now:
void QpidAMQP::AMQPPublish(const Payload& payload, bool durability, bool sync)
{
// Publish to the AMQP broker
//create stream of bytes to send over the line
qpid::messaging::Message message;
message.setDurable(durability);
std::string b;
b.resize(sizeof(Payload));
std::memcpy(&b[0], &payload, b.size()); // write through &b[0] rather than casting away const on c_str()
message.setContent(b);
std::string temp = message.getContent();
print_bytes(temp.c_str(), temp.size());
this->sender.send(message);
this->session.sync(sync);
}
void QpidAMQP::on_message(qpid::messaging::Message *message)
{
/// make sure message topic and payload are copied!!!
if (this->handler != nullptr)
{
const std::string temp = message->getContent();
print_bytes(temp.c_str(), temp.size());
Payload payload;
std::memcpy(&payload, temp.c_str(), temp.size());
handler->ReceivedIntegerValue(payload.id.variableId, payload.value);
}
}
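Since both ends now rely on memcpy of the raw struct bytes, a compile-time guard (a sketch, not part of the original answer) can catch any field that would reintroduce the pointer problem:
#include <type_traits>
// Fails to compile if Payload ever regains a member that is not trivially
// copyable (e.g. a std::string), which is what caused the original bug.
static_assert(std::is_trivially_copyable<Payload>::value,
"Payload must be trivially copyable to be sent via memcpy");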

Related

Incomplete binary data between QSignal and QSlot

I have in my Qt code a function f1() that emits a QSignal with a char array of binary data as parameter.
My problem is that the QSlot connected to this QSignal receives the array, but the data is incomplete: it only receives the data up to the first 0x00 byte.
I tried changing the char[] to char*, which didn't help.
How can I receive the full data, including the 0x00 bytes?
connect(dataStream, &BaseConnection::GotPacket, this, &myClass::HandleNewPacket);
void f1()
{
qDebug() << "Binary read = " << inStream.readRawData(logBuffer, static_cast<int>(frmIndex->dataSize));
//logBuffer contains the following hexa bytes: "10 10 01 30 00 00 30 00 00 00 01 00 D2 23 57 A5 38 A2 05 00 E8 03 00 00 6C E9 01 00 00 00 00 00 0B 00 00 00 00 00 00 00 A6 AF 01 00 00 00 00 00"
Q_EMIT GotPacket(logBuffer, frmIndex->dataSize);
}
void myClass::HandleNewPacket(char p[LOG_BUFFER_SIZE], int size)
{
// p contains the following hexa bytes : "10 10 01 30"
}
Thank you.
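No answer is recorded in this thread, but one common approach is to carry the bytes in a QByteArray, which stores an explicit length and therefore survives embedded 0x00 bytes. A minimal sketch (class and member names are illustrative, not taken from the original code):
#include <QByteArray>
#include <QObject>

class BaseConnection : public QObject
{
Q_OBJECT
signals:
void GotPacket(const QByteArray &packet); // length travels with the data
public:
void f1(const char *logBuffer, int dataSize)
{
// QByteArray(data, size) copies exactly dataSize bytes, 0x00 bytes included.
Q_EMIT GotPacket(QByteArray(logBuffer, dataSize));
}
};

// Receiver side (the slot signature changes to match):
// connect(dataStream, &BaseConnection::GotPacket, this, &MyClass::HandleNewPacket);
// void MyClass::HandleNewPacket(const QByteArray &packet) { /* packet.size() bytes */ }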

Displaying Hex codes from buffer after reading from a file [duplicate]

This question already has answers here:
how do I print an unsigned char as hex in c++ using ostream?
I'm trying to store the hex codes read from a file into a buffer and then display them on the console; so far it doesn't seem to work. This is my code:
using namespace std;
int main()
{
ifstream file("Fishie.ch8",ios::binary);
if (!file.is_open())
{
cout << "Error";
}
else
{
file.seekg(0, ios::end);
streamoff size = file.tellg();
file.seekg(0, ios::beg);
char *buffer = new char[size];
file.read(buffer, size);
file.close();
for (int i = 0; i < size; i++)
{
cout <<hex<< buffer[i] << " ";
}
}
delete[] buffer;
cin.get();
}
The expected output should be this:
00 e0 a2 20 62 08 60 f8 70 08 61 10 40 20 12 0e
d1 08 f2 1e 71 08 41 30 12 08 12 10 00 00 00 00
00 00 00 00 00 18 3c 3c 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
3e 3f 3f 3b 39 38 38 38 00 00 80 c1 e7 ff 7e 3c
00 1f ff f9 c0 80 03 03 00 80 e0 f0 78 38 1c 1c
38 38 39 3b 3f 3f 3e 3c 78 fc fe cf 87 03 01 00
00 00 00 00 80 e3 ff 7f 1c 38 38 70 f0 e0 c0 00
3c 18 00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
Instead of the above output I get some strange-looking symbols with lots of empty spaces. What could be the problem?
As your buffer is char, all elements are printed as characters. What you want is each byte converted to hex.
BTW: since you want hexadecimal output, it is worth asking whether you really want to read char from the file rather than unsigned char.
As you found out, the signature of istream::read uses char, so you have to convert first to unsigned char and then to unsigned int, like:
cout <<hex<< (unsigned int)(unsigned char) buffer[i] << " ";
For real C++ users, you should write a fine static_cast ;)
This will print the hex values. But a byte below 0x10 (for example a line feed, 0x0a) will show as 'a' instead of '0a', so you also have to set the width and fill character; note that width() only applies to the next insertion, so it has to be set on every iteration:
for (int i = 0; i < size; i++)
{
cout.width(2);
cout.fill('0');
cout << hex << (unsigned int)(unsigned char)buffer[i] << " ";
}
BTW: delete[] buffer; is in the wrong scope; buffer is declared inside the else block, so the delete[] must be moved into that scope (or the declaration of buffer moved out of it).
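Putting it together as "real C++", a minimal sketch of the same loop using static_cast and <iomanip>:
#include <iomanip>
#include <iostream>

void print_hex(const char *buffer, std::streamoff size)
{
for (std::streamoff i = 0; i < size; i++)
{
// setw(2) applies only to the next insertion, so it is set every iteration;
// setfill('0') is sticky but setting it here keeps the loop self-contained.
std::cout << std::hex << std::setw(2) << std::setfill('0')
<< static_cast<unsigned int>(static_cast<unsigned char>(buffer[i])) << " ";
}
std::cout << "\n";
}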

couldn't write specific content into stringstream

I have some sample code reading some binary data from file and then writing the content into stringstream.
#include <sstream>
#include <cstdio>
#include <fstream>
#include <cstdlib>
std::stringstream * raw_data_buffer;
int main()
{
std::ifstream is;
is.open ("1.raw", std::ios::binary );
char * buf = (char *)malloc(40);
is.read(buf, 40);
for (int i = 0; i < 40; i++)
printf("%02X ", buf[i]);
printf("\n");
raw_data_buffer = new std::stringstream("", std::ios_base::app | std::ios_base::out | std::ios_base::in | std::ios_base::binary);
raw_data_buffer -> write(buf, 40);
const char * tmp = raw_data_buffer -> str().c_str();
for (int i = 0; i < 40; i++)
printf("%02X ", tmp[i]);
printf("\n");
delete raw_data_buffer;
return 0;
}
With a specific input file I have, the program doesn't function correctly. You could download the test file here.
So the problem is, I write the file content into raw_data_buffer and immediately read it back, and the content differs. The program's output is:
FFFFFFC0 65 59 01 00 00 00 00 00 00 00 00 00 00 00 00 FFFFFFE0 0A 40 00 00 00 00 00 FFFFFF80 08 40 00 00 00 00 00 70 FFFFFFA6 57 6E FFFFFFFF 7F 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 FFFFFFE0 0A 40 00 00 00 00 00 FFFFFF80 08 40 00 00 00 00 00 70 FFFFFFA6 57 6E FFFFFFFF 7F 00 00
The content FFFFFFC0 65 59 01 is overwritten with 0. Why so?
I suspect this is a symptom of undefined behavior from using deallocated memory. You're getting a copy of the string from the stringstream, but you're only grabbing a raw pointer to its internals, which are then immediately destroyed. (The link actually warns against this exact case.)
const char* tmp = raw_data_buffer->str().c_str();
// ^^^^^ returns a temporary that is destroyed
// at the end of this statement
// ^^^ now a dangling pointer
Any use of tmp would exhibit undefined behavior and could easily cause the problem you're seeing. Keep the result of str() in scope.
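A minimal sketch of the fix, keeping the copy returned by str() alive while the pointer is in use (the unsigned char cast also avoids the FFFFFFC0-style sign extension in the printed output):
std::string content = raw_data_buffer->str(); // named copy stays in scope
const char * tmp = content.c_str();           // valid as long as 'content' lives
for (int i = 0; i < 40; i++)
printf("%02X ", (unsigned char)tmp[i]);
printf("\n");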

CLR have fat or small exception frame?

How do I detect whether an IMAGE_COR_ILMETHOD_SECT_EH must be read as Small or Fat?
I am also interested in other internal CLR structure/opcode details. The answer below covers this and many other questions.
/*RVA:0*/ typedef union IMAGE_COR_ILMETHOD{
IMAGE_COR_ILMETHOD_TINY Tiny;
IMAGE_COR_ILMETHOD_FAT Fat;} IMAGE_COR_ILMETHOD;
/*PC = RVA + sizeof( IMAGE_COR_ILMETHOD) = 12 or 4 byte*/ ... Code
/*EH = PC+CodeSize */typedef union IMAGE_COR_ILMETHOD_SECT_EH{
IMAGE_COR_ILMETHOD_SECT_EH_SMALL Small;
IMAGE_COR_ILMETHOD_SECT_EH_FAT Fat;
} IMAGE_COR_ILMETHOD_SECT_EH;
https://github.com/dotnet/coreclr/blob/master/src/inc/corhdr.h
for example:
public static void Main(string[] args) {
int i = 0;
try {
Console.Write("OK");
} catch (Exception) {
i++;
}
}
0000 4D 5A 90 00 MZ-header
0250 2A 02 17 8C 06 00 00 01 51 2a 00
RVA: 1B 30 02 00 // IMAGE_COR_ILMETHOD_FAT
1D 00 00 00 CodeSize= 29
01 00 00 11 Locals = 11000001
PC0: 00 16 0A i=0
PC3 00 72 01 00 00 70 try{
28 04 00 00 0A call Console.Write
00 00 DE 09
PC12:26 00 06 17 58 0A 00 DE 00 00 2A (2A is ret command)
00 00 00 01 10 00 // IMAGE_COR_ILMETHOD_SECT_EH ??? 1=count
00 00 CorExceptionFlag Flags
03 00 TryOffset
0F TryLength
12 00 HandlerOffset
09 HandlerLength
08 00 00 01 ClassToken
In this case we have a small EH frame. How do we detect whether the frame is small or fat?
typedef struct IMAGE_COR_ILMETHOD_SECT_EH_CLAUSE_SMALL {
CorExceptionFlag Flags : 16;
unsigned TryOffset : 16;
unsigned TryLength : 8; // relative to start of try block
unsigned HandlerOffset : 16;
unsigned HandlerLength : 8; // relative to start of handler
union {
DWORD ClassToken;
DWORD FilterOffset;
};
} IMAGE_COR_ILMETHOD_SECT_EH_CLAUSE_SMALL;
typedef struct IMAGE_COR_ILMETHOD_SECT_EH_CLAUSE_FAT
{
CorExceptionFlag Flags;
DWORD TryOffset;
DWORD TryLength; // relative to start of try block
DWORD HandlerOffset;
DWORD HandlerLength; // relative to start of handler
union {
DWORD ClassToken; // use for type-based exception handlers
DWORD FilterOffset; // use for filter-based exception handlers (COR_ILEXCEPTION_FILTER is set)
};
} IMAGE_COR_ILMETHOD_SECT_EH_CLAUSE_FAT;
This is covered in Partition II, section 25.4.5 of ECMA-335.
If the CorILMethod_Sect_FatFormat bit (0x40) is set in the Kind field (the first byte of the structure), then you should use the fat layout, otherwise the small one. The Kind field can be accessed via Small.SectSmall.Kind or Fat.SectFat.Kind; either should work.
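As a minimal sketch, relying on the corhdr.h definitions linked above (the header is normally pulled in via cor.h):
#include <cor.h>

bool IsFatEhSection(const IMAGE_COR_ILMETHOD_SECT_EH *eh)
{
// The first byte (Kind) carries the section flags;
// CorILMethod_Sect_FatFormat (0x40) selects the fat layout.
return (eh->Small.SectSmall.Kind & CorILMethod_Sect_FatFormat) != 0;
}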

Inflating TMX Data using Base64 & Zlib - C++

I have searched the web for a way to convert TMX data into something usable, but I cannot seem to use zlib to inflate the data I get back from a Base64 decode function. I'm not sure that's how it works, but from what I've read I'm guessing I am supposed to Base64-decode the data and then inflate it with zlib.
So: TMX Data -> Base64 -> Decode -> Decoded Data -> Zlib -> Inflate -> Usable Data?
Here's my source code:
const std::string EncryptedString = "eJxjZGBgYMSCZYCYHYilccgPNnVqOLAQmjp2PFgPiJmh6iSBWApKI7OlkNTQAgMA4AIDoQ==";
FILE *wfile;
// Will contain decoded data
wfile = fopen("testFile", "w");
fprintf(wfile, base64_decode(EncryptedString).c_str());
Then I open the same file with the decoded data, which is:
xœcd```Ä‚e€˜ˆ¥qÈ6uj8°š:v<Xˆ™¡ê$X
J#³¥ÔÐ
And try to inflate it with zlib, using the inflate function from the zlib docs:
FILE *source;
// Contains decoded data.
source = fopen("testFile", "r");
FILE *dest;
// We write decompressed data to this file.
dest = fopen("testOutFile", "w");
zerr(Z_Inflate(source, dest));
Yet Zlib returns an error message of "Invalid or incomplete deflate data"
Here's the code for the Zlib Function:
inline int Z_Inflate(FILE *source, FILE *dest)
{
int ret;
unsigned have;
z_stream strm;
Bytef in[CHUNK];
Bytef out[CHUNK];
/* allocate inflate state */
strm.zalloc = Z_NULL;
strm.zfree = Z_NULL;
strm.opaque = Z_NULL;
strm.avail_in = 0;
strm.next_in = Z_NULL;
ret = inflateInit(&strm);
if (ret != Z_OK)
return ret;
/* decompress until deflate stream ends or end of file */
do {
strm.avail_in = fread(in, 1, CHUNK, source);
if (ferror(source)) {
(void)inflateEnd(&strm);
return Z_ERRNO;
}
if (strm.avail_in == 0)
break;
strm.next_in = in;
/* run inflate() on input until output buffer not full */
do {
strm.avail_out = CHUNK;
strm.next_out = out;
ret = inflate(&strm, Z_NO_FLUSH);
assert(ret != Z_STREAM_ERROR); /* state not clobbered */
switch (ret) {
case Z_NEED_DICT:
ret = Z_DATA_ERROR; /* and fall through */
case Z_DATA_ERROR:
case Z_MEM_ERROR:
(void)inflateEnd(&strm);
return ret;
}
have = CHUNK - strm.avail_out;
if (fwrite(out, 1, have, dest) != have || ferror(dest)) {
(void)inflateEnd(&strm);
return Z_ERRNO;
}
} while (strm.avail_out == 0);
/* done when inflate() says it's done */
} while (ret != Z_STREAM_END);
/* clean up and return */
(void)inflateEnd(&strm);
return ret == Z_STREAM_END ? Z_OK : Z_DATA_ERROR;
}
/* report a zlib or i/o error */
inline void zerr(int ret)
{
fputs("zpipe: ", stderr);
switch (ret) {
case Z_ERRNO:
if (ferror(stdin))
fputs("error reading stdin\n", stderr);
if (ferror(stdout))
fputs("error writing stdout\n", stderr);
break;
case Z_STREAM_ERROR:
fputs("invalid compression level\n", stderr);
break;
case Z_DATA_ERROR:
fputs("invalid or incomplete deflate data\n", stderr);
break;
case Z_MEM_ERROR:
fputs("out of memory\n", stderr);
break;
case Z_VERSION_ERROR:
fputs("zlib version mismatch!\n", stderr);
}
}
Any help would be greatly appreciated, as I would love to use the Tiled editor for my map files. It's turning out to be more of a headache.
Worked for me. After decoding the base64 string, I get in hex:
78 9c 63 64 60 60 60 c4 82 65 80 98 1d 88 a5 71
c8 0f 36 75 6a 38 b0 10 9a 3a 76 3c 58 0f 88 99
a1 ea 24 81 58 0a 4a 23 b3 a5 90 d4 d0 02 03 00
e0 02 03 a1
That is a valid zlib stream that decodes with no errors to this in hex:
01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00
01 00 00 00 01 00 00 00 1c 00 00 00 07 00 00 00
1b 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00
01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00
1c 00 00 00 07 00 00 00 1b 00 00 00 01 00 00 00
01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00
01 00 00 00 01 00 00 00 1c 00 00 00 07 00 00 00
1b 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00
01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00
1c 00 00 00 07 00 00 00 1b 00 00 00 01 00 00 00
01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00
01 00 00 00 01 00 00 00 1c 00 00 00 07 00 00 00
1b 00 00 00 01 00 00 00 26 00 00 00 26 00 00 00
26 00 00 00 26 00 00 00 26 00 00 00 26 00 00 00
12 00 00 00 07 00 00 00 1b 00 00 00 01 00 00 00
07 00 00 00 07 00 00 00 07 00 00 00 07 00 00 00
07 00 00 00 07 00 00 00 07 00 00 00 2e 00 00 00
03 00 00 00 01 00 00 00 19 00 00 00 1a 00 00 00
19 00 00 00 19 00 00 00 1a 00 00 00 19 00 00 00
1a 00 00 00 03 00 00 00 01 00 00 00 01 00 00 00
01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00
01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00
01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00
01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00
01 00 00 00 01 00 00 00 01 00 00 00 01 00 00 00
What kind of machine are you on? If it's Windows (shudder), you may need to make sure that your stdio functions are not trying to do end-of-line conversions. Use fopen(..., "wb") and fopen(..., "rb") for binary writing and reading.
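A minimal sketch of a binary-safe round trip, reusing base64_decode, EncryptedString, Z_Inflate and zerr from the question; fwrite in "wb"/"rb" mode avoids both newline translation and the fact that fprintf treats the data as a format string and stops at the first NUL byte:
std::string decoded = base64_decode(EncryptedString);

FILE *wfile = fopen("testFile", "wb");            // binary write: no CR/LF translation
fwrite(decoded.data(), 1, decoded.size(), wfile); // writes embedded 0x00 bytes too
fclose(wfile);

FILE *source = fopen("testFile", "rb");           // binary read
FILE *dest   = fopen("testOutFile", "wb");
zerr(Z_Inflate(source, dest));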