How to use QDataStream::readBytes() - c++

According to the documentation for readBytes() (in Qt 5.4's QDataStream), I would expect the following code to copy the input_array into newly allocated memory and point raw at the copy:
QByteArray input_array{"\x01\x02\x03\x04qwertyuiop"};
QDataStream unmarshaller{&input_array, QIODevice::ReadOnly};
char* raw;
uint length;
unmarshaller.readBytes(raw, length);
qDebug() << "raw null? " << (raw == nullptr) << " ; length = " << length << endl;
...but the code prints raw null? true ; length = 0, indicating that no bytes were read from the input array.
Why is this? What am I misunderstanding about readBytes()?

The documentation does not describe this clearly enough, but QDataStream::readBytes expects the data to be in a specific format: a quint32 giving the length of the data, followed by the data itself.
So to read data with QDataStream::readBytes, you should first write it with QDataStream::writeBytes, or write it some other way that produces the same format.
An example:
QByteArray raw_input = "\x01\x02\x03\x04qwertyuiop";
QByteArray ba;
QDataStream writer(&ba, QIODevice::WriteOnly);
writer.writeBytes(raw_input.constData(), raw_input.length());
QDataStream reader(ba);
char* raw;
uint length;
reader.readBytes(raw, length);
qDebug() << "raw null? " << (raw == nullptr) << " ; length = " << length << endl;
delete [] raw; // readBytes() allocates the buffer with new [], so release it with delete []
You can also use QDataStream::readRawData and QDataStream::writeRawData to read and write arbitrary data, as sketched below.
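For example, a minimal sketch of the raw variant (not from the original answer): writeRawData writes no length prefix, so the reader has to know the byte count some other way.
QByteArray raw_input = "\x01\x02\x03\x04qwertyuiop";
QByteArray ba;
QDataStream writer(&ba, QIODevice::WriteOnly);
writer.writeRawData(raw_input.constData(), raw_input.size()); // bytes only, no quint32 length

QDataStream reader(ba);
QByteArray out(raw_input.size(), '\0');                       // caller must know the size up front
reader.readRawData(out.data(), out.size());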

C++ Reading back "incorrect" values from binary file?

The project I'm working on has a custom file format consisting of a header with a few different variables, followed by the pixel data. My colleagues have developed a GUI where processing, writing, reading and displaying this type of file works fine.
My problem is that, while I have assisted in writing the code that writes the data to disk, I cannot read this kind of file back and get satisfactory values. I am able to read the first variable back (a char array) but not the value(s) that follow.
So the file format matches the following structure:
typedef struct {
    char hxtLabel[8];
    u64 hxtVersion;
    int motorPositions[9];
    int filePrefixLength;
    char filePrefix[100];
    ..
} HxtBuffer;
In the code, I create an object of the above structure and then set these example values:
setLabel("MY_LABEL");
setFormatVersion(3);
setMotorPosition( 2109, 5438, 8767, 1234, 1022, 1033, 1044, 1055, 1066);
setFilePrefixLength(7);
setFilePrefix( string("prefix_"));
setDataTimeStamp( string("000000_000000"));
My code for opening the file:
// Open data file, binary mode, reading
ifstream datFile(aFileName.c_str(), ios::in | ios::binary);
if (!datFile.is_open()) {
    cout << "readFile() ERROR: Failed to open file " << aFileName << endl;
    return false;
}
// How large is the file?
datFile.seekg(0, datFile.end);
int length = datFile.tellg();
datFile.seekg(0, datFile.beg);
cout << "readFile() file " << setw(70) << aFileName << " is: " << setw(15) << length << " long\n";
// Allocate memory for buffer:
char * buffer = new char[length];
// Read data as one block:
datFile.read(buffer, length);
datFile.close();
/// Looking at the start of the buffer, I should be seeing "MY_LABEL"?
cout << "buffer: " << buffer << " " << *(buffer) << endl;
int* mSSX = reinterpret_cast<int*>(*(buffer+8));
int* mSSY = reinterpret_cast<int*>(&buffer+9);
int* mSSZ = reinterpret_cast<int*>(&buffer+10);
int* mSSROT = reinterpret_cast<int*>(&buffer+11);
int* mTimer = reinterpret_cast<int*>(&buffer+12);
int* mGALX = reinterpret_cast<int*>(&buffer+13);
int* mGALY = reinterpret_cast<int*>(&buffer+14);
int* mGALZ = reinterpret_cast<int*>(&buffer+15);
int* mGALROT = reinterpret_cast<int*>(&buffer+16);
int* filePrefixLength = reinterpret_cast<int*>(&buffer+17);
std::string filePrefix; std::string dataTimeStamp;
// Read file prefix character by character into stringstream object
std::stringstream ss;
char* cPointer = (char *)(buffer+18);
int k;
for (k = 0; k < *filePrefixLength; k++)
{
    // read string
    char c;
    c = *cPointer;
    ss << c;
    cPointer++;
}
filePrefix = ss.str();
// Read timestamp character by character into stringstream object
std::stringstream timeStampStream;
/// Need not increment cPointer, already pointing # 1st char of timeStamp
for (int l = 0; l < 13; l++)
{
    char c;
    c = *cPointer;
    timeStampStream << c;
}
dataTimeStamp = timeStampStream.str();
cout << 25 << endl;
cout << " mSSX: " << mSSX << " mSSY: " << mSSY << " mSSZ: " << mSSZ;
cout << " mSSROT: " << mSSROT << " mTimer: " << mTimer << " mGALX: " << mGALX;
cout << " mGALY: " << mGALY << " mGALZ: " << mGALZ << " mGALROT: " << mGALROT;
Finally, what I see is shown below. I added the 25 just to double-check that not everything was coming out in hexadecimal. As you can see, I am able to see the label "MY_LABEL" as expected. But the 9 motorPositions all come out looking suspiciously like addresses, not values. The file prefix and the data timestamp (which should be strings, or at least characters) are just empty.
buffer: MY_LABEL M
25
mSSX: 0000000000000003 mSSY: 00000000001BF618 mSSZ: 00000000001BF620 mSSROT: 00000000001BF628 mTimer: 00000000001BF630 mGALX: 00000000001BF638 mGALY: 00000000001BF640 mGALZ: 00000000001BF648 mGALROT: 00000000001BF650filePrefix: dataTimeStamp:
I'm sure the solution can't be too complicated, but I've reached a stage where I'm just going in circles and cannot make sense of things.
Many thanks for reading this somewhat long post.
-- Edit--
I might hit the maximum length allowed for a post, but just in case, here is the code that generates the data I'm trying to read back:
bool writePixelOutput(string aOutputPixelFileName) {
    // Write pixel histograms out to binary file
    ofstream pixelFile;
    pixelFile.open(aOutputPixelFileName.c_str(), ios::binary | ios::out | ios::trunc);
    if (!pixelFile.is_open()) {
        LOG(gLogConfig, logERROR) << "Failed to open output file " << aOutputPixelFileName;
        return false;
    }
    // Write binary file header
    string label("MY_LABEL");
    pixelFile.write(label.c_str(), label.length());
    pixelFile.write((const char*)&mFormatVersion, sizeof(u64));
    // Include File Prefix/Motor Positions/Data Time Stamp - if format version > 1
    if (mFormatVersion > 1)
    {
        pixelFile.write((const char*)&mSSX, sizeof(mSSX));
        pixelFile.write((const char*)&mSSY, sizeof(mSSY));
        pixelFile.write((const char*)&mSSZ, sizeof(mSSZ));
        pixelFile.write((const char*)&mSSROT, sizeof(mSSROT));
        pixelFile.write((const char*)&mTimer, sizeof(mTimer));
        pixelFile.write((const char*)&mGALX, sizeof(mGALX));
        pixelFile.write((const char*)&mGALY, sizeof(mGALY));
        pixelFile.write((const char*)&mGALZ, sizeof(mGALZ));
        pixelFile.write((const char*)&mGALROT, sizeof(mGALROT));
        // Determine length of mFilePrefix string
        int filePrefixSize = (int)mFilePrefix.size();
        // Write prefix length, followed by prefix itself
        pixelFile.write((const char*)&filePrefixSize, sizeof(filePrefixSize));
        size_t prefixLen = 0;
        if (mFormatVersion == 2) prefixLen = mFilePrefix.size();
        else prefixLen = 100;
        pixelFile.write(mFilePrefix.c_str(), prefixLen);
        pixelFile.write(mDataTimeStamp.c_str(), mDataTimeStamp.size());
    }
    // Continue writing header information that is common to both format versions
    pixelFile.write((const char*)&mRows, sizeof(mRows));
    pixelFile.write((const char*)&mCols, sizeof(mCols));
    pixelFile.write((const char*)&mHistoBins, sizeof(mHistoBins));
    // Write the actual data - taken out for brevity's sake
    // ..
    pixelFile.close();
    LOG(gLogConfig, logINFO) << "Written output histogram binary file " << aOutputPixelFileName;
    return true;
}
-- Edit 2 (11:32 09/12/2015) --
Thank you for all the help, I'm closer to solving the issue now. Going with the answer from muelleth, I try:
/// Read into char buffer
char * buffer = new char[length];
datFile.read(buffer, length);// length determined by ifstream.seekg()
/// Let's try HxtBuffer
HxtBuffer *input = new HxtBuffer;
cout << "sizeof HxtBuffer: " << sizeof *input << endl;
memcpy(input, buffer, length);
I can then display the different struct variables:
qDebug() << "Slice BUFFER label " << QString::fromStdString(input->hxtLabel);
qDebug() << "Slice BUFFER version " << QString::number(input->hxtVersion);
qDebug() << "Slice BUFFER hxtPrefixLength " << QString::number(input->filePrefixLength);
for (int i = 0; i < 9; i++)
{
    qDebug() << i << QString::number(input->motorPositions[i]);
}
qDebug() << "Slice BUFFER filePrefix " << QString::fromStdString(input->filePrefix);
qDebug() << "Slice BUFFER dataTimeStamp " << QString::fromStdString(input->dataTimeStamp);
qDebug() << "Slice BUFFER nRows " << QString::number(input->nRows);
qDebug() << "Slice BUFFER nCols " << QString::number(input->nCols);
qDebug() << "Slice BUFFER nBins " << QString::number(input->nBins);
The output is then mostly as expected:
Slice BUFFER label "MY_LABEL"
Slice BUFFER version "3"
Slice BUFFER hxtPrefixLength "2"
0 "2109"
1 "5438"
...
7 "1055"
8 "1066"
Slice BUFFER filePrefix "-1"
Slice BUFFER dataTimeStamp "000000_000000P"
Slice BUFFER nRows "20480"
Slice BUFFER nCols "256000"
Slice BUFFER nBins "0"
EXCEPT dataTimeStamp, which is 13 chars long, displays 14 chars instead. The 3 variables that follow (nRows, nCols and nBins) are then incorrect (they should be nRows=80, nCols=80, nBins=1000). My guess is that the byte displayed as the 14th char of dataTimeStamp should really be read as part of nRows, which would then cascade on to produce the correct nCols and nBins.
I have separately verified (not shown here) using qDebug that what I'm writing into the file, really are the values I expect, and their individual sizes.
I personally would try to read exactly the number of bytes your struct occupies from the file, i.e. something like
int length = sizeof(HxtBuffer);
and then simply use memcpy to assign a local structure from the read buffer:
HxtBuffer input;
memcpy(&input, buffer, length);
You can then access your data e.g. like:
std::cout << "Data: " << input.hxtLabel << std::endl;
Why do you read into a buffer, instead of reading directly into the structure?
HxtBuffer data;
datFile.read(reinterpret_cast<char *>(&data), sizeof data);
if (!datFile || datFile.gcount() != static_cast<std::streamsize>(sizeof data))
    throw io_exception();
// Can use data.
If you want to read into a character buffer, then your way of getting the data is just wrong. You probably want to do something like this.
char *buf_offset=buffer+8+sizeof(u64); // Skip label (8 chars) and version (int64)
int mSSX = *reinterpret_cast<int*>(buf_offset);
buf_offset+=sizeof(int);
int mSSY = *reinterpret_cast<int*>(buf_offset);
buf_offset+=sizeof(int);
int mSSZ = *reinterpret_cast<int*>(buf_offset);
/* etc. */
Or, a little better (provided you don't change the contents of the buffer).
int *ptr_motors=reinterpret_cast<int *>(buffer+8+sizeof(u64));
int &mSSX = ptr_motors[0];
int &mSSY = ptr_motors[1];
int &mSSZ = ptr_motors[2];
/* etc. */
Notice that I don't declare mSSX, mSSY etc. as pointers. Your code was printing them as addresses because you told the compiler that they were addresses (pointers).
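If you want to keep the HxtBuffer struct from the question, one more point worth noting (this is a sketch of an alternative, not part of the answer above): the writer stores the fields back to back, while the compiler may insert padding between struct members, so copying the whole file into the struct can shift the later fields. Reading field by field, in the same order they were written, sidesteps that; only the fields actually shown in the question are read here.
char hxtLabel[8];
u64  hxtVersion;
int  motorPositions[9];
int  filePrefixLength;

datFile.read(hxtLabel, sizeof hxtLabel);                                            // 8 bytes, not NUL-terminated
datFile.read(reinterpret_cast<char *>(&hxtVersion), sizeof hxtVersion);             // u64 version
datFile.read(reinterpret_cast<char *>(motorPositions), sizeof motorPositions);      // 9 ints
datFile.read(reinterpret_cast<char *>(&filePrefixLength), sizeof filePrefixLength); // prefix length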

Issue with GPB SerializeTo functions

I have the below code.
int main()
{
    test::RouteMessage *Rtmesg = new test::RouteMessage;
    test::RouteV4Prefix *prefix = new test::RouteV4Prefix;
    test::RouteMessage testRtmesg;
    prefix->set_family(test::RouteV4Prefix::RT_AFI_V4);
    prefix->set_prefix_len(24);
    prefix->set_prefix(1000);
    Rtmesg->set_routetype(test::RouteMessage::RT_TYPE_BGP);
    Rtmesg->set_allocated_v4prefix(prefix);
    Rtmesg->set_flags(test::RouteMessage::RT_FLGS_NONE);
    Rtmesg->set_routeevnt(test::RouteMessage::BGP_EVNT_V4_RT_ADD);
    Rtmesg->set_nexthop(100);
    Rtmesg->set_ifindex(200);
    Rtmesg->set_metric(99);
    Rtmesg->set_pref(1);
    int size = Rtmesg->ByteSize();
    char *rt_msg = (char *)malloc(size);
    google::protobuf::io::ArrayOutputStream oarr(rt_msg, size);
    google::protobuf::io::CodedOutputStream output(&oarr);
    Rtmesg->SerializeToCodedStream(&output);
    // Below code is just to see if everything is fine.
    google::protobuf::io::ArrayInputStream iarr(rt_msg, size);
    google::protobuf::io::CodedInputStream Input(&iarr);
    testRtmesg.ParseFromCodedStream(&Input);
    test::RouteV4Prefix test_v4Prefix = testRtmesg.v4prefix();
    cout << std::endl;
    std::cout << "Family " << test_v4Prefix.family() << std::endl;
    std::cout << "Prefix " << test_v4Prefix.prefix() << std::endl;
    std::cout << "PrefixLen " << test_v4Prefix.prefix_len() << std::endl;
    // All the above outputs are fine.
    cout << std::endl;
    cout << rt_msg;                     // <-- This prints absolute junk.
    cout << std::endl;
    amqp_bytes_t str2;
    str2 = amqp_cstring_bytes(rt_msg);  // <-- This just crashes.
    printf("\n str2=%s %d", str2.bytes, str2.len);
}
Any operation on the above rt_msg just crashes. I want to use this buffer to send to a socket and to RabbitMQ publish APIs.
Is there anybody out there who has had a similar issue or worked out similar code?
Protocol Buffers is a binary serialization format, not text. This means:
Yes, if you write the binary data to cout, it will look like junk (or crash).
The data is not NUL-terminated like C strings. Therefore, you cannot pass it into a function like amqp_cstring_bytes which expects a NUL-terminated char* -- it may cut the data short at the first 0 byte, or it may search for a 0 byte past the end of the buffer and crash. In general, any function that takes a char* but does not also take a length won't work.
I'm not familiar with amqp, but it looks like the function you are trying to call, amqp_cstring_bytes, just builds a amqp_bytes_t, which is defined as follows:
typedef struct amqp_bytes_t_ {
    size_t len;
    void *bytes;
} amqp_bytes_t;
So, all you have to do is something like:
amqp_bytes_t str2;
str2.bytes = rt_msg;
str2.len = size;
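If you would rather not manage the malloc'd buffer yourself, here is a minimal sketch of an alternative (using the same Rtmesg object from the question): serialize into a std::string, which keeps the length together with the binary data, and hand both to the amqp_bytes_t.
std::string payload;
Rtmesg->SerializeToString(&payload);             // binary data, may contain 0 bytes

amqp_bytes_t body;
body.bytes = const_cast<char *>(payload.data()); // raw bytes, not a C string
body.len   = payload.size();
// Pass body to the publish call; do not run the buffer through amqp_cstring_bytes().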

Store bytes in QVariantMap

I'm getting the byte data from a QImage into a QByteArray, and I want to write the data from the QByteArray into a QVariantMap, but after storing the bytes, the bytes are changed and the image is not valid...
I tried to store the QByteArray directly, but the thing is I'm going to receive QVariantMaps (as JSON) from Windows Phone, Android and iOS, and QByteArray will not exist on those platforms, so I doubt that the .toByteArray() function will work...
Here is an example of what I tried using a QString, but the bytes are changed when the QString is filled...
QFile tmp("default_profile.jpg");
tmp.open(QIODevice::ReadOnly);
if (tmp.exists() == true)
{
    QByteArray tab;
    tab = tmp.read(tmp.size());
    int i = 0;
    char *data = tab.data();
    QString str;
    while (i != tmp.size())
    {
        if (i < 100)
            qDebug() << "BEFORE = " << " i = " << i << "[" << *data + '0' << "]";
        i++;
        str.append(*data);
        ++data;
    }
    QVariantMap *tmp = new QVariantMap();
    (*tmp)["name"] = "test.jpg";
    (*tmp)["data"] = str;
    (*tmp)["size"] = tab.size();
    (*tmp)["type"] = "PhonePic";
    this->fileReceived("", "", tmp);
}
And here is the fileReceived:
QFile tmp((*src)["name"].toString());
tmp.open(QIODevice::ReadWrite | QIODevice::Truncate);
char *test = (char *)malloc((*src)["size"].toInt());
QString str;
str = (*src)["data"].toString();
int i = 0;
char *data = const_cast<char *>(str.toStdString().c_str());
while (i != 8143)
{
    if (i < 100)
        qDebug() << "AFTER = " << " i = " << i << "[" << *data + '0' << "]";
    i++;
    ++data;
}
qDebug() << tmp.write(str.toStdString().c_str(), (*src)["size"].toInt());
The size of the "AFTER" QString is right, but the values are not...
Does someone know what I'm doing wrong, or maybe have an idea of how I could do it?
Thanks to everyone who tries to help me.
You load binary data into a QByteArray. Then you append the data in this QByteArray byte by byte to a QString, which converts every single byte into Unicode. And last but not least, you just try to cast this back into raw data? No way. So pointing out what you are doing wrong was easy.
Helping you is much more difficult. Why do you think you can use QString when QByteArray is not available? If QVariantMap is available, then you have a complete Qt, so using QByteArray should not be a problem. But I must admit I don't fully understand your post and your problem, so I might be overlooking something.
You can send a QImage directly in a QVariant or QVariantMap. How the particular JSON library that you use will deal with the JSON-to-Qt interface is a whole other story.
Since QVariant is in Qt Core and not Qt Gui, you have to use QVariant::value() or the qvariant_cast() template function when working with GUI types such as QImage.
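A minimal sketch of the QByteArray route (reusing the file names from the question; Qt GUI has to be available for QImage): keep the raw bytes in the map as a QByteArray and pull them back out without ever converting through QString.
QFile f("default_profile.jpg");
QVariantMap map;
if (f.open(QIODevice::ReadOnly)) {
    map["name"] = QString("test.jpg");
    map["data"] = f.readAll();                       // stored as a QByteArray inside the QVariant
    map["size"] = map["data"].toByteArray().size();
}

// Receiving side:
QByteArray bytes = map["data"].toByteArray();        // the bytes come back unchanged
QImage img = QImage::fromData(bytes, "JPG");
// A QImage stored directly in the map would be read back with qvariant_cast<QImage>(map["image"]).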

C++/QT - Converting data to different format via QByteArray differs from reading into struct

I am attempting to read in a header. Here is the structure:
struct header
{
    uint32 offset;
    char identifier[4];
    uint32 unknown;
};
When I copy the memory in via memcpy, I can output the offset correctly (usually as a large number).
When using the built-in functions of the byte array, I read in the whole file and then take the first four bytes by using:
QByteArray offset = data.left(4);
I then verified that it had copied correctly, which it had.
My problem is when it comes to converting these bytes to the appropriate datatype. I have tried:
qDebug() << "Offset1:" << offset.toUShort();
qDebug() << "Offset2:" << offset;
qDebug() << "Offset3:" << offset.toHex();
qDebug() << "Offset4:" << offset.toInt();
qDebug() << "Offset5:" << offset.toLong();
qDebug() << "Offset6:" << offset.toUInt();
qDebug() << "Offset7:" << offset.toULong();
qDebug() << "Offset8:" << offset.toULongLong();
qDebug() << "Offset9:" << offset.toULong();
None of them output the correct value. On the other hand, when I just memcpy and then use:
qDebug() << "Offset:" << header.offset;
I get the correct value. What am I missing in the conversion from the bytes to uint32?
Does it have to do with the endianness?
toInt() etc. convert string representations like "100" to integers, not binary representations.
To convert the binary back, you must cast it:
const quint32 v = *reinterpret_cast<const quint32*>( offset.constData() );
Note that this is fragile due to endianness. memcpy with structs is also not safely portable, due to struct alignment (padding). Better to use QDataStream, or something like Boost serialization or Google Protocol Buffers, for more robust serialization.
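For example, a minimal sketch with QDataStream (the little-endian setting is an assumption about how your file was written):
QDataStream in(data);                        // data is the QByteArray holding the whole file
in.setByteOrder(QDataStream::LittleEndian);  // assumption: the file stores values little-endian
quint32 offset = 0;
in >> offset;
qDebug() << "Offset:" << offset;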

std::cout << stringstream.str()->c_str() prints nothing

In a function that gets an unsigned char buffer and its length:
void pcap_callback(u_char *args, const struct pcap_pkthdr* pkthdr, const u_char* packet)
{
    std::vector<unsigned char> vec(packet, packet+pkthdr->len); // optimized from foo.
    std::stringstream scp;
    for (int i = 0; i < pkthdr->len; i++) {
        scp << vec[i];
    }
    std::string mystr = std::string(scp.rdbuf()->str());
    std::cout << "WAS: " << packet << std::endl;
    std::cout << "GOOD: " << scp.str() << std::endl;
    std::cout << "BAD: " << scp.str().c_str() << std::endl;
    std::cout << "TEST: " << mystr.size() << std::endl;
    assert(mystr.size() == pkthdr->len);
}
Results:
WAS: prints nothing (I guess this is a pointer-to-const case)
GOOD: prints data
BAD: prints nothing
TEST, assert: prints that mystr.size() is equal to passed unsigned char size.
I tried:
string.assign(scp.rdbuf());
memcpy(char, scp.str(), 10);
different methods of creating/allocating temporary chars, strings
No help.. what I want is a std::string that can be printed with std::cout and that contains the data (which was picked from foo, which was unsigned char, which was the packet data).
I'm guessing either the original foo is not null-terminated, or the problem is something similarly simple, but I can't get at it.. what are the things to look for here?
(This code is another attempt to use libpcap, just to print packets in a C++ way, without using known C++ wrappers like libpcapp.)
For a quick test, throw in a check for scp.str().size() == strlen(scp.str().c_str()) to see if there are embedded '\0' characters in the string, which is what I suspect is happening.
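Something along these lines (just a quick sketch of that check):
const std::string s = scp.str();
if (s.size() != strlen(s.c_str()))   // strlen needs <cstring>
    std::cout << "packet data contains embedded NUL bytes" << std::endl;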
I think you're going about this the wrong way. It looks like you're dealing with binary data here, in which case you can't expect to meaningfully output it to the screen as text. What you really need is a hex dump.
// Note: std::setw and std::setfill come from <iomanip>.
const unsigned char* ucopy = packet;
std::ios_base::fmtflags old_flags = std::cout.flags();
std::cout.setf(std::ios::hex, std::ios::basefield);
for (const unsigned char* p = ucopy, *e = p + pkthdr->len; p != e; ++p) {
    std::cout << std::setw(2) << std::setfill('0') << static_cast<unsigned>(*p) << " ";
}
std::cout.flags(old_flags);
This will output the data byte-by-byte, and let you examine the individual hex values of the binary data. A null byte will simply be output as 00.
Check std::cout.good() after the failed output attempt. My guess is that there's some failure on output (i.e. trying to write a nonprintable character to the console), which is setting failbit on cout.
Also check to ensure the string does not start with a NULL, which would cause empty output to be the expected behavior :)
(Side note, please use reinterpret_cast for unsigned char *ucopy = (unsigned char*)packet; if you're in C++ ;) )