I think I'm kind of at a loss here. I'm trying such a simple thing that I can't believe there is nothing built into Qt for it (using Qt 5.6.2). I'm trying to convert the data inside a QByteArray from big endian to little endian. I always start with the same test QByteArray, like this.
QByteArray value;
value.append(0x01);
value.append(0x02);
value.append(0x03);
qDebug() << "Original value is: " << value.toHex(); // “010203” like expected
What I need is little endian, which means the output should be "030201". Is there anything built into the Qt framework for this? I can't find anything. Here is what I tried so far:
// Try the built-in QtEndian functions
QByteArray test = qToLittleEndian(value);
qDebug() << "Test value is: " << test.toHex(); // 010203
// Try via QDataStream
QByteArray data;
QDataStream out(&data, QIODevice::ReadWrite);
out.setByteOrder(QDataStream::LittleEndian);
out << value;
qDebug() << "Changed value is: " << data.toHex(); // "03000000010203"
Any good ideas? Or do I really need to shift the bytes by hand? I found nothing helpful on SO or on Google, or maybe I'm asking the wrong question...
It sounds like you want to reverse the array, rather than manipulate the endianness of any of the multi-byte types inside the array. The standard library has a solution for this:
#include <algorithm> // for std::reverse

QByteArray arr;
std::reverse(arr.begin(), arr.end());
// arr is now in reversed order
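Applied to the original example, for instance (a quick check, assuming the same three-byte value as above):
std::reverse(value.begin(), value.end());
qDebug() << "Reversed value is: " << value.toHex(); // "030201"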
Related
I have a text edit where I key in a hex number, and then this text is converted into a QByteArray.
This is my code:
QByteArray parsedValue = QByteArray::fromHex(expectedPacketStr.toUtf8());
qDebug() << parsedValue;
When I set it to 001102,
the console log reports "\x00\x11\x02", which is what I expected.
But if I set it to 001122,
the console log reports "\x00\x11\"", which appears to be missing the 0x22 byte.
I really cannot understand what's going on. Does anybody have a clue why this is so?
0x22 is the character " in ASCII, so it's just qDebug() interpreting it; nothing is missing inside the QByteArray.
To convince yourself, you can always display the array byte by byte:
for (auto b : parsedValue)
    qDebug() << (int)(quint8)b; // cast via quint8 so bytes >= 0x80 don't print as negative
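Alternatively, toHex() sidesteps qDebug()'s character escaping entirely:
qDebug() << parsedValue.toHex(); // "001122" -- all three bytes are present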
I've replicated this in two places in my code: one written by me, and the one I'm posting here, which was written by someone else. I can't get the base64 to print via qDebug at all. I thought base64 was supposed to be readable. It has a size, but the qDebug line won't print in full.
Thanks in advance for any help.
Here's the code. I'm on Qt Kit 5.12.1, 64-bit MinGW, in release mode.
QFile file("C:\\Qr-Pic\\Poll_Directory\\IMG_00000001 - Copy (53).jpg");
file.open(QIODevice::ReadOnly);
QByteArray image = file.readAll();
int originalSize = image.length();
QString encoded = QString(image.toBase64());
int encodedSize = encoded.size();
qDebug() << "encodedSize=" << encodedSize;
qDebug() << "encode=" << encoded;
Output:
encodedSize= 34036
I know this title might sound confusing. I have a simple question which I haven't been able to solve yet.
Let's imagine I have a file; opening it with a hex editor shows it has two characters inside, say 0x1B and 0x00 (obviously unprintable). I'd like to take that as 1B00 in hex, which is 6912 in decimal, as opposed to directly converting the characters, which would be wrong and is what all the other questions I saw were asking about. Well, that's the task I want to do here. It seems simple, but everything I've tried does it wrong, even though I am opening the file in binary mode.
I have only managed to read the characters individually and mess around a bit, but never to do what I actually want, which is as simple as taking those two hex characters, interpreting them as a hex number, and then converting it to decimal.
Sorry for anything unclear; I'm not a native speaker. Any help will be appreciated. I'm sure you'll think this is quite a noobish question. :P
EDIT: Sorry, apparently I didn't explain myself properly. I know this might seem abstract, but it is a really concrete little thing that I have struggled with and haven't been able to solve. Maybe I can ask it another way:
How can I read a character in binary mode, let's say 0x1B, and convert that to the actual characters "1B"? Just that.
Sounds like you want to read the file as raw data, and then display it on the screen in decimal? Super easy!
#include <cstdint>
#include <fstream>
#include <iostream>

int main() {
    std::ifstream myfile("filename.data", std::ios::binary);
    uint16_t number;
    char* buffer = (char*)(&number);
    while (myfile.read(buffer, sizeof(number))) {
        std::cout << number << ' ';
    }
}
The reason it's so easy is that there's no hexadecimal involved. The file is saved as a series of bytes, and each byte holds one of 256 values. They aren't hex, they're just a series of values. If you read two bytes into the uint16_t, that is the easiest way to interpret two bytes as a single unsigned 2-byte value. And streaming out a uint16_t will, by default, display that value in decimal. There's no hexadecimal involved. The hexadecimal you saw in the hex editor was there because a hex editor interprets the bytes as hex values.
If all you want to do is print a number in hexadecimal form, use std::hex
int i = 0x1B;
std::cout << std::hex << i << std::endl;
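Note that reading both bytes into a uint16_t as above interprets them in the machine's native byte order, so on a little-endian PC the bytes 0x1B, 0x00 come out as 0x001B = 27. If the goal is to treat the first byte as the high-order byte (so 0x1B, 0x00 reads as 0x1B00 = 6912), here is a minimal sketch under that assumption:
#include <fstream>
#include <iostream>

int main() {
    std::ifstream myfile("filename.data", std::ios::binary); // hypothetical file name
    int c1 = myfile.get(); // get() returns the byte as an int, or EOF
    int c2 = myfile.get();
    if (myfile.good()) {
        int number = (c1 << 8) | c2;             // first byte is most significant
        std::cout << number << '\n';             // prints 6912 for bytes 0x1B 0x00
        std::cout << std::hex << number << '\n'; // prints 1b00
    }
}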
std::ifstream infile("test.bin", std::ios::binary);
while (true)
{
    int c1 = infile.get(); // get() returns the byte as an int, or EOF on failure
    if (!infile.good())
    {
        break;
    }
    int c2 = infile.get();
    if (!infile.good())
    {
        break;
    }
    int num = c1 | (c2 << 8); // first byte is least significant
    // if you need the opposite order then
    // int num = c2 | (c1 << 8);
    std::cout << num << ' ';
}
First of all, I want to thank HostileFork for helping me explain my problem.
Thank you!
I'm trying to build a client and a server that send their data through a binary protocol.
My problem is that I want to send a class from a Qt client to a Boost server. My header (one integer, the size of my class) is written to the socket. When I try to read the header on the server side, I can't get the right integer (instead I get a big number like -13050660). I think the problem comes from the deserialization on the server, but I am not sure.
This is the technique that my Qt client code uses to write the number 10 onto a socket:
QByteArray paquet;
QDataStream out(&paquet, QIODevice::WriteOnly);
out << (quint32) 0;
out.device()->seek(0);
out << (quint32) (10);
cout << "Writing " << sizeof(quint32) << " bytes to socket." << endl;
Then I try to read it on a server process, which uses boost's async_read():
this->Iheader.resize(size, '\0'); // Iheader is a vector of char
async_read(
    this->socket,
    buffer(this->Iheader),
    bind(
        &Client::endRead,
        cli,
        placeholders::error,
        placeholders::bytes_transferred)
);
Here's the function that operates on the string result:
#ifdef WIN32
#define MYINT INT32
#include <Windows.h>
#else
#define MYINT int
#endif

void Client::endRead(const error_code& error, size_t nbytes)
{
    if (!error && nbytes == sizeof(MYINT)) {
        cout << "Read " << sizeof(MYINT) << " bytes from a socket." << endl;
        istringstream stream(this->connection->getIheader(nbytes));
        stream >> this->Isize;
        cout << "Integer value read was " << this->Isize << endl;
    } else {
        cout << "Could not read " << sizeof(MYINT) << " bytes." << endl;
    }
}
I do get a 32-bit signed integer (4 bytes), but it is not ten; instead it is something like -1163005939. Does anyone have any ideas why this is not working?
The server and the client are both running on Windows 7 Pro, 64-bit.
You're welcome...and thanks for following my suggestions on editing the question, and doing the requisite effort to pinpoint the problem more clearly. So now I can tell you what's wrong. :)
The behavior of << and >> is different on QDataStream than on standard C++ iostreams. In the world of classes like std::stringstream these operators are called "inserters"/"extractors", and they are intended for dealing with information formatted as text. If you want to read a certain number of bytes into a memory address, what you'll want is istream::read:
http://www.cplusplus.com/reference/iostream/istream/read/
(Note that if you wish to read binary data out of something that is not a stringstream, you need to be using ios::binary to keep it from messing with line ending conversions)
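For instance, a minimal sketch of reading four raw bytes into an integer with istream::read (the file name is hypothetical):
#include <cstdint>
#include <fstream>

int main() {
    std::ifstream in("header.bin", std::ios::binary); // ios::binary, per the note above
    uint32_t n = 0;
    in.read(reinterpret_cast<char*>(&n), sizeof(n)); // copies raw bytes, no text parsing
    // n now holds those four bytes in the machine's native byte order --
    // see the byte-ordering warning below.
}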
QDataStream doesn't follow that convention... it's a good helper for binary data. Nothing wrong with that: abstractly speaking, the << and >> operators are available in the language to be overloaded to do whatever you want within your own class hierarchies. Qt was free to define its own semantics for its own streams, and it did.
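To see the mismatch concretely, here is a sketch of what the client-side QDataStream actually puts in the buffer (big-endian by default); feeding these raw bytes to a text extractor is what produces a garbage value like the one you're seeing:
QByteArray paquet;
QDataStream out(&paquet, QIODevice::WriteOnly);
out << (quint32)10;
qDebug() << paquet.toHex(); // "0000000a" -- four raw bytes, not the text "10"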
Do heed the advice given by @vitakot about (if possible) using the same methodology for both input and output. Also heed my warning about the byte-ordering issues that start to come up if you aren't careful.
(Good news is that if you are using QDataStream it finesses this issue by taking care of it for you.)
Be aware that in your code as written, your stringstream is making a copy of the buffer in order to read from it. I'm not experienced with boost::asio or the best practices of async_read, but I'm sure there are better ways, which you can dig around and find.
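For the header itself, one way to decode the four bytes on the Boost side without going through text streams at all (a sketch; decodeHeader is a hypothetical helper, and it assumes the header really is QDataStream's default big-endian quint32):
#include <cstdint>
#include <vector>

// Interpret a 4-byte big-endian header as a host-order integer,
// independent of the host machine's endianness.
uint32_t decodeHeader(const std::vector<char>& hdr) // e.g. the Iheader vector
{
    const unsigned char* p = reinterpret_cast<const unsigned char*>(hdr.data());
    return (uint32_t(p[0]) << 24) | (uint32_t(p[1]) << 16)
         | (uint32_t(p[2]) << 8)  |  uint32_t(p[3]);
}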
HostileFork is right: from the information we have, it is not possible to isolate a bug in your code.
However, I would suggest using Boost serialization in your Qt client as well. There is no reason not to combine the Boost and Qt libraries. Otherwise you will have to deal with a lot of trouble when sending more complicated classes over the network...
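As a minimal sketch of what that might look like for the Commande class from the follow-up question below (the member list comes from that question; everything else here is illustrative), using Boost.Serialization's intrusive serialize() member:
#include <boost/archive/text_iarchive.hpp>
#include <boost/archive/text_oarchive.hpp>
#include <boost/serialization/string.hpp> // required to serialize std::string
#include <sstream>
#include <string>

class Commande {
public:
    std::string login, mdp, IP, to, from;
    bool rep = false;
    int nbCmd = 0;

    template <class Archive>
    void serialize(Archive& ar, const unsigned int /*version*/) {
        ar & login & mdp & IP & to & from & rep & nbCmd; // one line covers both directions
    }
};

int main() {
    Commande cmd;
    cmd.nbCmd = 1;
    cmd.login = "test1";

    std::ostringstream os;
    boost::archive::text_oarchive oa(os);
    oa << cmd; // serialize the whole class in one shot

    Commande back;
    std::istringstream is(os.str());
    boost::archive::text_iarchive ia(is);
    ia >> back; // and read it back, on either end of the wire
}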
I'm trying to send a class's size from a Qt client to a Boost server (I wrote both).
This is the class I want to serialize:
class Commande
{
public:
    std::string login;
    std::string mdp;
    std::string IP;
    std::string to;
    std::string from;
    bool rep;
    int nbCmd;
};
This is the function I use to serialize and send the Commande size and the object (_socket is a QTcpSocket):
void BNetwork::sendData(void)
{
    QByteArray paquet;
    QDataStream out(&paquet, QIODevice::WriteOnly);
    Commande cmd;
    cmd.setCmd(1);
    cmd.setFrom("Paris");
    cmd.setIP("127.0.0.1");
    cmd.setLogin("test1");
    cmd.setMdp("mdp1");
    cmd.setRep(0);
    cmd.setTo("maryline");
    out << (quint32) 0;
    out << cmd;
    out.device()->seek(0);
    out << (paquet.size() - (int)sizeof(quint32));
    this->_socket.write(paquet);
}
QDataStream &operator<<(QDataStream &out, Commande &cmd)
{
    QString test(cmd.from.c_str());
    out << (quint32)cmd.nbCmd;
    out << test;
    test = cmd.IP.c_str();
    out << test;
    test = cmd.mdp.c_str();
    out << test;
    out << (quint32)cmd.rep;
    test = cmd.to.c_str();
    out << test;
    return out;
}
This is the function I use to convert the header size I received from the Qt client:
std::string save = this->connection->getIheader(); // this is the string I read from the socket
std::istringstream stream(save);
std::cout << save << std::endl;
if (!(stream >> std::dec >> this->Isize))
    throw my_exception("error in endRead()");
For an unknown reason, save contains "1000", and when I try to convert this string into an integer it doesn't work, so I think the value I'm getting isn't correct.
The client and the server are both running on Windows 7, 64-bit.
Do you have a solution to my problem?
First, confirm that when you use out.device()->seek(0), you cause the next writes to prepend as you expect, rather than overwriting the data you've already written, as I expect.
Then, consult this answer to a question that very likely has the same or similar problem.
Let me know in a comment if you need more help.
ETA: QString stores data as 16-bit QChar, in order to support Unicode. std::string and std::istringstream are reading 8-bit char. See also this answer. The QChars serialized by Qt are likely to cause you trouble down the line. Note also that any 0 (null) char that happens to be in the char sequence returned by getIheader() will terminate std::string's assignment operator.
I recommend replacing the std::istringstream with a QDataStream, and using that class to read out your data into exactly the types you originally wrote. You can cast from Qt types to native C++ types after you've put serialization behind you. Otherwise, you'll have to take a lot of care to figure out what Qt is doing under the hood and match it through your own effort.
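A minimal sketch of that approach (readHeader is a hypothetical helper; it assumes the raw bytes from getIheader() are available as a std::string):
#include <QByteArray>
#include <QDataStream>
#include <string>

quint32 readHeader(const std::string& raw)
{
    // Wrap the raw bytes in a QByteArray and let QDataStream decode them,
    // matching the type (quint32) and default big-endian byte order the client wrote.
    QByteArray bytes(raw.data(), int(raw.size()));
    QDataStream in(bytes);
    quint32 size = 0;
    in >> size;
    return size;
}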