I am writing an application that is supposed to write an array of floats to a WAVE file. I am using a QDataStream for this, but I get output I can't explain: it seems like the QDataStream sometimes writes 32-bit floats and sometimes 40-bit floats. This messes up the entire output file, since it has to obey a strict format.
My code roughly looks like this:
float* array;
unsigned int nSamples;
void saveWAV(const QString& fileName) const
{
    QFile outFile(fileName);
    if (outFile.open(QIODevice::WriteOnly | QIODevice::Text))
    {
        QDataStream dataStream(&outFile);
        dataStream.setByteOrder(QDataStream::LittleEndian);
        dataStream.setFloatingPointPrecision(QDataStream::SinglePrecision);

        // ... do all the WAV file header stuff ...

        for (int ii = 0; ii < nSamples; ++ii)
            dataStream << array[ii];
    }
}
I can think of no reason how this code could have such a side effect, so I made a minimal example to find out what was going on. I replaced the for loop with this:
float temp1 = 1.63006e-33f;
float temp2 = 1.55949e-32f;
dataStream << temp1;
dataStream << temp1;
dataStream << temp2;
dataStream << temp1;
dataStream << temp2;
Then I opened the output file using Matlab and had a look at the bytes written to the file. Those were:
8b 6b 07 09 // this is indeed 1.63006e-33f (notice it's Little Endian)
8b 6b 07 09
5b f2 a1 0d 0a // I don't know what this is, but it's a byte too long
8b 6b 07 09
5b f2 a1 0d 0a
I chose the values pretty arbitrarily; they just happened to have this effect. Some values are exported as 4-byte numbers and others as 5-byte numbers. Does anyone have any idea what may be the cause of this?
Edit:
When checking the size of both floats, they do seem to be 4 bytes long, though:
qDebug() << sizeof(temp1); // prints '4'
qDebug() << sizeof(temp2); // prints '4'
The answer lies in the opening of the output file: the QIODevice::Text flag should have been left out, since it is a binary file. If the text flag is included, a 0d character is inserted before each 0a, so every float that happens to contain an 0a byte comes out one byte longer.
All credits for this answer go to the answers given in:
Length of float changes between 32 and 40 bit
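In code, the fix is simply to drop that flag when opening the file; the rest of saveWAV stays exactly the same:

QFile outFile(fileName);
if (outFile.open(QIODevice::WriteOnly))   // no QIODevice::Text for binary data
{
    QDataStream dataStream(&outFile);
    dataStream.setByteOrder(QDataStream::LittleEndian);
    dataStream.setFloatingPointPrecision(QDataStream::SinglePrecision);
    // ... header and sample writing as before ...
}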
Note: I'm not 100% sure I'm right below, and would love to hear that I'm wrong, but this is how I think it is:
QDataStream has its own serialization format, and while I did not check, the extra bytes are probably related to that. The point is, the class is not meant for what you are trying to do with it: writing out an arbitrary binary format. You can still use it, but I believe you then need to use only the writeRawData() method, and take care of byte order etc. yourself.
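For illustration, writing one float that way might look like this sketch (it assumes the dataStream from the question, plus #include <QtEndian> and <cstring>):

// Copy the float's bit pattern into an integer, force little-endian byte
// order, and write the 4 raw bytes; QDataStream's own float serialization
// is bypassed entirely.
float sample = 1.63006e-33f;
quint32 bits;
std::memcpy(&bits, &sample, sizeof bits);
bits = qToLittleEndian(bits);
dataStream.writeRawData(reinterpret_cast<const char*>(&bits), sizeof bits);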
I had a similar issue even though I was not using the QIODevice::Text flag. I found that adding the line
dataStream.device()->setTextModeEnabled(false);
solved the problem by making sure the stream's device is in binary mode.
Related
I have a utility that should copy files from one location to another.
The problem I have is that when reading X bytes using the QDataStream and writing them out, the number of bytes read/written ends up exceeding the number of bytes the file actually has. I see this problem happen with a number of files.
I am using QDataStream::readRawData() and QDataStream::writeRawData() to read from and write to the files, as shown below:
QDataStream in(&sourceFile);
QDataStream out(&newFile);

// Read/Write byte containers
qint64 fileBytesRead = 0;
quint64 fileBytesWritten = 0;
qint64 bytesWrittenNow = 0;
quint8* buffer = new quint8[bufSize];

while ((fileBytesRead = in.readRawData((char*)buffer, bufSize)) != 0) {
    // Check if we have a read/write mismatch
    if (fileBytesRead == -1) {
        printCritical(TAG, QString("Mismatch read/write: [R:%1/W:%2], total file write/max [W:%3/M:%4]. File may be corrupted, skipping...").arg(QString::number(fileBytesRead), QString::number(bytesWrittenNow), QString::number(fileBytesWritten), QString::number(storageFile.size)));
        // close source file handle
        sourceFile.close();
        // Close file handle
        newFile.close();
        return BackupResult::IOError;
    }

    // Write buffer to file stream
    bytesWrittenNow = out.writeRawData((const char*)buffer, fileBytesRead);

    // Check if we have a read/write mismatch
    if (bytesWrittenNow == -1) {
        printCritical(TAG, QString("Mismatch read/write: [R:%1/W:%2], total file write/max [W:%3/M:%4]. File may be corrupted, skipping...").arg(QString::number(fileBytesRead), QString::number(bytesWrittenNow), QString::number(fileBytesWritten), QString::number(storageFile.size)));
        // close source file handle
        sourceFile.close();
        // Close file handle
        newFile.close();
        return BackupResult::IOError;
    }

    // Add current buffer size to written bytes
    fileBytesWritten += bytesWrittenNow;

    if (fileBytesWritten > storageFile.size) {
        qWarning() << "Extra bytes read/written exceeding file length"; // <===== this line is hit every now and then
    }

    //...
This problem isn't consistent; it only happens every now and then, and I have no idea why. Does anyone have thoughts on a possible cause?
The name of the function QDataStream::writeRawData() sounds ideal for writing binary data. Unfortunately, that's only half of the story.
The open-mode of the file is relevant as well under certain conditions – e.g. if the QFile is opened on Windows with QIODevice::Text:
QIODevice::Text
When reading, the end-of-line terminators are translated to '\n'. When writing, the end-of-line terminators are translated to the local encoding, for example '\r\n' for Win32.
I prepared an MCVE to demonstrate that:
// Qt header:
#include <QtCore>

void write(const QString &fileName, const char *data, size_t size, QIODevice::OpenMode mode)
{
    qDebug() << "Open file" << fileName;
    QFile qFile(fileName);
    qFile.open(mode | QIODevice::WriteOnly);
    QDataStream out(&qFile);
    const int ret = out.writeRawData(data, size);
    qDebug() << ret << "bytes written.";
}

// main application
int main(int argc, char **argv)
{
    const char data[] = {
        '\x00', '\x01', '\x02', '\x03', '\x04', '\x05', '\x06', '\x07',
        '\x08', '\x09', '\x0a', '\x0b', '\x0c', '\x0d', '\x0e', '\x0f'
    };
    const size_t size = sizeof data / sizeof *data;
    write("data.txt", data, size, 0);
    write("test.txt", data, size, QIODevice::Text);
}
Built and tested in VS2017 on Windows 10:
Open file "data.txt"
16 bytes written.
Open file "test.txt"
16 bytes written.
Result inspected with the help of cygwin:
$ ls -l *.txt
-rwxrwx---+ 1 scheff Domänen-Benutzer 427 Jun 23 08:24 CMakeLists.txt
-rwxrwx---+ 1 scheff Domänen-Benutzer 16 Jun 23 08:37 data.txt
-rwxrwx---+ 1 scheff Domänen-Benutzer 17 Jun 23 08:37 test.txt
$
data.txt has 16 bytes as expected but test.txt has 17 bytes. Oops!
$ hexdump -C data.txt
00000000 00 01 02 03 04 05 06 07 08 09 0a 0b 0c 0d 0e 0f |................|
00000010
$ hexdump -C test.txt
00000000 00 01 02 03 04 05 06 07 08 09 0d 0a 0b 0c 0d 0e |................|
00000010 0f |.|
00000011
$
Obviously, the underlying Windows file function "corrected" the \n to \r\n – 09 0a 0b became 09 0d 0a 0b. Hence, there is one additional byte that was not part of the originally written data.
Similar effects may happen when the QFile is opened for reading with QIODevice::Text involved.
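For the copy utility above, that means making sure neither file is opened with QIODevice::Text. A sketch, assuming sourceFile and newFile are the QFile objects behind the two streams:

// Open strictly in binary mode: no QIODevice::Text, so no \n <-> \r\n translation.
sourceFile.open(QIODevice::ReadOnly);
newFile.open(QIODevice::WriteOnly);

QDataStream in(&sourceFile);
QDataStream out(&newFile);
// ... readRawData()/writeRawData() loop as in the question ...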
How can I read the header of an ADTS-encoded AAC file? I need it to get the buffer length for each frame so I can read out the whole AAC file, but I can't get the right values. Here is my code to read the header and extract the buffer length for each frame (bits 30 - 42), assuming big endian:
int main() {
    ifstream file("audio_adts.m4a", ios::binary);
    char header[7], buf[1024];
    int framesize;

    while (file.read(header, 7)) {
        memset(buf, 0, 1024);

        /* Get header bits 30 - 42 */
        framesize = (header[3]&240 | header[4] | header[5]&1);
        cout << "Framesize including header: " << framesize << endl;

        file.read(buf, framesize);
        /* Do something with buffer */
    }
    return 0;
}
The framesize I get with this code is 65, 45, 45, 45, -17 and then it stops because of the negative value. The actual framesizes are around 200.
Hexdump of first header:
0x000000: ff f9 50 40 01 3f fc
Your extraction of the framesize appears to be missing the shifts (<<) needed to move the extracted bits into the right positions.
The bit masks also do not look like they match the /* bits 30 - 42 */ comment.
In addition, change the char to unsigned char, as you will otherwise run into all kinds of sign-extension issues when doing this type of bit manipulation (which is the cause of your negative value error).
The way I calculated it:
unsigned int AAC_frame_len = ((AAC_44100_buf[3]&0x03)<<11|(AAC_44100_buf[4]&0xFF)<<3|(AAC_44100_buf[5]&0xE0)>>5);
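Putting those pieces together, the read loop from the question could look like this sketch (same file and variable names as the question; the masks and shifts follow the 13-bit aac_frame_length field in header bits 30 - 42, and note that this length includes the 7 header bytes already read):

#include <fstream>
#include <iostream>
using namespace std;

int main() {
    ifstream file("audio_adts.m4a", ios::binary);
    unsigned char header[7];   // unsigned, to avoid sign extension in the bit math
    char buf[8192];            // 13-bit length field => at most 8191 bytes per frame
    while (file.read(reinterpret_cast<char*>(header), 7)) {
        unsigned int framesize = ((header[3] & 0x03) << 11)
                               | ( header[4]         << 3)
                               | ((header[5] & 0xE0) >> 5);
        cout << "Framesize including header: " << framesize << endl;
        // aac_frame_length counts the whole frame, so the payload left to read is framesize - 7
        file.read(buf, framesize - 7);
        /* Do something with buffer */
    }
    return 0;
}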
Just wondering, if I read a PNG file as a binary file, and I know how to write the hex numbers into another plain txt or whatever file, then how can I recreate the PNG file with those hex numbers?
This is the code I use to read from a PNG file and write to another plain txt file:
unsigned char x;
ifile.open("foo.png", ios::binary);
ifile >> noskipws >> hex;
while (ifile >> x) {
    ofile << setw(2) << setfill('0') << (int)x;
    // do some formatting stuff to the ofile, ofile declaration omitted
    // some ifs to see if IEND is read in, which is definitely correct
    // if IEND, break, so the last four hex numbers in ofile are 49 45 4E 44
}
// read another 4 bytes and write them to ofile, which are AE 42 60 82, the checksum
The reason why I am doing this is that I have some PNG files which have some irrelevant messages after the IEND chunk, and I want to get rid of them, keep only the chunks belonging to the actual picture, and split them into different files. By "irrelevant messages" I mean they are not part of the picture itself, but I have some other use for them.
It's easy; you just need to read every two characters and convert them from hex back to binary.

unsigned char x;
char buf[3] = {0};
ifile.open("foo.hex");
while (ifile >> buf[0] >> buf[1]) {
    char *end;
    x = (unsigned char) strtol(buf, &end, 16);
    if (*end == 0)     // no conversion error
        ofile.put(x);  // output the byte (ofile should be an ofstream opened with ios::binary)
}
Hi everyone, I have an issue while reading binary data from a binary file, as follows:
File Content:
D3 EE EE 00 00 01 D7 C4 D9 40
char * afpContentBlock = new char[10];
ifstream inputStream(sInputFile, ios::in | ios::binary);
if (inputStream.is_open())
{
    inputStream.read(afpContentBlock, 10);
    int n = sizeof(afpContentBlock)/sizeof(afpContentBlock[0]); // Prints 4
    // Here I would like to check every byte, but no matter how I convert the
    // char[] afpContentBlock, it always cuts off at the first 0x00 byte.
}
I know this happens because of the 0x00 byte. Is there a way to handle it somehow?
I have tried to write the buffer out with an ofstream object, and that works fine since it writes out the whole 10 bytes. Anyway, I would like to loop through the whole byte array to check the byte values.
Thank you very much.
It's much easier to just get how many bytes you read from the ifstream like so:
if (inputStream.is_open())
{
    inputStream.read(afpContentBlock, 10);
    int bytesRead = (int)inputStream.gcount();
    for (int i = 0; i < bytesRead; i++)
    {
        // check each byte however you want
        // access with afpContentBlock[i]
    }
}
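For example, to dump every byte as hex without being cut off at the 0x00 values, something like this sketch works (it assumes the bytesRead and afpContentBlock from above and needs <iostream> and <iomanip>); the key point is to never treat the array as a NUL-terminated string:

for (int i = 0; i < bytesRead; i++)
{
    // cast through unsigned char so values above 0x7F don't sign-extend
    unsigned int value = static_cast<unsigned char>(afpContentBlock[i]);
    std::cout << std::hex << std::setw(2) << std::setfill('0') << value << ' ';
}
std::cout << std::dec << std::endl;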
I'm new to C++ and I have to do an assignment for school.
I need to copy a binary* file without using API calls or system-integrated commands. At school we use a Windows machine.
I've searched around a bit, and I found that the best way to copy data without using any APIs is to use iostream (ifstream/fstream).
Here's the code I'm using:
int Open(string Name){
    int length;
    char * buffer;
    ifstream is;
    fstream out;
    FILE* pFile;

    is.open(Name.c_str(), ios::binary);

    // get length of file:
    is.seekg(0, ios::end);
    length = is.tellg();
    is.seekg(0, ios::beg);

    // allocate memory:
    buffer = new char[length];

    // read data as a block:
    is.read(buffer, length);
    is.close();

    pFile = fopen("out.exe", "w");
    fclose(pFile);

    out.open("out.exe", ios::binary);
    out.write(buffer, length);
    out.close();

    delete[] buffer;
    return 0;
}
out.exe isn't working properly, and after looking at it in winhex.exe I see that the data has been modified, even though I'm not doing anything with it.
Can anyone help me?
*The file is a simple hello-world program; it shows a message box saying "hello world".
EDIT:
Sorry for my unresponsiveness, I was sleeping.
Anyway, I've opened both programs (the result and the original) in a hex editor.
It seems that, with everything I try, this line:
Offset 0 1 2 3 4 5 6 7 8 9 A B C D E F
00000200 4C 00 00 00 00 30 00 00 00 02 00 00 00 0D 0A 00 L 0
Changes into this:
Offset 0 1 2 3 4 5 6 7 8 9 A B C D E F
00000200 4C 00 00 00 00 30 00 00 00 02 00 00 00 0A 00 00 L 0
As you can see, somewhere during the reading or writing process a byte is being removed (or added, which sometimes happens as well).
Passing only ios_base::binary to fstream's ctor is not specified (in and/or out must be supplied too).
To avoid that, you could use ofstream (note the extra 'o') for out instead of fstream. As a bonus, this would avoid the need to first fopen the file with the "w" flag, since ofstream's ctor creates the file by default.
is.read(buffer, length) is not guaranteed to read length bytes.
I forget if the same is true for out.write or not.
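In code, that simplification might look like this sketch (it replaces both the fopen/fclose pair and the fstream):

// ofstream creates (and truncates) the file itself, and implies ios::out,
// so only the binary flag needs to be added explicitly.
std::ofstream out("out.exe", std::ios::binary);
out.write(buffer, length);
out.close();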
Let's make that a bit neater:
// Pass strings by const reference (just good habit).
// It may also save a copy, and it indicates that the function should
// not be messing with the name!
int Open(std::string const& Name, std::string const& outName)
{
    // Declare variables as close to their use as possible.
    // It is very C-like to declare all the variables at the
    // head of a function.

    // Use the constructor to open the file.
    std::ifstream is(Name.c_str(), std::ios::binary);
    if (!is) // Failed to open
    {   return -1;
    }

    // get length of file:
    is.seekg(0, std::ios::end);
    std::size_t length = is.tellg(); // Use the correct type. int is not correct.
    is.seekg(0, std::ios::beg);

    // allocate memory:
    // Using new/delete is risky. It makes the code not exception safe.
    // Also, because you have to manually tidy up the buffer, you can not
    // escape early. By using RAII the cleanup becomes automatic and there
    // is no need to track resources that need to be tidied.
    //
    // Look up the concept of RAII; it makes C++ life so much easier.
    // std::vector implements the new/delete internally using RAII.
    std::vector<char> buffer(length);

    std::size_t read = 0;
    do
    {
        // read() does not guarantee that it will read everything asked for,
        // so you need to do it in a loop if you want to read the whole thing
        // into a buffer.
        is.read(&buffer[read], length - read);
        std::size_t amount = is.gcount();
        if (amount == 0)
        {   return -2; // Something went wrong and it failed to read.
        }
        read += amount;
    } while(length != read);

    std::ofstream outFile(outName.c_str(), std::ios::binary);
    if (!outFile)
    {   return -3; // You may want to test this before spending all the time reading.
    }

    // Probably need to loop like we did for read.
    outFile.write(&buffer[0], length);

    return 0;
}
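A hypothetical call site, just to show the intended usage (the file names are made up, and the error message needs <iostream>):

int result = Open("hello.exe", "hello_copy.exe");
if (result != 0)
    std::cerr << "Copy failed with error code " << result << "\n";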
Generally, files end in a newline. That 0d0a ("\r\n") might not be a readable part of the source file. Windows usually uses "\r\n" for newline, while UNIX uses just "\n". For some reason, when it writes a new file, it's using just 0a for the final newline. It might be interesting to see what happens if you read in and copy the file you wrote the first time.
The short answer is, this is just the kind of problem that crops up when you use a Windows system. :D
To hack it, you could always unconditionally write an extra "\r" as the last thing you output.
I think that
ifstream src(source.c_str(), ios::binary);
ofstream dest(destination.c_str(), ios::binary | ios::trunc);
dest << src.rdbuf();
src.close();
dest.close();
would do the trick.