Read and write a .jpg file in C++

My supervisor developed a simulator (it's a collection of code) that reads data from a file and converts it to a signal (for example an optical signal), and the simulator then saves this signal to a .sgn file.
He asked me to read a .jpg image in VS 2019, convert it to a signal of type byte, and then save the signal to a .sgn file. However, when I save the signal and change its extension to .jpg (in order to make sure that the signal contains the image data), it cannot be opened.
I compared the information of the original image and my signal, and I can see some extra garbage in the signal, as shown in the pictures.
Original image
Resulting image (signal image)
My questions are (thank you so much in advance):
In order to solve this issue, should I read the image header separately?
Am I reading the image file correctly? (The simulator is huge, so I cannot post all of the code.)
Do you have any other ideas about where the problem is, e.g. the buffer or something else?
std::ifstream inFile;
inFile.open("1.jpg");
std::byte out; // type of output signal
int length = sizeof(std::byte);
char * memblock = new char[length];
for (int i = 0; i < process; i++) { // this loop is related to the circular buffer
    inFile.read(memblock, length);
    std::byte * byte_values = (std::byte*)memblock;
    out = *byte_values;
    outputSignals[0]->bufferPut(out); // related to saving the output signal
}
delete[] memblock;

JPEG and similar files contain binary data, so you should open the file in binary mode (std::ios::binary) for the data to be read correctly.
A simple example:
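This is only a minimal sketch: it keeps the byte-by-byte read and the file name from the question, adds the std::ios::binary flag, and leaves out the simulator's circular-buffer plumbing.
#include <cstddef>
#include <fstream>
#include <vector>

int main()
{
    // The binary flag is the important part; without it the stream translates
    // line endings and may stop early on certain bytes on Windows.
    std::ifstream inFile("1.jpg", std::ios::binary);
    if (!inFile) {
        return 1; // could not open the file
    }

    std::vector<std::byte> signal;
    char c;
    while (inFile.get(c)) {
        signal.push_back(static_cast<std::byte>(c));
    }

    // Writing the bytes back out (also in binary mode) should produce a file
    // identical to the original image.
    std::ofstream outFile("copy.jpg", std::ios::binary);
    outFile.write(reinterpret_cast<const char *>(signal.data()),
                  static_cast<std::streamsize>(signal.size()));
    return 0;
}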

Related

How to convert a string into a standard C FILE?

Hi, I have an image in memory and I want to send it through an external FTP library.
This FTP library accepts only a standard C FILE, and the sample code provided with the library reads data only from the hard disk. In my application I don't want to store my images on the hard disk and then read them back through a FILE variable; instead I want to do the conversion in memory, so it is faster and cleaner.
My image is in the form of a uchar *, but I can change it to std::string, QByteArray, or any other string-like type. Now I want to know how I can get a FILE that is filled with my image data, so I can avoid storing it on the hard disk and reading it again.
My pseudo code:
uchar * image = readImage();
FILE * New_Image = String2FileConverter(image); //I need this function
FTP_Upload(New_Image);
On POSIX systems, you can use fmemopen() to create a memory-backed FILE handle.
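A minimal sketch of that approach; the image bytes are placeholders standing in for the data that is already in memory, and the commented FTP_Upload call is the function name from the question.
#include <cstdio>

int main() {
    // Stand-in for the image data that is already in memory.
    unsigned char image[] = { 0xFF, 0xD8, 0xFF, 0xE0 };
    size_t imageSize = sizeof(image);

    // fmemopen gives a FILE* backed by the buffer; no disk I/O is involved.
    FILE* memFile = fmemopen(image, imageSize, "rb");
    if (memFile == nullptr) {
        return 1;
    }

    // memFile can now be handed to an API that expects a FILE*,
    // e.g. the FTP library's upload call from the question:
    // FTP_Upload(memFile);

    fclose(memFile);
    return 0;
}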

How to report progress of data read on a QuaGzipFile (QuaZIP library)

I am using QuaZIP 0.5.1 with Qt 5.1.1 for C++ on Ubuntu 12.04 x86_64.
My program reads a large gzipped binary file, usually 1 GB of uncompressed data or more, and performs some computations on it. It is not computationally intensive, and most of the time is spent on I/O. So if I can find out how much of the file has been read, I can report it on a progress bar and even provide an ETA estimate.
I open the file with:
QuaGzipFile gzip(fileName);
if (!gzip.open(QIODevice::ReadOnly))
{
    // report error
    return;
}
But there is no functionality in QuaGzipFile to find the file size or the current position.
I do not need the size and position of the uncompressed stream; the size and position of the compressed stream are fine, because a rough estimate of progress is enough.
Currently, I can find the size of the compressed file using QFile(fileName).size(). Also, I can easily find the current position in the uncompressed stream by keeping a running sum of the return values of gzip.read(). But these two numbers do not match.
I can alter the QuaZIP library and access internal zlib-related stuff, if that helps.
There is no reliable way to determine the total size of the uncompressed stream. See this answer for details and possible workarounds.
However, there is a way to get the position in the compressed stream:
QFile file(fileName);
file.open(QFile::ReadOnly);
QuaGzipFile gzip;
gzip.open(file.handle(), QuaGzipFile::ReadOnly);
while (true) {
    QByteArray buf = gzip.read(1000);
    // process buf
    if (buf.isEmpty()) { break; }
    QFile temp_file_object;
    temp_file_object.open(file.handle(), QFile::ReadOnly);
    double progress = 100.0 * temp_file_object.pos() / file.size();
    qDebug() << qRound(progress) << "%";
}
The idea is to open the file manually and use the file descriptor to get the position. QFile cannot track external position changes, so file.pos() will always be 0. So we create temp_file_object from the file descriptor, forcing QFile to query the current file position. I could use a lower-level API (such as lseek()) to get the file position, but I think this way is more cross-platform.
Note that this method is not very accurate and can give progress values larger than the real ones, because zlib can internally read and decode more data than you have consumed so far.
In zlib 1.2.4 and greater you can use the gzoffset() function to get the current position in the compressed file. The current version of zlib is 1.2.8.
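As a rough sketch of that, using plain zlib directly rather than QuaGzipFile (the file name and the total-size value are placeholders); gzoffset() reports the position in the compressed stream, which is exactly what a rough progress value needs.
#include <zlib.h>
#include <cstdio>

int main() {
    gzFile gz = gzopen("data.gz", "rb");   // placeholder file name
    if (gz == NULL) {
        return 1;
    }

    long totalCompressed = 1;  // placeholder: use the compressed file size,
                               // e.g. QFile(fileName).size() as in the question

    char buffer[4096];
    int nread;
    while ((nread = gzread(gz, buffer, sizeof(buffer))) > 0) {
        // process buffer ...
        double progress = 100.0 * gzoffset(gz) / totalCompressed;
        std::printf("%d%%\n", static_cast<int>(progress));
    }

    gzclose(gz);
    return 0;
}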
Using an ugly hack into zlib, I was able to find the position in the compressed stream.
First, I copied the definition of gz_stream from gzio.c (from the zlib 1.2.3.4 source) to the end of quagzipfile.cpp. Then I reimplemented the virtual function qint64 QIODevice::pos() const:
qint64 QuaGzipFile::pos() const
{
    gz_stream *s = (gz_stream *)d->gzd;
    return ftello64(s->file);
}
Since quagzipfile.cpp and quagzipfile.h seem to be independent of the other QuaZIP library files, maybe it would be better to copy the functionality I need from these files and avoid this hack?
The current version of the program is something like this:
const int bufferSize = 4096; // for example

QFile infile(fileName);
if (!infile.open(QIODevice::ReadOnly))
    return;
qint64 fileSize = infile.size();
infile.close();

QuaGzipFile gzip(fileName);
if (!gzip.open(QIODevice::ReadOnly))
    return;

qint64 nread;
char buffer[bufferSize];
while ((nread = gzip.read(buffer, bufferSize)) > 0)
{
    // use buffer
    int percent = 100.0 * gzip.pos() / fileSize;
    // report percent
}
gzip.close();

Reading the contents of a dynamically created buffer gives a wrong memory address on the second call to the function

Using Visual C++, I am trying to read an image from a stream. I do this by storing the stream in a buffer. I know at what location in the buffer the image is (it is the first file in the stream, and I know the size of the image, so I read and store the image from the buffer up to the size of the file, and that is correct; I am sure about it). The first time I read the image there is no problem; it works correctly. The code is as follows:
void ReadFromStream(IStream *pStream)
{
    // this stream contains the file contents
    ULONG cbRead;
    int size = 5348928;
    char *buffer = new char[size + 1];
    // store the stream in the buffer; now all the data is in the buffer
    HRESULT hr = pStream->Read(buffer, size, &cbRead);
    buffer[cbRead] = '\0';
    int location = 512;
    char FileContents[107643];
    // here I have the contents of the image in FileContents. I am sure about its
    // location. For the first call to ReadFromStream() it works fine.
    memcpy(FileContents, &buffer[location], SizeOfFile);
}
But I have to read the image a second time during the same run of the program. When I call ReadFromStream() a second time (with the same stream value; I can see while debugging that the stream value is the same), the buffer shows contents located far away from the image stored in it (the stream has the image file as its first file, but on the second call the buffer points to the data of another file). So the question is: how does the memory end up at this unexpected file?
Why does the buffer show data located far from the starting index? (On the second call to ReadFromStream() it should also show the image file as the starting file; why does it show a file far away from the image file?) My guess is that some memory is allocated and must be deleted, but where and how I don't know. Am I right?
Maybe it's because on the second call to ReadFromStream() the buffer already has some memory allocated; i.e. on the second call the buffer points to an address that does not start from zero (but I think it should).
Streams are like normal files in that they're sequential in nature and once you've read data, the "read cursor" is advanced and another call to Read() will read more data, and so on.
To seek backwards to re-read the same data again, use IStream::Seek(). For example, to go back to the start of the stream:
LARGE_INTEGER li = { 0 };
HRESULT hr = pStream->Seek(li, STREAM_SEEK_SET, NULL);
Not all streams support seeking, so you should always check the return code for errors.
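As a sketch only, the function from the question could rewind the stream at the top of every call so that each invocation sees the same data; the buffer size and overall structure are taken from the question, and error handling is abbreviated.
#include <windows.h>
#include <objidl.h>

void ReadFromStream(IStream *pStream)
{
    // Seek back to the start so every call reads the same bytes.
    LARGE_INTEGER zero = { 0 };
    HRESULT hr = pStream->Seek(zero, STREAM_SEEK_SET, NULL);
    if (FAILED(hr))
        return; // the stream does not support seeking

    const int size = 5348928;
    char *buffer = new char[size + 1];

    ULONG cbRead = 0;
    hr = pStream->Read(buffer, size, &cbRead);
    buffer[cbRead] = '\0';

    // ... copy the image out of buffer as before ...

    delete[] buffer; // release the buffer so nothing leaks between calls
}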

Protocol Buffers: saving data to disk and loading it back

I have an issue with storing Protobuf data to disk.
The application I have uses Protocol Buffers to transfer data over a socket (which works fine), but when I try to store the data to disk it fails.
Actually, saving the data reports no issues, but I cannot seem to load it again properly.
Any tips would be gladly appreciated.
void writeToDisk(DataList & dList)
{
    // open streams
    int fd = open("serializedMessage.pb", O_WRONLY | O_CREAT);
    google::protobuf::io::ZeroCopyOutputStream* fileOutput = new google::protobuf::io::FileOutputStream(fd);
    google::protobuf::io::CodedOutputStream* codedOutput = new google::protobuf::io::CodedOutputStream(fileOutput);

    // save data
    codedOutput->WriteLittleEndian32(PROTOBUF_MESSAGE_ID_NUMBER); // store with message id
    codedOutput->WriteLittleEndian32(dList.ByteSize());           // the size of the data I will serialize
    dList.SerializeToCodedStream(codedOutput);                    // serialize the data

    // close streams
    delete codedOutput;
    delete fileOutput;
    close(fd);
}
I've verified the data inside this function; dList contains the data I expect. The streams report that no errors occur and that a reasonable number of bytes were written to disk (the file is of reasonable size, too).
But when I try to read back the data, it does not work. Moreover, and this is really strange, if I append more data to this file, I can read the first messages (but not the one at the end).
void readDataFromFile()
{
    // open streams
    int fd = open("serializedMessage.pb", O_RDONLY);
    google::protobuf::io::ZeroCopyInputStream* fileinput = new google::protobuf::io::FileInputStream(fd);
    google::protobuf::io::CodedInputStream* codedinput = new google::protobuf::io::CodedInputStream(fileinput);

    // read back
    uint32_t sizeToRead = 0, magicNumber = 0;
    string parsedStr = "";
    codedinput->ReadLittleEndian32(&magicNumber);   // the message id-number I expect
    codedinput->ReadLittleEndian32(&sizeToRead);    // the reported data size, also what I expect
    codedinput->ReadString(&parsedStr, sizeToRead); // the size() of parsedStr is much less than it should be (sizeToRead)

    DataList dl = DataList();
    if (dl.ParseFromString(parsedStr)) // fails
    {
        // work with data if all okay
    }

    // close streams
    delete codedinput;
    delete fileinput;
    close(fd);
}
Obviously I have omitted some of the code here to simplify things.
As a side note, I have also tried to serialize the message to a string and save that string via CodedOutputStream. That does not work either. I have verified the contents of that string, though, so I guess the culprit must be the stream functions.
This is a Windows environment, C++ with Protocol Buffers and Qt.
Thank you for your time!
I solved this issue by switching from file descriptors to fstream, and from FileOutputStream to OstreamOutputStream.
Although I've seen examples using the former, it didn't work for me.
I found a nice code example hidden in Google's coded_stream header. link #1
Also, since I needed to serialize multiple messages to the same file using Protocol Buffers, this link was enlightening. link #2
For some reason, the output file is not 'complete' until I actually destruct the stream objects.
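A rough sketch of what that fstream/OstreamOutputStream variant might look like, reusing the structure of writeToDisk() from the question (DataList and PROTOBUF_MESSAGE_ID_NUMBER are the question's own names; this is a sketch, not the poster's exact code):
#include <fstream>
#include <google/protobuf/io/coded_stream.h>
#include <google/protobuf/io/zero_copy_stream_impl.h>

void writeToDisk(DataList &dList)
{
    // opening in binary mode matters on Windows
    std::ofstream output("serializedMessage.pb",
                         std::ios::out | std::ios::trunc | std::ios::binary);

    {
        google::protobuf::io::OstreamOutputStream rawOutput(&output);
        google::protobuf::io::CodedOutputStream codedOutput(&rawOutput);

        codedOutput.WriteLittleEndian32(PROTOBUF_MESSAGE_ID_NUMBER);
        codedOutput.WriteLittleEndian32(dList.ByteSize());
        dList.SerializeToCodedStream(&codedOutput);
    } // the stream objects must be destroyed before the file is complete (see above)

    output.close();
}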
The read failure was because the file was not opened for reading with O_BINARY. Change the file opening to this and it works:
int fd = open("serializedMessage.pb", O_RDONLY | O_BINARY);
The root cause is the same as here: "read() only reads a few bytes from file". You were very likely following an example in the protobuf documentation which opens the file in the same way, but it stops parsing on Windows when it hits a special character in the file.
Also, in more recent versions of the library, you can use protobuf::util::ParseDelimitedFromCodedStream to simplify reading size+payload pairs.
... the question may be ancient, but the issue still exists and this answer is almost certainly the fix to the original problem.
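For completeness, a hedged sketch of those delimited helpers; they live in google/protobuf/util/delimited_message_util.h in recent releases, and they write their own varint length prefix rather than the WriteLittleEndian32 prefix used in the question, so the two formats are not interchangeable. DataList is the message type from the question.
#include <fstream>
#include <google/protobuf/io/coded_stream.h>
#include <google/protobuf/io/zero_copy_stream_impl.h>
#include <google/protobuf/util/delimited_message_util.h>

void writeDelimited(const DataList &dList)
{
    std::ofstream output("delimited.pb", std::ios::binary | std::ios::trunc);
    // Writes a varint length prefix followed by the message payload.
    google::protobuf::util::SerializeDelimitedToOstream(dList, &output);
}

void readDelimited()
{
    std::ifstream input("delimited.pb", std::ios::binary);
    google::protobuf::io::IstreamInputStream rawInput(&input);
    google::protobuf::io::CodedInputStream codedInput(&rawInput);

    DataList dList;
    bool cleanEof = false;
    while (google::protobuf::util::ParseDelimitedFromCodedStream(
               &dList, &codedInput, &cleanEof)) {
        // use dList ...
    }
}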
Try using
codedinput->readRawBytes instead of ReadString
and
dl.ParseFromArray instead of ParseFromString
I'm not very familiar with protocol buffers, but ReadString might only read a field of type string.
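A small sketch of that suggestion using the C++ spellings; the C++ CodedInputStream method appears to be ReadRaw (readRawBytes looks like the Java name). DataList and the length prefix sizeToRead are from the question.
#include <cstdint>
#include <vector>
#include <google/protobuf/io/coded_stream.h>

bool readPayload(google::protobuf::io::CodedInputStream *codedinput,
                 uint32_t sizeToRead, DataList *dl)
{
    // Read the raw payload bytes, then parse the message from that buffer.
    std::vector<char> payload(sizeToRead);
    if (!codedinput->ReadRaw(payload.data(), static_cast<int>(payload.size())))
        return false;
    return dl->ParseFromArray(payload.data(), static_cast<int>(payload.size()));
}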

Reading a bmp file for steganography

I am trying to read a bmp file in C++ (Turbo C++), but I am not able to print the binary stream.
I want to encode a txt file into it and later decrypt it.
How can I do this? I have read that the bmp file header is 54 bytes, but how and where should I append the txt file in the bmp file?
I only know Turbo C++, so it would be helpful if you could provide a solution or suggestion related to the topic for that environment.
int main()
{
    ifstream fr; // reads
    ofstream fw; // writes to file
    char c;
    int random;
    clrscr();
    char file[2][100] = {"s.bmp", "s.txt"};
    fr.open(file[0], ios::binary); // file name, open mode; here input mode, i.e. read only
    if (!fr)
        cout << "File can not be opened.";
    fw.open(file[1], ios::app);    // file will be appended
    if (!fw)
        cout << "File can not be opened";
    while (!fr)
        cout << fr.get();          // the error should be here, but I am not able to find out what it is
    fr.close();
    fw.close();
    getch();
}
This code runs fine when I pass a txt file in binary mode.
EDIT:
while (!fr)
    cout << fr.get();
I am not able to see the binary data in the console.
This was working fine for text when I was passing a character parameter to fr.get(c).
I think your question is already answered:
Print an int in binary representation using C
Convert your char to an int and you are done (at least for the output part).
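For instance, a minimal sketch of the output part with a modern compiler (std::bitset is used here instead of the manual int conversion from the linked answer, and Turbo C++ itself may not have it; the file name is from the question):
#include <bitset>
#include <fstream>
#include <iostream>

int main()
{
    // Print every byte of the image as its 8-bit pattern.
    std::ifstream fr("s.bmp", std::ios::binary);
    char c;
    while (fr.get(c)) {
        std::cout << std::bitset<8>(static_cast<unsigned char>(c)) << ' ';
    }
    return 0;
}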
With steganography, from what little I know about it, you're not "appending" text. You're making subtle changes to the pixels (shading, etc.) to hide something that is not visually obvious but can be reverse-decrypted by examining the pixels. It should not have anything to do with the header.
So anyway, the point of my otherwise non-helpful answer is to encourage you to go and learn about the topic you're asking about, so that you can design your solution and THEN come and ask for specifics about implementation.
You need to modify the bit pattern, not append any text to the file.
One simple example:
Read the bitmap content (after the header) and sacrifice one bit from each byte to hold your content.
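A rough sketch of that idea, assuming an uncompressed 24-bit BMP with the usual 54-byte header (the file names and the message are placeholders):
#include <cstddef>
#include <fstream>
#include <iterator>
#include <string>
#include <vector>

int main()
{
    // Read the whole bitmap into memory.
    std::ifstream in("s.bmp", std::ios::binary);
    std::vector<char> bmp((std::istreambuf_iterator<char>(in)),
                          std::istreambuf_iterator<char>());

    const std::string message = "secret";   // placeholder payload
    const std::size_t headerSize = 54;      // standard BMP header size

    // Hide one message bit in the least significant bit of each pixel byte.
    for (std::size_t i = 0; i < message.size() * 8 &&
                            headerSize + i < bmp.size(); ++i) {
        int bit = (message[i / 8] >> (7 - i % 8)) & 1;
        bmp[headerSize + i] = static_cast<char>((bmp[headerSize + i] & ~1) | bit);
    }

    // Write the modified pixels out as a new image.
    std::ofstream out("s_stego.bmp", std::ios::binary);
    out.write(bmp.data(), static_cast<std::streamsize>(bmp.size()));
    return 0;
}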
If on Windows, recode to use CreateFile and see what the real error is. If on Linux, ditto for open(2). Once you have debugged the problem you can probably shift back to iostreams.