Read and write in C++

I am trying to use the system calls read() and write(). The following program creates a file and writes some data into it. Here is the code:
#include <fcntl.h>
#include <unistd.h>

int main()
{
    int fd;
    open("student", O_CREAT, (mode_t)0600);
    fd = open("student", O_WRONLY);
    char data[128] = "Hi nikhil, How are u?";
    write(fd, data, 128);
}
Upon execution of the above program, I got a file named student, created with a size of 128 bytes. I then wrote a small program to read the data back from the file. Here is the code:
#include <fcntl.h>
#include <unistd.h>
#include <iostream>
using namespace std;

int main()
{
    int fd = open("student", O_WRONLY);
    char data[128];
    read(fd, data, 128);
    cout << data << endl;
}
But the output I get is junk characters. Why is this so?

Don't read from a file that you've opened in O_WRONLY mode!
Do yourself a favor and always check the return values of IO functions.
You should also always close file descriptors you've (successfully) opened. It might not matter for trivial code like this, but if you get into the habit of forgetting, you'll end up writing code that leaks file descriptors, and that's a bad thing.

You're not checking whether read() returns an error. You should do so, because that's probably the case with the code in your question.
Since you're opening the file write-only in the first place, calling read() on it will result in an error. You should open the file for reading instead:
char data[128];
int fd = open("student", O_RDONLY);
if (fd != -1) {
    if (read(fd, data, sizeof(data)) != -1) {
        // Process data...
    }
    close(fd);
}

Well, one of the first things is that your data is not 128 bytes. Your data is the string "Hi nikhil, How are u?", which is 21 characters plus a terminating NUL, yet you write all 128 bytes of the array to the file. Because the array is initialized from a string literal, the remaining 106 bytes are zero-filled, so the file ends up as the string followed by NUL padding. The junk characters you see are printed because the read() in your second program fails (the file is opened O_WRONLY) and leaves the array uninitialized.
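Putting the advice in these answers together, a corrected writer might look like this (a minimal sketch: it creates and opens the file in one call, writes only the meaningful bytes, checks each result, and closes the descriptor):

#include <fcntl.h>
#include <unistd.h>
#include <cstring>
#include <cstdio>

int main()
{
    // create and open for writing in a single call, checking the result
    int fd = open("student", O_CREAT | O_WRONLY, (mode_t)0600);
    if (fd == -1) {
        perror("open");
        return 1;
    }
    const char data[] = "Hi nikhil, How are u?";
    // write only the string and its terminator, not the whole 128-byte array
    if (write(fd, data, strlen(data) + 1) == -1) {
        perror("write");
    }
    close(fd);
    return 0;
}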

Related

Unable to correctly read bmp file

I am trying to read certain information from a bmp file, basically the file type (i.e. the "BM" bytes at the start of my bmp file). I start by opening the file, which happens correctly. The first fread is, however, failing. Why is this happening?
#include <stdio.h>
#include <string.h>

#define SIZE 1

int main(void)
{
    FILE* fd = NULL;
    char buff[2];
    unsigned int i = 0, size = 0, offset = 0;
    memset(buff, 0, sizeof(buff));
    fd = fopen("RIT.bmp", "r+");
    if (NULL == fd)
    {
        printf("\n fopen() Error!!!\n");
        return 1;
    }
    printf("\n File opened successfully\n");
    if (SIZE*2 != fread(buff, SIZE, 2, fd)) // to read the file type (i.e. "BM")
    {
        printf("\n first fread() failed\n");
        return 1;
    }
    return 0;
}
Output
File opened successfully
first fread() failed
Press any key to continue . . .
Update
Yes, the file was empty due to an earlier error. That is why this error occurred.
Your file probably doesn't have enough data (2 bytes). It gives the correct output when I test with a file larger than 2 bytes; the same code fails for an empty file.
From the man page: "Upon successful completion, fread() shall return the number of elements successfully read [...]."
That would be 2, not SIZE*2.
Although, on second thought, SIZE is 1, so while the comparison is error-prone, it is not actually wrong. In that case, the second part of the sentence applies: "... which is less than nitems only if a read error or end-of-file is encountered." And as others said, check the global errno to see what went wrong if the file really is long enough. Maybe it's time for a new SSD.
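To tell end-of-file apart from a genuine read error after a short fread, use feof() and ferror(). A minimal sketch (reusing the file name from the question):

#include <stdio.h>

int main(void)
{
    FILE* fd = fopen("RIT.bmp", "rb"); // binary mode, so no byte translation
    if (fd == NULL)
    {
        perror("fopen");
        return 1;
    }
    char buff[2];
    if (fread(buff, 1, 2, fd) != 2)
    {
        if (feof(fd))
            printf("end of file before 2 bytes were read\n");
        else if (ferror(fd))
            perror("fread");
        fclose(fd);
        return 1;
    }
    printf("magic bytes: %c%c\n", buff[0], buff[1]);
    fclose(fd);
    return 0;
}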

Size error on read file

RESOLVED
I'm trying to make a simple file loader.
I aim to get the text from a shader file (plain text file) into a char* that I will compile later.
I've tried this function:
char* load_shader(char* pURL)
{
    FILE *shaderFile;
    char* pShader;
    // File opening
    fopen_s(&shaderFile, pURL, "r");
    if (shaderFile == NULL)
        return "FILE_ER";
    // File size
    fseek(shaderFile, 0, SEEK_END);
    int lSize = ftell(shaderFile);
    rewind(shaderFile);
    // Allocating size to store the content
    pShader = (char*) malloc(sizeof(char) * lSize);
    if (pShader == NULL)
    {
        fputs("Memory error", stderr);
        return "MEM_ER";
    }
    // copy the file into the buffer:
    int result = fread(pShader, sizeof(char), lSize, shaderFile);
    if (result != lSize)
    {
        // size of file 106/113
        cout << "size of file " << result << "/" << lSize << endl;
        fputs("Reading error", stderr);
        return "READ_ER";
    }
    // Terminate
    fclose(shaderFile);
    return 0;
}
But as you can see in the code, I get a strange size difference at the end of the process, which makes my function crash.
I must say I'm quite a beginner in C, so I might have missed some subtleties regarding memory allocation, types, pointers...
How can I solve this size issue?
EDIT 1:
First, I shouldn't return 0 at the end but pShader; that seemed to be what crashed the program.
Then, I changed the type of result to size_t and added an end character to pShader by adding pShader[result] = '\0'; right after result's declaration, so I can display the text correctly.
Finally, as @JamesKanze suggested, I turned fopen_s into fopen, as the former was not useful in my case.
First, for this sort of raw access, you're probably better off using the system level functions: CreateFile or open, ReadFile or read, and CloseHandle or close, with GetFileSize or stat to get the size. Using FILE* or std::filebuf will only introduce an additional level of buffering and processing, for no gain in your case.

As to what you are seeing: there is no guarantee that an ftell will return anything exploitable as a numeric value; it could very well be just a magic cookie. On most current systems, it is a byte offset into the physical file, but on any non-Unix system, the offset into the physical file will not map directly to the logical file you are reading unless you open the file in binary mode. If you use "rb" to open the file, you'll probably see the same values. (Theoretically, you could get extra 0's at the end of the file, but practically, the OS's where that happened are either extinct or only used on legacy mainframes.)
EDIT:
Since the answer stating this has been deleted: you should loop on the fread until it returns 0 (setting errno to 0 before each call, and checking it after the return to see whether the function returned because of an error or because it reached the end of file). Having said this: if you're on one of the usual Windows or Unix systems, and the file is local to the machine and not too big, fread will read it all in one go. The difference in size you are seeing (given the numerical values you posted) is almost certainly due to the fact that the two-byte Windows line endings are being mapped to a single '\n' character. To avoid this, you must open in binary mode; alternatively, if you really are dealing with text (and want this mapping), you can just ignore the extra bytes in your buffer, setting the '\0' terminator after the last byte actually read.
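Combining the asker's EDIT 1 fixes with the binary-mode advice above, a corrected sketch of the function might look like this (it returns NULL on every failure instead of the original error strings, which is an assumption about how the caller handles errors):

#include <stdio.h>
#include <stdlib.h>

char* load_shader(const char* pURL)
{
    // binary mode, so ftell's result matches the bytes fread will deliver
    FILE* shaderFile = fopen(pURL, "rb");
    if (shaderFile == NULL)
        return NULL;
    fseek(shaderFile, 0, SEEK_END);
    long lSize = ftell(shaderFile);
    rewind(shaderFile);
    if (lSize < 0)
    {
        fclose(shaderFile);
        return NULL;
    }
    char* pShader = (char*)malloc(lSize + 1); // +1 for the terminator
    if (pShader == NULL)
    {
        fclose(shaderFile);
        return NULL;
    }
    size_t result = fread(pShader, 1, lSize, shaderFile);
    pShader[result] = '\0'; // terminate after the bytes actually read
    fclose(shaderFile);
    return pShader; // the caller must free() this
}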

C++ reading leftover data at the end of a file

I am taking input from a file in binary mode using C++; I read the data into unsigned ints, process them, and write them to another file. The problem is that sometimes, at the end of the file, there might be a little bit of data left that isn't large enough to fit into an int; in this case, I want to pad the end of the file with 0s and record how much padding was needed, until the data is large enough to fill an unsigned int.
Here is how I am reading from the file:
std::ifstream fin;
fin.open("filename.whatever", std::ios::in | std::ios::binary);
if (fin) {
    unsigned int m;
    while (fin >> m) {
        //processing the data and writing to another file here
    }
    //TODO: read the remaining data and pad it here prior to processing
} else {
    //output to error stream and exit with failure condition
}
The TODO in the code is where I'm having trouble. After the file input finishes and the loop exits, I need to read in the remaining data at the end of the file that was too small to fill an unsigned int. I need to then pad the end of that data with 0's in binary, recording enough about how much padding was done to be able to un-pad the data in the future.
How is this done, and is this already done automatically by C++?
NOTE: I cannot read the data into anything but an unsigned int, as I am processing the data as if it were an unsigned integer for encryption purposes.
EDIT: It was suggested that I simply read what remains into an array of chars. Am I correct in assuming that this will read in ALL remaining data from the file? It is important to note that I want this to work on any file that C++ can open for input and/or output in binary mode. Thanks for pointing out that I failed to include the detail of opening the file in binary mode.
EDIT: The files my code operates on are not created by anything I have written; they could be audio, video, or text. My goal is to make my code format-agnostic, so I can make no assumptions about the amount of data within a file.
EDIT: ok, so based on constructive comments, this is something of the approach I am seeing, documented in comments where the operations would take place:
std::ifstream fin;
fin.open("filename.whatever", std::ios::in | std::ios::binary);
if (fin) {
    unsigned int m;
    while (fin >> m) {
        //processing the data and writing to another file here
    }
    //1: declare char array
    //2: fill it with what remains in the file
    //3: fill the rest of it until it's the same size as an unsigned int
} else {
    //output to error stream and exit with failure condition
}
The question, at this point, is this: is this truly format-agnostic? In other words, are bytes used to measure file size as discrete units, or can a file be, say, 11.25 bytes in size? I should know this, I know, but I've got to ask it anyway.
Are bytes used to measure file size as discrete units, or can a file be, say, 11.25 bytes in size?
No data type can be less than a byte, and your file is represented as an array of char, meaning each character is one byte. Thus it is impossible to get anything other than a whole number of bytes.
Here is step one, two, and three as per your post:
while (fin >> m)
{
    // ...
}
fin.clear(); // clear the failbit so the stream can be read again
std::ostringstream buffer;
buffer << fin.rdbuf(); // slurp whatever bytes remain
std::string contents = buffer.str();
if (!contents.empty()) {
    // pad with 0 bytes up to the size of an unsigned int,
    // recording how much padding was added
    std::size_t padding = sizeof(unsigned int) - contents.size();
    contents.append(padding, '\0');
}
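For what it's worth, here is a sketch of the same idea using unformatted input (istream::read) instead of operator>>; formatted extraction skips whitespace and parses digits, which is not what you want on binary data, while read() with gcount() makes the final partial chunk easy to detect and pad (the file name is reused from the question):

#include <fstream>
#include <cstring>
#include <iostream>

int main() {
    std::ifstream fin("filename.whatever", std::ios::in | std::ios::binary);
    if (!fin) return 1;
    unsigned int m;
    while (fin.read(reinterpret_cast<char*>(&m), sizeof m)) {
        // process a complete unsigned int here
    }
    std::streamsize leftover = fin.gcount(); // bytes in the final partial read
    if (leftover > 0) {
        // zero the unread tail of m and record how much padding was used
        std::memset(reinterpret_cast<char*>(&m) + leftover, 0,
                    sizeof m - leftover);
        std::cout << "padded " << (sizeof m - leftover) << " bytes\n";
        // process the padded final unsigned int here
    }
    return 0;
}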

C++ copying files. Short on data

I'm trying to copy a file, but whatever I try, the copy seems to be a few bytes short.
_file is an ifstream set to binary mode.
void FileProcessor::send()
{
    //If no file is opened return
    if (!_file.is_open()) return;
    //Reset position to beginning
    _file.seekg(0, ios::beg);
    //Result buffer
    char * buffer;
    char * partBytes = new char[_bufferSize];
    //Packet *p;
    //Read the file and send it over the network
    while (_file.read(partBytes, _bufferSize))
    {
        //buffer = Packet::create(Packet::FILE,std::string(partBytes));
        //p = Packet::create(buffer);
        //cout<< p->getLength() << "\n";
        //writeToFile(p->getData().c_str(),p->getLength());
        writeToFile(partBytes, _bufferSize);
        //delete[] buffer;
    }
    //cout<< *p << "\n";
    delete [] partBytes;
}
_writeFile is the file to be written to.
void FileProcessor::writeToFile(const char *buffer, unsigned int size)
{
    if (_writeFile.is_open())
    {
        _writeFile.write(buffer, size);
        _writeFile.flush();
    }
}
In this case I'm trying to copy a zip file.
But opening both the original and the copy in Notepad, I noticed that while they look identical, they differ at the end, where the copy is missing a few bytes.
Any suggestions?
You are assuming that the file's size is a multiple of _bufferSize. You have to check what's left on the buffer after the while:
while (_file.read(partBytes, _bufferSize)) {
    writeToFile(partBytes, _bufferSize);
}
if (_file.gcount())
    writeToFile(partBytes, _file.gcount());
Your while loop will terminate when it fails to read _bufferSize bytes because it hits an EOF.
The final call to read() might have read some data (just not a full buffer) but your code ignores it.
After your loop you need to check _file.gcount() and if it is not zero, write those remaining bytes out.
Are you copying from one type of media to another? Perhaps different sector sizes are causing the apparent weirdness.
What if _bufferSize doesn't divide evenly into the size of the file? The final partial chunk would never be written by that loop.
You don't want to always do writeToFile(partBytes,_bufferSize); since it's possible (at the end) that less than _bufferSize bytes were read. Also, as pointed out in the comments on this answer, the ifstream is no longer "true" once the EOF is reached, so the last chunk isn't copied (this is your posted problem). Instead, use gcount() to get the number of bytes read:
do
{
    _file.read(partBytes, _bufferSize);
    writeToFile(partBytes, (unsigned int)_file.gcount());
} while (_file);
For comparisons of zip files, you might want to consider using a non-text editor to do the comparison; HxD is a great (free) hex editor with a file compare option.
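Pulling these answers together, a self-contained copy loop might look like this sketch (the file names are just examples):

#include <fstream>

int main() {
    std::ifstream in("source.zip", std::ios::binary);
    std::ofstream out("copy.zip", std::ios::binary);
    if (!in || !out) return 1;
    char buf[4096];
    do {
        in.read(buf, sizeof buf);
        // gcount() reports how many bytes the last read actually produced,
        // which may be less than the buffer size on the final chunk
        out.write(buf, in.gcount());
    } while (in);
    return 0;
}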

How do I read the results of a system() call in C++?

I'm using the following code to try to read the results of a df command in Linux using popen.
#include <iostream> // file and std I/O functions

int main(int argc, char** argv) {
    FILE* fp;
    char * buffer;
    long bufSize;
    size_t ret_code;
    fp = popen("df", "r");
    if (fp == NULL) { // head off errors reading the results
        std::cerr << "Could not execute command: df" << std::endl;
        exit(1);
    }
    // get the size of the results
    fseek(fp, 0, SEEK_END);
    bufSize = ftell(fp);
    rewind(fp);
    // allocate the memory to contain the results
    buffer = (char*)malloc(sizeof(char) * bufSize);
    if (buffer == NULL) {
        std::cerr << "Memory error." << std::endl;
        exit(2);
    }
    // read the results into the buffer
    ret_code = fread(buffer, 1, sizeof(buffer), fp);
    if (ret_code != bufSize) {
        std::cerr << "Error reading output." << std::endl;
        exit(3);
    }
    // print the results
    std::cout << buffer << std::endl;
    // clean up
    pclose(fp);
    free(buffer);
    return (EXIT_SUCCESS);
}
This code is giving me a "Memory error" with an exit status of '2', so I can see where it's failing; I just don't understand why.
I put this together from example code that I found on Ubuntu Forums and C++ Reference, so I'm not married to it. If anyone can suggest a better way to read the results of a system() call, I'm open to new ideas.
EDIT to the original: Okay, bufSize is coming up negative, and now I understand why. You can't randomly access a pipe, as I naively tried to do.
I can't be the first person to try to do this. Can someone give (or point me to) an example of how to read the results of a system() call into a variable in C++?
You're making this all too hard. popen(3) returns a regular old FILE * for a standard pipe file, which is to say, newline terminated records. You can read it with very high efficiency by using fgets(3) like so in C:
#include <stdio.h>

char bfr[BUFSIZ];
FILE * fp;
// ...
if ((fp = popen("/bin/df", "r")) == NULL) {
    // error processing and return
}
// ...
while (fgets(bfr, BUFSIZ, fp) != NULL) {
    // process a line
}
In C++ it's even easier --
#include <cstdio>
#include <fstream>
#include <string>

FILE * fp;
if ((fp = popen("/bin/df", "r")) == NULL) {
    // error processing and exit
}
// note: constructing an ifstream from a file descriptor is a
// nonstandard extension; it is not available in every library
std::ifstream ins(fileno(fp));
std::string s;
while (std::getline(ins, s)) {
    // do something with the line
}
There's some more error handling there, but that's the idea. The point is that you treat the FILE * from popen just like any FILE *, and read it line by line.
Why would std::malloc() fail?
The obvious reason is "because std::ftell() returned a negative signed number, which was then treated as a huge unsigned number".
According to the documentation, std::ftell() returns -1 on failure. One obvious reason it would fail is that you cannot seek in a pipe or FIFO.
There is no escape; you cannot know the length of the command output without reading it, and you can only read it once. You have to read it in chunks, either growing your buffer as needed or parsing on the fly.
But, of course, you can simply avoid the whole issue by directly using the system call df probably uses to get its information: statvfs().
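For instance, here is a sketch of pulling the numbers df reports for one filesystem straight from statvfs() (the mount point "/" is just an example):

#include <sys/statvfs.h>
#include <cstdio>
#include <iostream>

int main() {
    struct statvfs vfs;
    if (statvfs("/", &vfs) != 0) {
        perror("statvfs");
        return 1;
    }
    // f_frsize is the fragment size; the block counts are in those units
    unsigned long long total = (unsigned long long)vfs.f_blocks * vfs.f_frsize;
    unsigned long long avail = (unsigned long long)vfs.f_bavail * vfs.f_frsize;
    std::cout << "total bytes: " << total
              << ", available to unprivileged users: " << avail << std::endl;
    return 0;
}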
(A note on terminology: "system call" in Unix and Linux generally refers to calling a kernel function from user-space code. Referring to it as "the results of a system() call" or "the results of a system(3) call" would be clearer, but it would probably be better to just say "capturing the output of a process.")
Anyway, you can read a process's output just like you can read any other file. Specifically:
You can start the process using pipe(), fork(), and exec(). This gives you a file descriptor, then you can use a loop to read() from the file descriptor into a buffer and close() the file descriptor once you're done. This is the lowest level option and gives you the most control; a sketch follows this list.
You can start the process using popen(), as you're doing. This gives you a file stream. In a loop, you can read from the stream into a temporary variable or buffer using fread(), fgets(), or fgetc(), as Zarawesome's answer demonstrates, then process that buffer or append it to a C++ string.
You can start the process using popen(), then use the nonstandard __gnu_cxx::stdio_filebuf to wrap that, then create an std::istream from the stdio_filebuf and treat it like any other C++ stream. This is the most C++-like approach. Here's part 1 and part 2 of an example of this approach.
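Here is a sketch of that first, lowest-level option, with error handling kept minimal:

#include <unistd.h>
#include <sys/wait.h>
#include <string>
#include <iostream>

int main() {
    int pipefd[2];
    if (pipe(pipefd) == -1) return 1;
    pid_t pid = fork();
    if (pid == -1) return 1;
    if (pid == 0) {
        // child: point stdout at the pipe's write end, then run df
        dup2(pipefd[1], STDOUT_FILENO);
        close(pipefd[0]);
        close(pipefd[1]);
        execlp("df", "df", (char*)NULL);
        _exit(127); // only reached if exec failed
    }
    // parent: read the child's output from the pipe's read end
    close(pipefd[1]);
    std::string output;
    char buf[4096];
    ssize_t n;
    while ((n = read(pipefd[0], buf, sizeof buf)) > 0)
        output.append(buf, n);
    close(pipefd[0]);
    waitpid(pid, NULL, 0);
    std::cout << output;
    return 0;
}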
I'm not sure you can fseek/ftell pipe streams like this.
Have you checked the value of bufSize? One reason malloc might fail is an insanely sized buffer.
Thanks to everyone who took the time to answer. A co-worker pointed me to the ostringstream class. Here's some example code that does essentially what I was attempting to do in the original question.
#include <iostream> // cout
#include <sstream>  // ostringstream
#include <cstdio>   // popen, fread, pclose

int main(int argc, char** argv) {
    FILE* stream = popen("df", "r");
    std::ostringstream output;
    while (!feof(stream) && !ferror(stream))
    {
        char buf[128];
        int bytesRead = fread(buf, 1, 128, stream);
        output.write(buf, bytesRead);
    }
    pclose(stream);
    std::string result = output.str();
    std::cout << "<RESULT>" << std::endl << result << "</RESULT>" << std::endl;
    return 0;
}
To answer the question in the update:
char buffer[1024];
char * line = NULL;
while ((line = fgets(buffer, sizeof buffer, fp)) != NULL) {
    // parse one line of df's output here.
}
Would this be enough?
First thing to check is the value of bufSize - if that happens to be <= 0, chances are that malloc returns a NULL as you're trying to allocate a buffer of size 0 at that point.
Another workaround would be to ask malloc to provide you with a buffer of the size (bufSize + n) with n >= 1, which should work around this particular problem.
That aside, the code you posted is pure C, not C++, so including <iostream> is overdoing it a little.
Check your bufSize. ftell can return -1 on error, and that can lead to a failed allocation, with malloc returning NULL for buffer.
The reason ftell fails here is the popen: you can't seek in pipes.
Pipes are not random access. They're sequential, which means that once you read a byte, the pipe is not going to send it to you again. Which means, obviously, you can't rewind it.
If you just want to output the data back to the user, you can just do something like:
// your file opening code
int c;
while ((c = getc(fp)) != EOF)
{
    std::cout << (char)c;
}
This will pull bytes out of the df pipe, one by one, and pump them straight into the output.
Now if you want to access the df output as a whole, you can either pipe it into a file and read that file, or concatenate the output into a construct such as a std::string.