Below is the code:
#include <iostream>
#include <stdio.h>
#include <unistd.h>
#include <fstream>
#include <memory.h>
int main()
{
    std::ifstream file;
    file.open("/proc/meminfo");
    if (file.fail())
        return 0;
    file.seekg(0, std::ios::end);
    int fileLen = file.tellg();
    file.seekg(0, std::ios::beg);
    char buffer[fileLen + 1];
    memset(buffer, 0, fileLen + 1);
    file.read(buffer, fileLen + 1);
    if (file.fail())
        return 0;
    unsigned long long total = 0;
    unsigned long long free = 0;
    sscanf(buffer, "%*s %llu%*s%llu", &total, &free);
    file.close();
    return 1;
}
In the code, fileLen is -1, but I don't know the reason. If the ifstream opens an ordinary file, like 1.txt, the program works correctly.
Thanks in advance for your help.
The contents of /proc are not real files, and hence don't have actual sizes. Do not attempt to get their sizes, but instead simply read and parse them normally.
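For instance, here is a minimal sketch (the error handling and output are illustrative) that reads /proc/meminfo line by line with getline, never asking for a size, and extracts the two fields the original sscanf was after:

#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

int main()
{
    std::ifstream file("/proc/meminfo");
    if (!file)
        return 1;

    unsigned long long memTotal = 0, memFree = 0;
    std::string line;
    while (std::getline(file, line))  // read until EOF; no size needed
    {
        std::istringstream iss(line);
        std::string key;
        unsigned long long value;
        if (iss >> key >> value)      // lines look like "MemTotal: 16384 kB"
        {
            if (key == "MemTotal:")
                memTotal = value;
            else if (key == "MemFree:")
                memFree = value;
        }
    }
    std::cout << "MemTotal: " << memTotal << " kB, MemFree: " << memFree << " kB\n";
    return 0;
}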
Because this is not an ordinary file:
The proc filesystem is a pseudo-filesystem rooted at /proc that contains user-accessible objects that pertain to the runtime state of the kernel and, by extension, the executing processes that run on top of it. "Pseudo" is used because the proc filesystem exists only as a reflection of the in-memory kernel data structures it displays. This is why most files and directories within /proc are 0 bytes in size.
I think the reason could be that /proc/meminfo is not actually a file. /proc does not contain real files; its entries are just snapshots of the current state of the system.
http://tldp.org/LDP/Linux-Filesystem-Hierarchy/html/proc.html
I tried everything I could think of, but for some reason it doesn't store the data from the file into "data", even though the file does contain the written data.
#include <fcntl.h>
#include <unistd.h>
#include <errno.h>
#include <string.h>
#include <stdlib.h>
#include <iostream>
using namespace std;
int main()
{
    char data[69] = " ";
    int fd = open("./MyFile.txt", O_RDWR | O_CREAT | O_SYNC | O_RSYNC);
    write(fd, "HELLO", 5);
    read(fd, data, 5);
    cout << data << endl;
    return 0;
}
Could you please help me? I'm trying to learn file I/O, and I don't know whether it's the O_RDWR or something else that is wrong here.
From man write:
For a seekable file (i.e., one to which lseek(2) may be applied, for example, a regular file) writing takes place at the current file offset, and the file offset is incremented by the number of bytes actually written.
From man read:
On files that support seeking, the read operation commences at the current file offset, and the file offset is incremented by the number of bytes read.
You need to seek back to the start of the file after your write, if you want to then read what you just wrote:
lseek(fd, 0, SEEK_SET);
Always read and study the documentation for the functions that you use, particularly when they're not doing what you thought they would do.
The file descriptor position is at the end after the write(2). To read(2) from the beginning, you need to rewind the fd back to the beginning.
You can do with lseek:
write(fd, "HELLO", 5);
lseek(fd, 0, SEEK_SET);
read(fd, data, 5);
Also, you should add error checks for all of these system calls (open, read, write, lseek).
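Putting it together, a minimal corrected sketch might look like this (note that O_CREAT also requires a third mode argument, e.g. 0644, which the original call omitted):

#include <fcntl.h>
#include <unistd.h>
#include <cstdio>
#include <iostream>

int main()
{
    char data[6] = "";  // 5 bytes of payload plus a terminating NUL

    // O_CREAT requires a mode argument
    int fd = open("./MyFile.txt", O_RDWR | O_CREAT, 0644);
    if (fd < 0) { perror("open"); return 1; }

    if (write(fd, "HELLO", 5) != 5) { perror("write"); return 1; }

    // rewind to the start so the read sees what was just written
    if (lseek(fd, 0, SEEK_SET) == (off_t)-1) { perror("lseek"); return 1; }

    if (read(fd, data, 5) != 5) { perror("read"); return 1; }
    std::cout << data << std::endl;

    close(fd);
    return 0;
}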
Hello guys, I really need help with my C/C++ programming skills. I have to load a binary file, maybe a simple "hello world", and execute it directly from a buffer. So I loaded the binary file into a buffer and tried to set the program counter to the buffer, but it doesn't work correctly. Could you please help me with useful suggestions?
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/mman.h>
int main(int argc, const char * argv[]) {
    FILE *fileptr;
    char *buffer;
    long filelen;

    fileptr = fopen("helloworld", "rb");  // open file in binary mode
    fseek(fileptr, 0, SEEK_END);          // jump to the end of the file
    filelen = ftell(fileptr);             // get the current byte offset in the file
    rewind(fileptr);                      // jump back to the beginning of the file

    buffer = (char *)malloc((filelen + 1) * sizeof(char));
    fread(buffer, filelen, 1, fileptr);
    fclose(fileptr);

    int *ret;
    ret = (int *)&ret + 2;
    (*ret) = (int)*buffer;
}
The program's instructions are placed in a read-only region of process memory called the text (code) segment, and that layout is decided when the executable is generated by the compiler. The OS simply won't execute program instructions from the stack or the heap: those regions are normally mapped non-executable. Otherwise viruses would have been very easy to make....
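For completeness: the conventional way to get an executable region on Linux is mmap with PROT_EXEC. Below is a minimal sketch, assuming the buffer holds raw, position-independent machine code for the current CPU rather than a complete ELF executable (which would need a proper loader):

#include <cstring>
#include <cstddef>
#include <sys/mman.h>

// Copy raw machine code into an anonymous executable mapping and call it.
// `code` is assumed to be position-independent and to end in a return.
void run_from_buffer(const unsigned char *code, size_t len)
{
    void *exec = mmap(NULL, len, PROT_READ | PROT_WRITE | PROT_EXEC,
                      MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (exec == MAP_FAILED)
        return;
    memcpy(exec, code, len);
    ((void (*)())exec)();   // jump into the mapped instructions
    munmap(exec, len);
}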
I've been having a problem that I have not been able to solve as of yet. This problem is related to reading files; I've looked at threads, even on this website, and they do not seem to solve it. The problem is reading files that are larger than a computer's system memory. When I asked this question a while ago, I was referred to the following code.
string data("");
getline(cin, data);
std::ifstream is(data);  //, std::ifstream::binary);
if (is)
{
    // get length of file:
    is.seekg(0, is.end);
    int length = is.tellg();
    is.seekg(0, is.beg);
    // allocate memory:
    char *buffer = new char[length];
    // read data as a block:
    is.read(buffer, length);
    is.close();
    // print content:
    std::cout.write(buffer, length);
    delete[] buffer;
}
system("pause");
This code works well, apart from the fact that it eats memory like a fat kid in a candy store.
So after a lot of ghetto and unrefined programming, I was able to figure out a way to sort of fix the problem. However, I more or less traded one problem for another in my quest.
#include <iostream>
#include <vector>
#include <string>
#include <fstream>
#include <stdio.h>
#include <stdlib.h>
#include <iomanip>
#include <windows.h>
#include <cstdlib>
#include <thread>
using namespace std;

/*======================================================*/
string *fileName = new string("tldr");
char data[36];
int filePos(0); // the position in the file
int tmSize(0);  // the total size of the file
int split(32);
char buff;
int DNum(0);
/*======================================================*/

int getFileSize(std::string filename) // path to file
{
    FILE *p_file = NULL;
    p_file = fopen(filename.c_str(), "rb");
    fseek(p_file, 0, SEEK_END);
    int size = ftell(p_file);
    fclose(p_file);
    return size;
}

void fs()
{
    tmSize = getFileSize(*fileName);
    int AX(0);
    ifstream fileIn;
    fileIn.open(*fileName, ios::in | ios::binary);
    int n1, n2, n3;
    n1 = tmSize / 32;
    // does the processing
    while (filePos != tmSize)
    {
        fileIn.seekg(filePos, ios_base::beg);
        buff = fileIn.get();
        // to take into account small files
        if (tmSize < 32)
        {
            int Count(0);
            char MT[40];
            if (Count != tmSize)
            {
                MT[Count] = buff;
                cout << MT[Count]; // << endl;
                Count++;
            }
        }
        // anything larger than 32
        else
        {
            if (AX != split)
            {
                data[AX] = buff;
                AX++;
                if (AX == split)
                {
                    AX = 0;
                }
            }
        }
        filePos++;
    }
    int tz(0);
    filePos = filePos - 12;
    while (tz != 2)
    {
        fileIn.seekg(filePos, ios_base::beg);
        buff = fileIn.get();
        data[tz] = buff;
        tz++;
        filePos++;
    }
    fileIn.close();
}

void main()
{
    fs();
    cout << tmSize << endl;
    system("pause");
}
What I tried to do with this code is to work around the memory issue. Rather than allocating an amount of memory for a large file that simply does not exist on my system, I tried to use the memory I had instead, which is about 8 GB, though I only wanted to use maybe a few kilobytes of it if at all possible.
To give you a layout of what I am talking about I am going to write a line of text.
"Hello my name is cake please give me cake"
Basically what I did was read said piece of text letter by letter. Then I put those letters into a box that could store 32 of them; from there I could apply something like XOR and then write them out to another file.
The idea works, in a way, but it is horribly slow and leaves off parts of files.
So basically, how can I make something like this work without being slow or cutting off files? I would love to see how XOR works with very large files.
So if anyone has a better idea than what I have, then I would be very grateful for the help.
To read and process the file piece-by-piece, you can use the following snippet:
// Buffer size 1 megabyte (or any number you like)
size_t buffer_size = 1 << 20;
char *buffer = new char[buffer_size];
std::ifstream fin("input.dat");
while (fin)
{
    // Try to read the next chunk of data
    fin.read(buffer, buffer_size);
    // Get the number of bytes actually read
    size_t count = fin.gcount();
    // If nothing has been read, break
    if (!count)
        break;
    // Do whatever you need with the first count bytes in the buffer
    // ...
}
delete[] buffer;
The buffer size of 32 bytes, as you are using, is definitely too small. You make too many calls to library functions (and the library, in turn, makes calls to the OS (although probably not every time), which are typically slow, since they cause context switching). There is also no need for tell/seek.
If you don't need all the file content simultaneously, reduce the working set first - like a set of about 32 words - but since XOR can be applied sequentially, you may further simplify to a working set of constant size, like 4 kilobytes.
Now, you have the option to use the file reader is.read() in a loop, processing a small chunk of data each iteration, or to use memory mapping (mmap() on POSIX, CreateFileMapping/MapViewOfFile on Windows) to map the file content to a memory pointer through which you can perform both read and write operations.
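As a rough illustration of the memory-mapping route, here is a hypothetical sketch that XORs a whole file in place through a shared mapping, letting the kernel page data in and out instead of your code managing buffers (POSIX assumed; the path and key are placeholders):

#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int xor_file(const char *path, unsigned char key)
{
    int fd = open(path, O_RDWR);
    if (fd < 0)
        return -1;

    struct stat st;
    if (fstat(fd, &st) < 0) { close(fd); return -1; }

    // map the whole file; MAP_SHARED writes changes back to disk
    unsigned char *p = (unsigned char *)mmap(NULL, st.st_size,
                           PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (p == MAP_FAILED) { close(fd); return -1; }

    for (off_t i = 0; i < st.st_size; ++i)
        p[i] ^= key;   // XOR byte by byte; the kernel handles paging

    munmap(p, st.st_size);
    close(fd);
    return 0;
}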
I'm trying to ask a similar question to this post:
C: read binary file to memory, alter buffer, write buffer to file
but the answers didn't help me (I'm new to C++, so I couldn't understand all of it).
How do I have a loop access the data in memory, and go through line by line so that I can write it to a file in a different format?
This is what I have:
#include <fstream>
#include <iostream>
#include <string>
#include <sstream>
#include <vector>
#include <stdio.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <unistd.h>
#include <stdlib.h>
using namespace std;
int main()
{
    char *buffer;
    char linearray[250];
    int lineposition;
    double filesize;
    string linedata;
    string a;

    // obtain the file
    FILE *inputfile;
    inputfile = fopen("S050508-v3.txt", "r");

    // find the filesize
    fseek(inputfile, 0, SEEK_END);
    filesize = ftell(inputfile);
    rewind(inputfile);

    // load the file into memory
    buffer = (char *)malloc(sizeof(char) * filesize);  // allocate memory
    fread(buffer, filesize, 1, inputfile);             // read the file into memory
    fclose(inputfile);

    // check to see if the file is correct in memory
    cout.write(buffer, filesize);
    free(buffer);
}
I appreciate any help!
Edit (More info on the data):
My data is different files that vary between 5 and 10 GB. There are about 300 million lines of data. Each line looks like
M359
T359 3520 359
M400
A3592 zng 392
Where the first element is a character, and the remaining items could be numbers or characters. I'm trying to read this into memory, since it will be a lot faster to loop through it line by line than to read a line, process it, and then write. I am compiling on 64-bit Linux. Let me know if I need to clarify further. Again, thank you.
Edit 2
I am using a switch statement to process each line, where the first character of each line determines how to format the rest of the line. For example, 'M' means millisecond, and I put the next three numbers into a structure. Each line has a different first character, and I need to do something different for each one.
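For reference, here is a minimal hypothetical sketch of that kind of dispatch (the Record fields and prefix meanings are made up for illustration):

#include <sstream>
#include <string>

struct Record { long a = 0, b = 0, c = 0; };  // hypothetical layout

// Dispatch on the first character of the line, as described above.
void process_line(const std::string &line)
{
    if (line.empty())
        return;
    std::istringstream rest(line.substr(1));  // everything after the prefix
    switch (line[0])
    {
    case 'M': {                // e.g. "M359": a single value
        long ms = 0;
        rest >> ms;
        break;
    }
    case 'T': {                // e.g. "T359 3520 359": three numbers
        Record r;
        rest >> r.a >> r.b >> r.c;
        break;
    }
    default:                   // other prefixes get their own cases
        break;
    }
}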
So pardon the potentially blatantly obvious, but if you want to process this line by line, then...
#include <iostream>
#include <fstream>
#include <string>
using namespace std;
int main(int argc, char *argv[])
{
    // read lines one at a time
    ifstream inf("S050508-v3.txt");
    string line;
    while (getline(inf, line))
    {
        // ... process line ...
    }
    inf.close();
    return 0;
}
And just fill in the body of the while loop? Maybe I'm not seeing the real problem (a forest for the trees kinda thing).
EDIT
The OP is fine with using a custom streambuf, which may not necessarily be the most portable thing in the world, but he's more interested in avoiding flipping back and forth between input and output files. With enough RAM, this should do the trick.
#include <iostream>
#include <fstream>
#include <iterator>
#include <memory>
#include <string>
#include <cstdlib>
using namespace std;

struct membuf : public std::streambuf
{
    explicit membuf(size_t len)
        : streambuf()
        , len(len)
        , src(new char[len])
    {
        setg(src.get(), src.get(), src.get() + len);
    }
    // direct buffer access for the file load
    char *get() { return src.get(); }
    size_t size() const { return len; }
private:
    size_t len;
    // array form of unique_ptr, so delete[] (not delete) runs on destruction
    std::unique_ptr<char[]> src;
};

int main(int argc, char *argv[])
{
    // open the file in binary, retrieve its length by seeking to the end
    ifstream inf(argv[1], ios::in | ios::binary);
    inf.seekg(0, inf.end);
    size_t len = inf.tellg();
    inf.seekg(0, inf.beg);

    // allocate a stream buffer with an internal block
    // large enough to hold the entire file
    membuf mb(len + 1);

    // use our membuf buffer for our file read-op
    inf.read(mb.get(), len);
    mb.get()[len] = 0;

    // use iss for your nefarious purposes
    std::istream iss(&mb);
    std::string s;
    while (iss >> s)
        cout << s << endl;
    return EXIT_SUCCESS;
}
You should look into fgets and sscanf, with which you can pull out matched pieces of data so they are easier to manipulate, assuming that is what you want to do. Something like this could look like:
FILE *input = fopen("file.txt", "r");
FILE *output = fopen("out.txt", "w");
const int bufferSize = 64;
char buffer[bufferSize];
// fgets returns a char pointer: compare against NULL, not EOF
while (fgets(buffer, bufferSize, input) != NULL) {
    char data[16];
    sscanf(buffer, "regex", data);  // "regex" stands in for your real format string
    // manipulate data
    fprintf(output, "%s", data);
}
fclose(output);
fclose(input);
That would be more of the C way to do it; C++ handles things a little more elegantly by using an istream:
http://www.cplusplus.com/reference/istream/istream/
If I had to do this, I'd probably use code something like this:
std::ifstream in("S050508-v3.txt");
std::istringstream buffer;
buffer << in.rdbuf();
std::string data = buffer.str();
if (check_for_good_data(data))
std::cout << data;
This assumes you really need the entire contents of the input file in memory at once to determine whether it should be copied to the output. If (for example) you can look at the data one byte at a time, and determine whether each byte should be copied without looking at the others, you could do something more like:
std::ifstream in(...);
std::copy_if(std::istreambuf_iterator<char>(in),
             std::istreambuf_iterator<char>(),
             std::ostream_iterator<char>(std::cout, ""),
             is_good_char);
...where is_good_char is a function that returns a bool saying whether that char should be included in the output or not.
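For instance, a trivial (purely hypothetical) predicate that drops carriage returns would be:

bool is_good_char(char c)
{
    return c != '\r';  // example policy: keep everything except CR
}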
Edit: the size of files you're dealing with mostly rules out the first possibility I've given above. You're also correct that reading and writing large chunks of data will almost certainly improve speed over working on one line at a time.
Here is my current problem: I am trying to create a file of x MB in C++. The user will enter the file name, then enter a number between 5 and 10 for the size of the file they want created. Later on in this project I'm going to do other things with it, but I'm stuck on the first step of creating the darn thing.
My problem code (so far):
char empty[1024];
for (int i = 0; i < 1024; i++)
{
    empty[i] = 0;
}
fileSystem = fopen(argv[1], "w+");
for (int i = 0; i < 1024 * fileSize; i++) {
    int temp = fputs(empty, fileSystem);
    if (temp > -1) {
        // success!
    }
    else {
        cout << "error" << endl;
    }
}
Now, if I'm doing my math correctly, 1 char is 1 byte. There are 1024 bytes in 1 KB and 1024 KB in 1 MB. So if I wanted a 2 MB file, I'd have to write 1024*1024*2 bytes to this file. Yes?
I don't encounter any errors, but I end up with a file of 0 bytes... I'm not sure what I'm doing wrong here, so any help would be greatly appreciated!
Thanks!
Potentially sparse file
This creates output.img of size 300 MB:
#include <fstream>
int main()
{
    std::ofstream ofs("output.img", std::ios::binary | std::ios::out);
    ofs.seekp((300 << 20) - 1);
    ofs.write("", 1);
}
Note that technically, this will be a good way to trigger your filesystem's support for sparse files.
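You can verify the sparseness on most Linux filesystems: ls -l output.img reports the logical size (300 MB), while du -h output.img shows the blocks actually allocated, which stay near zero until real data is written.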
Dense file - filled with 0's
Functionally identical to the above, but filling the file with 0's:
#include <iostream>
#include <fstream>
#include <vector>
int main()
{
    std::vector<char> empty(1024, 0);
    std::ofstream ofs("output.img", std::ios::binary | std::ios::out);
    for (int i = 0; i < 1024 * 300; i++)
    {
        if (!ofs.write(&empty[0], empty.size()))
        {
            std::cerr << "problem writing to file" << std::endl;
            return 255;
        }
    }
}
Your code doesn't work because you are using fputs, which writes a null-terminated string to the output. But you are trying to write all nulls, so it stops as soon as it looks at the first byte of your string and ends up writing nothing.
Now, to create a file of a specific size, all you need to do is call the truncate function (or _chsize on Windows) exactly once and set what size you want the file to be.
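A minimal sketch of that approach on POSIX (truncate extends a shorter file with zero-filled bytes; on Windows, _chsize on an open descriptor plays the same role):

#include <cstdio>
#include <unistd.h>  // truncate()

int create_file_of_size(const char *path, off_t size)
{
    FILE *f = std::fopen(path, "a");  // make sure the file exists
    if (!f)
        return -1;
    std::fclose(f);
    return truncate(path, size);      // grow (zero-filled) or shrink to size
}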
Good luck!
To make a 2 MB file you have to seek to 2*1024*1024 - 1 and write a single byte, as shown above. fputs()ing an empty string will do no good no matter how many times you do it. And the string is effectively empty, because strings are 0-terminated.