I found a C++ source file which calculates expressions from a command line argument (argv[1]); however, I now want to change it to read from a file.
double Utvardering(char* s) {
srcPos = s;
searchToken();
return PlusMinus();
}
int main(int argc, char* argv[]) {
if (argc > 1) {
FILE* fFile = fopen(argv[1], "r");
double Value = Utvardering(fopen(argv[1], "r"));
cout << Value << endl;
}else{
cout << "Usage: " << argv[0] << " FILE" << endl;
}
cin.get();
return 0;
}
However, the Utvardering function requires a char* parameter. How can I convert the data read from a file opened with fopen to a char*?
The fopen function just opens a file. To get a string from it, you need to read the file. There are different ways to do this. If you know the maximum size of your string in advance, this would do:
const int MAX_SIZE = 1024;
char buf[MAX_SIZE];
if (!fgets(buf, MAX_SIZE, fFile)) {
cerr << "Read error";
exit(1);
}
double Value = Utvardering(buf);
Note: this method is typical for C, not for C++. If you want more idiomatic C++ code, you can use something like this (instead of FILE and fopen):
ifstream in;
in.open(argv[1]);
if (!in) { /* report an error */ }
string str;
in >> str; // note: >> reads only a single whitespace-delimited token
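If the whole file should end up in one string (for example, an expression containing spaces), one common idiom is to construct the string from stream iterators. This is a minimal sketch; the helper name `ReadWholeFile` is my own:

```cpp
#include <fstream>
#include <iterator>
#include <string>

// Read an entire file into a std::string. operator>> alone stops at the
// first whitespace, so an expression like "1 + 2*3" would be truncated.
std::string ReadWholeFile(const char* path) {
    std::ifstream in(path);
    return std::string((std::istreambuf_iterator<char>(in)),
                       std::istreambuf_iterator<char>());
}
```

The extra parentheses around the first iterator avoid the most-vexing-parse problem.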
Use the fread() function to read data from the FILE* into a buffer. Send that buffer into Utvardering().
I have no idea what "Utvardering" expects, or how it's using the information.
There are two possibilities -
1) Utvardering may be defined using char*, but expecting a FILE* (in effect, treating char* like void*). I've seen this before, even though it's pretty awful practice. In that case, just cast fFile to char* and pass it in.
2) Utvardering may be expecting a null terminated string (char*) as input. If you're using fopen like this, you can use fread to read the file contents into a buffer (char[]), and pass it to your function that takes a char*.
It looks like you need to write code to read the file into a character array and pass that to Utvardering.
Just passing the return value of fopen will cause the address of the opaque data structure pointed to by that pointer to be passed to Utvardering. Utvardering will happily treat those bytes as character data when they are not. Not good.
Good example of reading data from a file here:
http://www.cplusplus.com/reference/clibrary/cstdio/fread/
then pass the buffer to your function
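A sketch of that approach, reading with fread and null-terminating the buffer so it can be handed to a function that expects a C string (the helper name `ReadFileToCString` is mine):

```cpp
#include <cstdio>

// Read up to max_len - 1 bytes from the file into buf and null-terminate,
// so the result can be passed to a function expecting a char*.
// Returns the number of bytes read, or 0 on failure.
size_t ReadFileToCString(const char* path, char* buf, size_t max_len) {
    FILE* f = fopen(path, "rb");
    if (!f) return 0;
    size_t n = fread(buf, 1, max_len - 1, f);
    buf[n] = '\0';  // make it a valid C string
    fclose(f);
    return n;
}
```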
I have an assignment where I have to implement the Rijndael Algorithm for AES-128 Encryption. I have the algorithm operational, but I do not have proper file input/output.
The assignment requires us to use parameters passed in from the command line. In this case, the parameter will be the file path to the particular file the user wishes to encrypt.
My problem is, I am lost as to how to read in the bytes of a file and store these bytes inside an array for later encryption.
I have tried using ifstream and ofstream to open, read, write, and close the files and it works fine for plaintext files. However, I need the application to take ANY file as input.
When I tried my method of using fstream with a pdf as input, it would crash my program. So, I now need to learn how to take the bytes of a file, store them inside an unsigned char array for Encryption, and then store them inside another file. This process of encryption and storage of ciphertext needs to occur in 16 byte intervals.
The below implementation is my first attempt to read files in binary mode and then write whatever was read in another file also in binary mode.
The output is readable in a hex reader.
int main(int argc, char* argv[])
{
if (argc < 2)
{
cerr << "Use: " << argv[0] << " SOURCE_FILEPATH" << endl << "Ex. \"C\\Users\\Anthony\\Desktop\\test.txt\"\n";
return 1;
}
// Store the Command Line Parameter inside a string
// In this case, a filepath.
string src_fp = argv[1];
string dst_fp = src_fp.substr(0, src_fp.find('.', 0)) + ".enc";
// Open the filepaths in binary mode
ifstream srcF(src_fp, ios::in | ios::binary);
ofstream dstF(dst_fp, ios::out | ios::binary);
// Buffer to handle the input and output.
unsigned char fBuffer[16];
srcF.seekg(0, ios::beg);
while (!srcF.eof())
{
srcF >> fBuffer;
dstF << fBuffer << endl;
}
dstF.close();
srcF.close();
}
The code implementation does not work as intended.
Any direction on how to solve my dilemma would be greatly appreciated.
Like you, I really struggled to find a way to read a binary file into a byte array in C++ that would output the same hex values I see in a hex editor. After much trial and error, this seems to be the fastest way to do so without extra casts.
It would go faster without the counter, but then sometimes you end up with wide chars. To truly get one byte at a time I haven't found a better way.
By default it loads the entire file into memory, but only prints the first 1000 bytes.
string Filename = "BinaryFile.bin";
FILE* pFile = fopen(Filename.c_str(), "rb");
if (pFile != NULL)
{
    // Find the file size, then rewind to the beginning.
    fseek(pFile, 0L, SEEK_END);
    size_t size = ftell(pFile);
    fseek(pFile, 0L, SEEK_SET);
    uint8_t* ByteArray = new uint8_t[size];
    for (size_t counter = 0; counter < size; counter++)
    {
        ByteArray[counter] = fgetc(pFile);
    }
    fclose(pFile);
    // Print at most the first 1000 bytes.
    for (size_t i = 0; i < size && i < 1000; i++) {
        printf("%02X ", ByteArray[i]);
    }
    delete[] ByteArray;
}
I'm using very little memory on the stack, and I have no recursion, and all my memory access is on the stack. So why am I getting a segfault?
#include <iostream>
#include <string>
#include <stdio.h>
using namespace std;
int main(int argc, char *argv[]){
FILE *file = fopen("test.cpp", "r");
struct item{
char *type;
int price;
bool wanted;
};
item items[100]; char *temp;
if (file)
cout << "works up to here" << endl;
fscanf(file,
"%s, %[for sale wanted], %d",
items[0].type,
temp,
&items[0].price);
}
It prints out
works up to here
Segmentation fault (core dumped)
You are passing pointers to fscanf that are not initialized. You need to do something like this:
(if you are using C)
FILE* file = fopen(...);
char* str = malloc(N); /* N: a buffer size you choose */
fscanf(file, "%s", str);
printf("Read %s\n", str);
free(str);
fclose(file);
(if you are actually using C++)
std::ifstream file(...);
std::string str;
file >> str;
std::cout << "Read " << str << std::endl;
The scanf() functions won't allocate any memory. From the looks of it you are passing uninitialized pointer to fscanf() where the function expects arrays of sufficient size instead.
Most likely you'd use something like
items[0].type = new char[100];
char temp[20];
if (3 == fscanf(file, "%99s, %19[for sale wanted], %d",
                items[0].type,
                temp,
                &items[0].price)) {
// deal with a read item
}
else {
// deal with an input error
}
(I'm not sufficiently familiar with fscanf() to be confident about the middle format specifier).
Did you check if the file pointer is not null?
This is from fopen reference doc:
If the file is successfully opened, the function returns a pointer to
a FILE object that can be used to identify the stream on future
operations.
Otherwise, a null pointer is returned.
And as Kevin mentioned, this is more like C than C++
I see couple of problems.
You are not checking whether fopen() was successful.
You are trying to read into items[0].type, which has not been initialized to point to anything valid.
You will be better off using std::ifstream and std::string instead of using FILE* and char*.
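To illustrate that last point, here is a hedged sketch of parsing such a line with std::string and std::getline instead of fscanf. The names `Item` and `ParseItem` are mine, and the comma-separated "type, status, price" layout is assumed from the format string in the question:

```cpp
#include <sstream>
#include <string>

// Hypothetical record matching the item struct from the question,
// with std::string so no manual allocation is needed.
struct Item {
    std::string type;
    std::string status;   // e.g. "for sale" or "wanted"
    int price = 0;
};

// Parse one comma-separated line of the form "type, status, price".
// Fields keep any surrounding spaces; std::stoi skips leading whitespace.
// Returns false if the line does not have three fields.
bool ParseItem(const std::string& line, Item& out) {
    std::istringstream ss(line);
    std::string price_field;
    if (!std::getline(ss, out.type, ',')) return false;
    if (!std::getline(ss, out.status, ',')) return false;
    if (!std::getline(ss, price_field)) return false;
    out.price = std::stoi(price_field);
    return true;
}
```

Note that std::stoi throws if the last field is not a number, so real code would want to catch that or validate first.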
I am trying to get strings from cin and then write them to a binary file.
I've read that writing the string object directly won't work, so I tried to convert it to a char*.
The following code writes it OK (...probably), but only the first 8 chars, so the output in the file is incomplete.
std::string nick, ip, port;
std::cout << "IP: ";
std::cin >> ip;
std::cout << "port: ";
std::cin >> port;
ofstream file1("lastServers.bin", ios::out | ios::binary);
if (file1.good())
{
const char* p_IP = ip.c_str();
const char* p_PORT = port.c_str();
int length = sizeof(&p_IP)+sizeof(&p_PORT);
char* tmp1 = new char[length];
int index = 0;
memcpy((tmp1 + index), p_IP, sizeof(&p_IP));
index = index + sizeof(&p_IP);
memcpy((tmp1 + index), p_PORT, sizeof(&p_PORT));
file1.write(tmp1, length);
file1.close();
delete[] tmp1;
}
else
{
std::cout << "file error write" << endl;
}
Thanks in advance for any help :)
Your code can be written as
ofstream file1("lastServers.bin", ios::out | ios::binary);
if (file1.good()) {
file1.write(ip.c_str(), ip.size());
file1.write(port.c_str(), port.size());
file1.close();
}
else {
std::cout << "file error write" << endl;
}
string::c_str() returns a const pointer to the text in the string.
string::size() returns the number of characters in the string.
You don't need to concatenate the data before writing to the file, writing one then the other has the same result.
If you wanted to write C type code rather than C++, you can use strlen(p_IP) to get the length of the IP string rather than using sizeof.
The sizeof operator gives you the size of the class instance, i.e. the size of the object BUT the string object's size is never affected by the size of the string it manages.
In C++, objects that manage something (think strings managing characters, containers managing their contents, etc.) usually have a method to determine the size of what they're managing. For std::string and other STL containers that method is size().
Note that writing these strings in this format means you can't tell where one string ends and another one starts. Two options to consider are using a terminating character that you know won't appear in any strings, or writing the length of the string to the file before the text of the string itself. I won't elaborate here as it was not asked in the original question.
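One concrete way to make the strings recoverable, chosen here purely for illustration, is to prefix each string with a fixed-width length. A minimal sketch; the helper names and the 32-bit length format are my own assumptions:

```cpp
#include <cstdint>
#include <fstream>
#include <string>

// Write a 32-bit length followed by the string's characters.
void WriteString(std::ofstream& out, const std::string& s) {
    uint32_t len = static_cast<uint32_t>(s.size());
    out.write(reinterpret_cast<const char*>(&len), sizeof(len));
    out.write(s.data(), len);
}

// Read back a string written by WriteString.
std::string ReadString(std::ifstream& in) {
    uint32_t len = 0;
    in.read(reinterpret_cast<char*>(&len), sizeof(len));
    std::string s(len, '\0');
    in.read(&s[0], len);
    return s;
}
```

Note this format is not portable across machines with different endianness; for a local settings file like lastServers.bin that is usually acceptable.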
sizeof gives you the size of an object in memory, not the length of the string it points to. Specifically, sizeof(&p_IP) returns the size of a pointer, which is 4 bytes on a 32-bit system (8 on a 64-bit one). Your variable length simply does not compute to the correct value. To get the length of a char*, use strlen.
I'm trying to read character by character from a text file until EOF, put them into a character array, so that I can manipulate it after. Compiled with g++ without errors, and when run, I'm prompted for the input file but then it just hangs.
int main (int argc, char *argv[]) {
string filename;
ifstream infile;
char *cp, c[1024];
memset (c, 0, sizeof(c));
cp = c;
cout << "Enter file name: " << endl;
cin >> filename;
//open file
infile.open( filename.c_str() );
//if file can't open
if(!infile) {
cerr << "Error: file could not be opened" << endl;
exit(1);
}
while (!infile.eof()); {
infile.get(c, sizeof(infile));
// get character from file and store in array c[]
}
}//end main
You should try the istream::read() method rather than get(). This will help avoid overrunning the buffer:
unsigned int chars_read = 0;
//...
// Read in the file.
if (!infile.read(c, sizeof(c)))
{
// Handle the read error here.
// Also check for EOF here too.
}
// Obtain the number of characters actually read.
chars_read = infile.gcount();
First off, you don't want to test for eof()! Somehow I start to feel like Don Quixote having found my windmills. However, I do know that you need to check that the input was successful after trying to read it, because before attempting the read the stream can't know whether it will be successful.
Your program actually doesn't hang! It just waits for you to enter sizeof(infile) characters or end the input (e.g., using Ctrl-D on UNIXes and Ctrl-Z on Windows). Of course, this may look remarkably like a hanging program. You can verify that this is, indeed, the problem by using a smaller size, e.g., 4. Of course, sizeof(infile) is nearly as good as a small random number: it is the size of an object of type std::ifstream, and who can tell what that is? You probably meant to use sizeof(c) to make sure that the call to get(c, n) won't write more characters than can fit into c.
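Following that advice, a minimal sketch of a loop that checks the stream state after each read instead of testing eof() beforehand (the helper name `ReadByChar` is mine):

```cpp
#include <fstream>
#include <string>

// Read a file character by character, testing the stream state after
// each read. get(ch) returns the stream, which converts to false once
// a read fails, so the loop stops exactly at end of file.
std::string ReadByChar(const char* path) {
    std::ifstream in(path);
    std::string result;
    char ch;
    while (in.get(ch))
        result += ch;
    return result;
}
```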
Try this:
int cont = 0;
int ch;
while ((ch = infile.get()) != EOF && cont < (int)sizeof(c)) {
    c[cont++] = ch;
}
Note that get() returns an int, so compare against EOF before storing the value, and guard the index so you can't run past the end of c.
I really didn't find a satisfying answer on Google, and I/O in C++ is a little bit tricky. I would like to read a text file by blocks into a vector if possible. Alas, I couldn't figure out how. I am not even sure that my infinite loop will break in all cases, because I/O is tricky. So, the best way I was able to figure out is this:
char buffer[1025]; //let's say read by 1024 char block
buffer[1024] = '\0';
std::fstream fin("index.xml");
if (!fin) {
std::cerr << "Unable to open file";
} else {
while (true) {
fin.read(buffer, 1024);
std::cout << buffer;
if (fin.eof())
break;
}
}
Please note the second line with '\0'. Is it not odd? Can I do something better? Can I read the data into a vector instead of a char array? Is it appropriate to read into a vector directly?
Thanks for your answers.
PS. Reading by chunks does make sense here. The code is short because I am storing the data in a cyclic buffer.
You should be fine doing the following
vector<char> buffer (1024,0); // create vector of 1024 chars with value 0
fin.read(&buffer[0], buffer.size());
The elements in a vector are guaranteed to be stored contiguously, so this should work - but you should ensure that the vector is never empty. I asked a similar question here recently - check the answers to that for specific details from the standard Can I call functions that take an array/pointer argument using a std::vector instead?
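Putting that together with gcount() for the final short block, a hedged sketch of chunked reading into one vector (the helper name `ReadChunked` and the default chunk size are my own choices):

```cpp
#include <cstddef>
#include <fstream>
#include <vector>

// Read a file in fixed-size chunks into a vector of bytes. After a
// short final read, gcount() tells how many bytes actually arrived.
std::vector<char> ReadChunked(const char* path, std::size_t chunk = 1024) {
    std::ifstream in(path, std::ios::binary);
    std::vector<char> data;
    std::vector<char> buf(chunk);
    // read() returns false on the final, partial chunk, but gcount()
    // is still nonzero then, so that data is appended too.
    while (in.read(buf.data(), buf.size()) || in.gcount() > 0) {
        data.insert(data.end(), buf.begin(), buf.begin() + in.gcount());
    }
    return data;
}
```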
std::ifstream fin("index.xml");
std::stringstream buffer;
buffer << fin.rdbuf();
std::string result = buffer.str();
Exactly what you need.
Recently, I encountered the same problem. I used the read and gcount functions to solve it. It works well. Here is the code.
vector<string> ReadFileByBlocks(const char* filename)
{
vector<string> vecstr;
ifstream fin(filename, ios_base::in);
if (fin.is_open())
{
char* buffer = new char[1024];
while (fin.read(buffer, 1024))
{
    // A full block; buffer is not null-terminated, so pass the length.
    string s(buffer, 1024);
    vecstr.push_back(s);
}
// The final block may hold fewer than 1024 bytes;
// fin.gcount() gives the number of bytes actually read.
string s(buffer, fin.gcount());
vecstr.push_back(s);
delete[] buffer;
fin.close();
}
else
{
cerr << "Cannot open file:" << filename << endl;
}
return vecstr;
}