I am trying to write a program which will output an XOR-encrypted string to a file and then read that string back and decrypt it. To encrypt my string I have used a simple XOR encryption (thanks to Kyle W. Banks' site):
string encryptDecrypt(string toEncrypt)
{
char key = 'K'; //Any char will work
string output = toEncrypt;
for (int i = 0; i < toEncrypt.size(); i++)
output[i] = toEncrypt[i] ^ key;
return output;
}
Then in my program I use the following code to write and then read the string:
string encrypted = encryptDecrypt("Some text");
cout << "Encrypted:" << encrypted << "\n";
ofstream myFile("test.txt");
myFile << encrypted;
// Read all the txt file in binary mode to obtain the txt file in one string
streampos size;
char * memblock;
ifstream file ("test.txt", ios::in|ios::binary|ios::ate);
if (file.is_open())
{
size = file.tellg();
memblock = new char [size];
file.seekg (0, ios::beg);
file.read (memblock, size);
file.close();
}
//Convert the memblock into string and show the result of decrypted string
string result(memblock);
string decrypted = encryptDecrypt(result);
cout << "Decrypted:" << decrypted << "\n";
As a result I get:
Encrypted : ,<.c%.;%
Decrypted : Õ52E65AD0
Maybe saving the file causes some problem with the bytes written, so the program can't retrieve the same bytes when it reads the string back, but I'm not sure at all.
Best regards
Since you're not closing the output stream, there's a fair chance the data hasn't actually been written to the file yet, and your OS may not even let you open the file for reading.
You're decrypting regardless of whether the file was successfully read.
If it wasn't successfully read, you'll have undefined behaviour due to memblock not being initialised - most likely getting a result constructed from random garbage data.
Once you get that fixed, you need to zero-terminate memblock to make it a "proper" C-style string.
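Putting those fixes together, here is a minimal sketch reusing the names from the question (assuming the same using namespace std and the usual <iostream>/<fstream>/<string> includes):
string encrypted = encryptDecrypt("Some text");
ofstream myFile("test.txt", ios::binary);
myFile << encrypted;
myFile.close();                                // flush and close before reading the file back

ifstream file("test.txt", ios::in | ios::binary | ios::ate);
if (file.is_open())
{
    streampos size = file.tellg();
    char* memblock = new char[static_cast<size_t>(size) + 1];
    file.seekg(0, ios::beg);
    file.read(memblock, size);
    file.close();
    memblock[static_cast<size_t>(size)] = '\0'; // zero-terminate before building the string

    string result(memblock);
    string decrypted = encryptDecrypt(result);
    cout << "Decrypted:" << decrypted << "\n";
    delete[] memblock;
}
As the answers below point out, the terminator alone is still fragile when the XOR output itself contains '\0' bytes, so constructing the string with an explicit length is safer.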
Encryption with XOR like this is dangerous: if your plain text contains the letter 'K', the encrypted string will contain a '\0' at that position, and your string will be cut off there.
The same thing happens in the other direction when you read the encrypted string back: converting the memory block to a string will produce a shorter string, because std::string::string(const char*) stops reading at the first '\0'.
Apart from that, memblock isn't initialized when the file could not be opened, so put the decryption part inside the if (file.is_open()) clause.
As Zuppa said, it is dangerous to use it that way: the string may terminate unexpectedly because of an embedded '\0'.
You should instead calculate the length of the text you are dealing with, which can easily be done with stream.seekg(0, ios_base::end) followed by tellg(),
and then use the read and write functions to get the text from the file or write it back.
ifstream file ("test.txt", ios::in|ios::binary|ios::ate);
file.seekg(0,ios::end);
int length=file.tellg();//length of the text in the file
file.seekg(0);
char *memblock=new char[length];
file.read(memblock,length);
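With length known, the string can then be built with an explicit size so that any '\0' bytes produced by the XOR survive, continuing the snippet above:
string result(memblock, length);   // explicit length keeps embedded '\0' bytes
string decrypted = encryptDecrypt(result);
cout << "Decrypted:" << decrypted << "\n";
delete[] memblock;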
you may consult this Simple xor encryption
I have an assignment where I have to implement the Rijndael Algorithm for AES-128 Encryption. I have the algorithm operational, but I do not have proper file input/output.
The assignment requires us to use parameters passed in from the command line. In this case, the parameter will be the file path to the particular file the user wishes to encrypt.
My problem is, I am lost as to how to read in the bytes of a file and store these bytes inside an array for later encryption.
I have tried using ifstream and ofstream to open, read, write, and close the files and it works fine for plaintext files. However, I need the application to take ANY file as input.
When I tried my method of using fstream with a PDF as input, it would crash my program. So I now need to learn how to take the bytes of a file, store them inside an unsigned char array for encryption, and then store them inside another file. This process of encryption and storage of the ciphertext needs to occur in 16-byte intervals.
The below implementation is my first attempt to read files in binary mode and then write whatever was read in another file also in binary mode.
The output is readable in a hex reader.
int main(int argc, char* argv[])
{
if (argc < 2)
{
cerr << "Use: " << argv[0] << " SOURCE_FILEPATH" << endl << "Ex. \"C\\Users\\Anthony\\Desktop\\test.txt\"\n";
return 1;
}
// Store the Command Line Parameter inside a string
// In this case, a filepath.
string src_fp = argv[1];
string dst_fp = src_fp.substr(0, src_fp.find('.', 0)) + ".enc";
// Open the filepaths in binary mode
ifstream srcF(src_fp, ios::in | ios::binary);
ofstream dstF(dst_fp, ios::out | ios::binary);
// Buffer to handle the input and output.
unsigned char fBuffer[16];
srcF.seekg(0, ios::beg);
while (!srcF.eof())
{
srcF >> fBuffer;
dstF << fBuffer << endl;
}
dstF.close();
srcF.close();
}
The code implementation does not work as intended.
Any direction on how to solve my dilemma would be greatly appreciated.
Like you, I really struggled to find a way to read a binary file into a byte array in C++ that would output the same hex values I see in a hex editor. After much trial and error, this seems to be the fastest way to do so without extra casts.
It would go faster without the counter, but then sometimes you end up with wide chars. To truly get one byte at a time I haven't found a better way.
By default it loads the entire file into memory, but only prints the first 1000 bytes.
string Filename = "BinaryFile.bin";
FILE* pFile = fopen(Filename.c_str(), "rb");
uint8_t* ByteArray = nullptr;
size_t size = 0;
if (pFile != NULL)
{
    // Determine the file size, then rewind to the beginning
    fseek(pFile, 0L, SEEK_END);
    size = ftell(pFile);
    fseek(pFile, 0L, SEEK_SET);
    ByteArray = new uint8_t[size];
    // Read one byte at a time; note the strict < to stay inside the array
    size_t counter = 0;
    while (counter < size) {
        ByteArray[counter] = (uint8_t)fgetc(pFile);
        counter++;
    }
    fclose(pFile);
}
// Print at most the first 1000 bytes as hex
for (size_t i = 0; i < size && i < 1000; i++) {
    printf("%02X ", ByteArray[i]);
}
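If the 16-byte blocks from the original question are what you need, one possible sketch with ifstream::read and gcount() is below; the file names and the aes128EncryptBlock call are placeholders, not part of the answer above:
#include <cstring>
#include <fstream>

int main()
{
    std::ifstream srcF("test.txt", std::ios::in | std::ios::binary);
    std::ofstream dstF("test.enc", std::ios::out | std::ios::binary);
    unsigned char fBuffer[16];

    // Read the source in 16-byte blocks; the last block may be short
    while (srcF.read(reinterpret_cast<char*>(fBuffer), sizeof(fBuffer)) || srcF.gcount() > 0)
    {
        std::streamsize got = srcF.gcount();
        if (got < 16)
            std::memset(fBuffer + got, 0, static_cast<std::size_t>(16 - got));  // naive zero padding
        // aes128EncryptBlock(fBuffer);   // hypothetical: the assignment's AES-128 step goes here
        dstF.write(reinterpret_cast<char*>(fBuffer), 16);
    }
    return 0;
}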
I'm trying to make a program which will read a file, change a specified word into '#' symbols, and write it back to the same file. But I'm having problems with that.
1st question:
It seems like I need to store the file in a buffer before writing it back to the file. How should I do that?
2nd question:
I can't understand why the loop in this code never ends. There are about 200 words in that file, but I always get a memory exception and i reaches 10075.
int main(int argc, char* argv[]){
char** temp = new char*[10000];
int i = 0;
fstream fTemp("D:\doc.txt", ios_base::in);
while (!fTemp.eof()){
temp[i] = new char[50];
fTemp >> temp[i];
temp[i][1] = '#';
cout << temp[i] << endl;
i++;
}
fTemp.open("D:\doc.txt", ios_base::trunc);
for (int i = 0; i < sizeof(*temp); i++){
fTemp << temp[i];
}
_getch();
}
First, you should use getline as your usage of eof is incorrect (eof bit is set only after failed read).
Next, store the strings in a std::vector<string>. This will allow you to not care about memory management (current one is leaking) and provide a more flexible solution.
std::string buffer;
std::vector<string> data;
while(std::getline(fTemp,buffer)) {
data.push_back(buffer);
}
The problem you have is probably the incorrect eof() call, but you should check your cout output to pin down the problem with this code.
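Putting it together, a rough sketch of the whole read/replace/write cycle along those lines (the word to replace, "secret", is only a placeholder, and note the escaped backslashes in the path):
#include <fstream>
#include <string>
#include <vector>

int main() {
    std::vector<std::string> data;
    std::string buffer;

    // Read the whole file line by line
    std::ifstream fIn("D:\\doc.txt");
    while (std::getline(fIn, buffer)) {
        data.push_back(buffer);
    }
    fIn.close();

    // Replace every occurrence of the placeholder word with '#' characters
    const std::string target = "secret";
    for (std::string& line : data) {
        for (std::string::size_type pos = line.find(target); pos != std::string::npos;
             pos = line.find(target, pos + target.size())) {
            line.replace(pos, target.size(), std::string(target.size(), '#'));
        }
    }

    // Write everything back with a separate output stream
    std::ofstream fOut("D:\\doc.txt", std::ios_base::out | std::ios_base::trunc);
    for (const std::string& line : data) {
        fOut << line << '\n';
    }
    return 0;
}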
To store the data of the file in a buffer, you can get the size of the file and use the read function to get all of the file data. See this code:
// Load file in a buffer
ifstream fl(FileName);
fl.seekg(0, ios::end);
size_t len = fl.tellg();
char* fdata = new char[len];
fl.seekg(0, ios::beg);
fl.read(fdata, len);
fl.close();
In your case, the same fstream that you used to open the file is being used to write without closing the file before reopening it.
Your loop never ends because temp is a raw pointer and its size isn't managed; the better way is to get the size of the file while it is open, which in this case is the size_t len.
To rewrite the file you can create another stream, see this code:
// Write File
ofstream flOut(FileName, ios_base::trunc);
flOut.write(fdata, len);
flOut.close();
Between these two pieces of code you can change the data in fdata, but what exactly do you want to do? Replace some word with the '#' symbol? Which word?
I'm fairly new to C++ and am trying to read and write a binary file. I have used the read and write functions to read text from one file and output it to a new file. However, the following characters always appear at the end of the created text file: "ÌÌ". Is a particular character indicating the end of file being saved in the character buffer?
int main(){
ifstream myfile("example.txt", ios::ate);
ofstream outfile("new.txt");
ifstream::pos_type size;
char buf [1024];
if(myfile.is_open()){
size=myfile.tellg();
cout<<"The file's size is "<<(int) size<<endl;
myfile.seekg(0,ios::beg);
while(!myfile.eof()){
myfile.read(buf, sizeof(buf));
}
outfile.write(buf,size);
}
else
cout<<"Error"<<endl;
myfile.close();
outfile.close();
cin.get();
return 0;
}
This is not the only problem with your code (try it on a file bigger than 1024 bytes), but since you are doing binary I/O you need:
ifstream myfile("example.txt", ios::ate|ios::binary);
ofstream outfile("new.txt", ios::binary);
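And for files bigger than the 1024-byte buffer, the copy loop itself can be rewritten around read/gcount/write so it doesn't rely on eof() and only writes the bytes actually read, a sketch using the streams declared above:
myfile.seekg(0, ios::beg);                 // rewind, since the stream was opened with ios::ate
char buf[1024];
while (myfile.read(buf, sizeof(buf)) || myfile.gcount() > 0) {
    outfile.write(buf, myfile.gcount());   // write only what was actually read this pass
}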
I'm currently trying to read the contents of a file into a char array.
For instance, the file I want to read into a char array contains the following text, 42 bytes in total:
{
type: "Backup",
name: "BackupJob"
}
This file is created in Windows, and I'm using Visual Studio C++, so there are no OS compatibility issues.
However, executing the following code, at the completion of the loop I get Index: 39, with no 13 displayed prior to the 10s.
// Create the file stream and open the file for reading
ifstream fs;
fs.open("task.txt", ifstream::in);
int index = 0;
int ch = fs.get();
while (fs.good()) {
cout << ch << endl;
ch = fs.get();
index++;
}
cout << "----------------------------";
cout << "Index: " << index << endl;
return;
However, when attempting to create a char array the length of the file, reading the file size as per the code below includes the 3 additional CR characters in the total, so length equals 42, which ends up corrupting the end of the array with dodgy bytes.
// Create the file stream and open the file for reading
ifstream fs;
fs.seekg(0, std::ios::end);
length = fs.tellg();
fs.seekg(0, std::ios::beg);
// Create the buffer to read the file
char* buffer = new char[length];
fs.read(buffer, length);
buffer[length] = '\0';
// Close the stream
fs.close();
Using a hex viewer, I have confirmed that file does indeed contain the CRLF (13 10) bytes in the file.
There seems to be a disparity between the file size obtained by seeking to the end of the file and what the get() and read() methods actually return.
Could anyone please help with this?
Cheers,
Justin
You should open your file in binary mode. This will stop read dropping CR.
fs.open("task.txt", ifstream::in|ifstream::binary);
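From there, the rest of the second snippet from the question works as expected once the buffer gets one extra byte for the terminator, roughly:
fs.seekg(0, std::ios::end);
std::streamsize length = fs.tellg();
fs.seekg(0, std::ios::beg);

char* buffer = new char[length + 1];   // +1 so the terminator doesn't write past the end
fs.read(buffer, length);
buffer[length] = '\0';
fs.close();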
I'm trying to read a file and output the contents. Everything works fine and I can see the contents, but it seems to add about 14 empty bytes at the end. Does anyone know what's wrong with this code?
int length;
char * html;
ifstream is;
is.open ("index.html");
is.seekg (0, ios::end);
length = is.tellg();
is.seekg (0, ios::beg);
html = new char [length];
is.read(html, length);
is.close();
cout << html;
delete[] html;
You didn't put a null terminator on your char array. It's not ifstream reading too much, cout just doesn't know when to stop printing without the null terminator.
If you want to read an entire file, this is much easier:
std::ostringstream oss;
ifstream fin("index.html");
oss << fin.rdbuf();
std::string html = oss.str();
std::cout << html;
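For what it's worth, the same whole-file read can also be done directly with stream iterators instead of the extra ostringstream; this needs <iterator> and is just an alternative sketch:
ifstream fin("index.html");
// the extra parentheses around the first argument avoid the "most vexing parse"
std::string html((std::istreambuf_iterator<char>(fin)),
                 std::istreambuf_iterator<char>());
std::cout << html;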
That is because html is not a null-terminated string, and std::cout keeps printing characters until it finds a '\0', or it may even crash your program.
Do this:
html = new char [length +1 ];
is.read(html, length);
html[length] = '\0'; // put null at the end
is.close();
cout << html;
Or, you can do this:
cout.write(html, length);
cout.write will stop printing after exactly length characters.