I am trying to write a string directly to the console buffer in C++ using WriteConsoleOutputCharacter. I read the contents of a file which contains multiple \n characters, but when I write to the console buffer it doesn't continue the rest of the file on the next line; instead it writes ??. Notepad says my file's encoding is UTF-8. I wondered if that had anything to do with it, but when I print the file contents with cout, the output breaks to a new line wherever it encounters \n.
std::ifstream file("debug.txt");
static HANDLE console = GetStdHandle(STD_OUTPUT_HANDLE);
DWORD ir = 0;
if (file.is_open()) {
    // Get the file length, then read the whole file into a null-terminated buffer.
    file.seekg(0, file.end);
    int len = file.tellg();
    std::cout << len;
    file.seekg(0, file.beg);
    char* string = (char*)calloc(len + 1, sizeof(char));
    string[len] = '\0';
    file.read(string, len);
    file.close();

    // Write the buffer starting at the top-left cell of the console buffer.
    COORD coord = { 0, 0 };
    WriteConsoleOutputCharacterA(
        console,
        string,
        len,
        coord,
        &ir
    );
    free(string);
}
This is how I read from the file and write to the console buffer. Any thoughts?
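For what it's worth, the same read-then-write flow can also be written with std::string owning the buffer instead of calloc; a minimal sketch, assuming the same debug.txt and the same console handle setup:

// Minimal sketch (assumption): same debug.txt; the buffer is owned by std::string.
#include <windows.h>
#include <fstream>
#include <sstream>
#include <string>

int main() {
    std::ifstream file("debug.txt", std::ios::binary);
    if (!file.is_open()) return 1;

    std::ostringstream ss;
    ss << file.rdbuf();                       // slurp the whole file
    std::string contents = ss.str();

    HANDLE console = GetStdHandle(STD_OUTPUT_HANDLE);
    COORD coord = { 0, 0 };
    DWORD written = 0;
    WriteConsoleOutputCharacterA(console, contents.c_str(),
                                 static_cast<DWORD>(contents.size()),
                                 coord, &written);
    return 0;
}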
I have an assignment where I have to implement the Rijndael Algorithm for AES-128 Encryption. I have the algorithm operational, but I do not have proper file input/output.
The assignment requires us to use parameters passed in from the command line. In this case, the parameter will be the file path to the particular file the user wishes to encrypt.
My problem is, I am lost as to how to read in the bytes of a file and store these bytes inside an array for later encryption.
I have tried using ifstream and ofstream to open, read, write, and close the files and it works fine for plaintext files. However, I need the application to take ANY file as input.
When I tried my method of using fstream with a pdf as input, it would crash my program. So, I now need to learn how to take the bytes of a file, store them inside an unsigned char array for Encryption, and then store them inside another file. This process of encryption and storage of ciphertext needs to occur in 16 byte intervals.
The implementation below is my first attempt to read a file in binary mode and then write whatever was read to another file, also in binary mode.
The output is readable in a hex reader.
int main(int argc, char* argv[])
{
    if (argc < 2)
    {
        cerr << "Use: " << argv[0] << " SOURCE_FILEPATH" << endl
             << "Ex. \"C:\\Users\\Anthony\\Desktop\\test.txt\"\n";
        return 1;
    }

    // Store the command line parameter inside a string.
    // In this case, a filepath.
    string src_fp = argv[1];
    string dst_fp = src_fp.substr(0, src_fp.find('.', 0)) + ".enc";

    // Open the filepaths in binary mode.
    ifstream srcF(src_fp, ios::in | ios::binary);
    ofstream dstF(dst_fp, ios::out | ios::binary);

    // Buffer to handle the input and output.
    unsigned char fBuffer[16];

    srcF.seekg(0, ios::beg);
    while (!srcF.eof())
    {
        srcF >> fBuffer;
        dstF << fBuffer << endl;
    }

    dstF.close();
    srcF.close();
}
The code implementation does not work as intended.
Any direction on how to solve my dilemma would be greatly appreciated.
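One common way to read arbitrary bytes in fixed 16-byte blocks is std::ifstream::read combined with gcount(); a minimal sketch, assuming the same src_fp/dst_fp paths as above and a hypothetical placeholder for the encryption step:

// Minimal sketch (assumption): copies src_fp to dst_fp in 16-byte blocks;
// encrypt_block() is a hypothetical stand-in for the Rijndael step.
#include <fstream>
#include <string>

void process_in_blocks(const std::string& src_fp, const std::string& dst_fp) {
    std::ifstream srcF(src_fp, std::ios::binary);
    std::ofstream dstF(dst_fp, std::ios::binary);

    unsigned char fBuffer[16];
    for (;;) {
        srcF.read(reinterpret_cast<char*>(fBuffer), sizeof fBuffer);
        std::streamsize got = srcF.gcount();       // bytes actually read (< 16 on the last block)
        if (got == 0) break;                       // nothing left to read
        // encrypt_block(fBuffer, got);            // hypothetical: pad and encrypt the block here
        dstF.write(reinterpret_cast<const char*>(fBuffer), got);
    }
}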
Like you, I really struggled to find a way to read a binary file into a byte array in C++ that would output the same hex values I see in a hex editor. After much trial and error, this seems to be the fastest way to do so without extra casts.
It would go faster without the counter, but then you sometimes end up with wide chars. To truly get one byte at a time, I haven't found a better way.
It loads the entire file into memory, but prints only the first 800 bytes.
string Filename = "BinaryFile.bin";
FILE* pFile = fopen(Filename.c_str(), "rb");
uint8_t* ByteArray = NULL;
size_t size = 0;

if (pFile != NULL)          // check the handle before seeking or reading
{
    // Determine the file size, then rewind to the start.
    fseek(pFile, 0L, SEEK_END);
    size = ftell(pFile);
    fseek(pFile, 0L, SEEK_SET);

    ByteArray = new uint8_t[size];

    // Read one byte at a time into the array.
    size_t counter = 0;
    while (counter < size)  // '<', not '<=': '<=' would write one byte past the end
    {
        ByteArray[counter] = fgetc(pFile);
        counter++;
    }
    fclose(pFile);
}

// Print the first 800 bytes as hex.
for (size_t i = 0; i < 800 && i < size; i++) {
    printf("%02X ", ByteArray[i]);
}
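For comparison, a single bulk std::fread into a std::vector does the same job in one call; a minimal sketch under the same assumptions (same BinaryFile.bin, first 800 bytes printed as hex):

// Alternative sketch (assumption): same BinaryFile.bin, read with one fread call.
#include <cstdio>
#include <cstdint>
#include <vector>

int main() {
    std::FILE* pFile = std::fopen("BinaryFile.bin", "rb");
    if (pFile == NULL) return 1;

    std::fseek(pFile, 0L, SEEK_END);
    long size = std::ftell(pFile);
    std::fseek(pFile, 0L, SEEK_SET);

    std::vector<std::uint8_t> ByteArray(size);
    std::fread(ByteArray.data(), 1, ByteArray.size(), pFile);   // one bulk read
    std::fclose(pFile);

    for (size_t i = 0; i < 800 && i < ByteArray.size(); i++)
        std::printf("%02X ", ByteArray[i]);
    return 0;
}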
Why does the find function not work with an exe? I'm trying to find a wstring in an exe, but it never matches. However, if I create a txt file and copy the exe's binary contents into it, find can locate the string.
std::wifstream file(L"D:/file.exe", std::ios::binary);
if (file.is_open())
{
    file.seekg(0, file.end);
    std::streamoff length = file.tellg();
    file.seekg(0, file.beg);

    wchar_t *buffer = new wchar_t[length];
    file.read(buffer, length);

    std::wstring sFile;
    sFile = buffer;

    size_t index = sFile.find(L"Something");
    if (index != std::string::npos) std::cout << "It's found";

    file.close();
    delete[] buffer;
}
else
{
    std::cout << "It's not open";
}
The executable probably has a number of 0 bytes (i.e. 0x00) early on in the file. When you do sFile = buffer; it assumes that buffer is a C-style string that ends in a 0 byte. So sFile will only contain the bytes up to that point.
To fix it you should put the whole buffer into the string:
std::wstring sFile(buffer, length); // Directly using the constructor, or
sFile.assign(buffer, length); // after construction
Just change
std::wstring sFile;
sFile = buffer;
to
std::wstring sFile(buffer, buffer+length);
When you assign a raw character buffer to a wstring object, the length of the string is determined by the first null character. So the first 0x00 value contained in your file marks the end of the string.
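A tiny self-contained demonstration of that difference (the raw buffer and its embedded L'\0' here are made up for illustration):

// Demo (assumption): a raw buffer containing an embedded L'\0' before the text we search for.
#include <iostream>
#include <string>

int main() {
    const wchar_t raw[] = { L'M', L'Z', L'\0', L'S', L'o', L'm', L'e' };
    const size_t length = sizeof raw / sizeof raw[0];

    std::wstring truncated = raw;              // assignment stops at the first L'\0' -> size() == 2
    std::wstring full(raw, raw + length);      // range constructor keeps everything -> size() == 7

    std::wcout << truncated.size() << L" vs " << full.size() << L'\n';
    std::wcout << (full.find(L"Some") != std::wstring::npos ? L"found" : L"not found") << L'\n';
    return 0;
}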
Using fstreams, I'm attempting to read single characters from a specified location in a file and append them onto a string. For some reason, reading in these characters returns special characters. I've tried numerous things, but the most curious thing I found while debugging was that changing the initial value of char temp causes the whole string to change to that value.
int Class::numbers(int number, string& buffer) {
    char temp;
    if (number < 0 || buffer.length() > size) {
        exit(0);
    }
    string fname = name + ".txt";
    int start = number * size;
    ifstream readin(fname.c_str());
    readin.open(fname.c_str(), ios::in);
    readin.seekg(start);
    for (int i = 0; i < size; ++i) {
        readin.get(temp);
        buffer += temp;
    }
    cout << buffer << endl;
    readin.close();
    return 0;
}
Here is an example screenshot of the special characters being outputted: http://i.imgur.com/6HCI7TT.png
Could the issue be where I'm starting with seekg? It seems to start in the appropriate position. Another thing I've considered is that maybe I'm reading from some invalid position in the stream and it's just giving me junk characters from memory.
Any thoughts?
WORKING SOLUTION:
int Class::numbers(int number, string& buffer) {
    char temp;
    if (number < 0 || buffer.length() > size) {
        exit(0);
    }
    string fname = name + ".txt";
    int start = number * size;
    ifstream readin(fname.c_str());   // open once; the redundant open() call is gone
    readin.seekg(start);
    for (int i = 0; i < size; ++i) {
        readin.get(temp);
        buffer += temp;
    }
    cout << buffer << endl;
    readin.close();
    return 0;
}
Here is the working solution. In my program I already had this file open, so opening it twice was likely the cause of the issue, I suppose. I will do some further testing on this in my own time.
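A quick way to catch this class of problem early is to check the stream state before reading; a minimal sketch, where fname, start, and size stand in for the class members used above:

// Minimal sketch (assumption): fname, start, and size replace the class members used above.
#include <fstream>
#include <iostream>
#include <string>

int read_block(const std::string& fname, int start, int size, std::string& buffer) {
    std::ifstream readin(fname.c_str());       // open once, via the constructor
    if (!readin) {                             // catches a failed open or any earlier stream error
        std::cerr << "could not open " << fname << '\n';
        return 1;
    }
    readin.seekg(start);
    char temp;
    for (int i = 0; i < size && readin.get(temp); ++i) {
        buffer += temp;                        // only append characters that were actually read
    }
    return 0;
}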
For character values greater than 127 (beyond the 7-bit ASCII range), the actual character rendered on screen depends on the code page of the system you are currently using.
What is likely happening is that you are not getting a single "character" as you think you are.
First, to debug this, use your existing code to just open and print out an entire text file. Is your program capable of doing this? If not, it's likely that the "text" file you are opening isn't using ASCII, but possibly UTF or some other form of encoding. That means when you read a "character" (8-bits most likely), you're just reading half of a 16-bit "wide character", and the result is meaningless to you.
For example, the gedit application will automatically render "Hello World" on screen as I'd expect, regardless of character encoding. However, in a hex editor, a UTF8 encoded file looks like:
UTF8 Raw text:
0000000: 4865 6c6c 6f20 776f 726c 642e 0a Hello world..
While UTF16 looks like:
0000000: fffe 4800 6500 6c00 6c00 6f00 2000 7700 ..H.e.l.l.o. .w.
0000010: 6f00 7200 6c00 6400 2e00 0a00 o.r.l.d.....
This is what your program sees. C and C++ assume a narrow, byte-oriented encoding by default. If you want to handle other encodings, it's up to your program to accommodate them manually or by using a third-party library.
Also, you aren't testing to see if you've exceeded the length of the file. You could just be grabbing random garbage.
Using a simple text file just containing the string "Hello World", can your program do this:
Code Listing
// read a file into memory
#include <iostream>   // std::cout
#include <fstream>    // std::ifstream
#include <string.h>   // memset

int main () {
    std::ifstream is ("test.txt", std::ifstream::binary);
    if (is) {
        // get length of file:
        is.seekg (0, is.end);
        int length = is.tellg();
        is.seekg (0, is.beg);

        // allocate memory:
        char * buffer = new char [length];

        // read data as a block:
        is.read (buffer, length);

        // print content:
        std::cout.write (buffer, length);
        std::cout << std::endl;

        // repeat at arbitrary locations:
        for (int i = 0; i < length; i++)
        {
            memset(buffer, 0x00, length);
            is.seekg (i, is.beg);
            is.read (buffer, length - i);

            // print only what was read, not the zero padding:
            std::cout.write (buffer, length - i);
            std::cout << std::endl;
        }

        is.close();
        delete[] buffer;
    }
    return 0;
}
Sample Output
Hello World
Hello World
ello World
llo World
lo World
o World
World
World
orld
rld
ld
d
I'm currently trying to read the contents of a file into a char array.
For instance, the file contains the following text, 42 bytes in total:
{
type: "Backup",
name: "BackupJob"
}
This file is created on Windows, and I'm using Visual Studio C++, so there are no OS compatibility issues.
However, executing the following code, at the completion of the loop I get Index: 39, with no 13 (CR) displayed before any of the 10s (LF).
// Create the file stream and open the file for reading
ifstream fs;
fs.open("task.txt", ifstream::in);

int index = 0;
int ch = fs.get();
while (fs.good()) {
    cout << ch << endl;
    ch = fs.get();
    index++;
}

cout << "----------------------------";
cout << "Index: " << index << endl;
return;
However, when attempting to create a char array the length of the file, reading the file size as below counts the 3 CR characters towards the total, so length equals 42, which ends up filling the end of the array with dodgy bytes.
// Create the file stream and open the file for reading (text mode, as before)
ifstream fs;
fs.open("task.txt", ifstream::in);

fs.seekg(0, std::ios::end);
length = fs.tellg();
fs.seekg(0, std::ios::beg);

// Create the buffer to read the file (one extra byte for the terminator)
char* buffer = new char[length + 1];
fs.read(buffer, length);
buffer[length] = '\0';

// Close the stream
fs.close();
Using a hex viewer, I have confirmed that file does indeed contain the CRLF (13 10) bytes in the file.
There seems to be a disparity between the file size reported by seeking to the end and what the get() and read() methods actually return.
Could anyone please help with this?
Cheers,
Justin
You should open your file in binary mode. This will stop read dropping CR.
fs.open("task.txt", ifstream::in|ifstream::binary);
I'm trying to read a file and output the contents. Everything works fine and I can see the contents, but it seems to add about 14 empty bytes at the end. Does anyone know what's wrong with this code?
int length;
char * html;
ifstream is;
is.open ("index.html");
is.seekg (0, ios::end);
length = is.tellg();
is.seekg (0, ios::beg);
html = new char [length];
is.read(html, length);
is.close();
cout << html;
delete[] html;
You didn't put a null terminator on your char array. It's not ifstream reading too much; cout just doesn't know when to stop printing without a null terminator.
If you want to read an entire file, this is much easier:
std::ostringstream oss;
std::ifstream fin("index.html");
oss << fin.rdbuf();
std::string html = oss.str();
std::cout << html;
That is because html is not a null-terminated string, and std::cout keeps printing characters until it finds a '\0', which may even crash your program.
Do this:
html = new char [length +1 ];
is.read(html, length);
html[length] = '\0'; // put null at the end
is.close();
cout << html;
Or, you can do this:
cout.write(html, length);
cout.write will stop printing after exactly length characters.
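Putting the pieces together, a minimal sketch that avoids the terminator entirely by printing exactly length bytes (assuming the same index.html):

// Minimal sketch (assumption): same index.html, printed with cout.write so no
// null terminator is needed.
#include <fstream>
#include <iostream>
#include <vector>

int main() {
    std::ifstream is("index.html", std::ios::binary);
    if (!is) return 1;

    is.seekg(0, std::ios::end);
    std::streamsize length = is.tellg();
    is.seekg(0, std::ios::beg);

    std::vector<char> html(length);
    is.read(html.data(), length);

    std::cout.write(html.data(), length);   // prints exactly length bytes, then stops
    return 0;
}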