Can't load file using fopen() - c++

I'm creating a program that takes a file and encrypts it, but I've run into a problem opening the file to read: fopen() always returns 0.
void run()
{
    char buffer[260] = { '\0' };
    GetWindowTextA(Path, buffer, 260);
    encryptFile(buffer, "C:\\Users\\DownD\\Desktop\\Some.dat");
}
I think the problem is somewhere in this run() function, because when I replace the buffer array with a string literal, for example "C:\\Somefile.exe", turning the call into:
encryptFile("C:\\Somefile.exe", "C:\\Users\\DownD\\Desktop\\Some.dat");
it reads the file nice and clean.
Here are the relevant parts of the rest of the project.
int CCrypter::encryptFile(char* filePath, LPCSTR outFile)
{
    unsigned char* data = NULL;
    int cypherSize;
    int fSize = readFile(data, filePath);
    if (!fSize)
        return 2;
    unsigned char *ciphertext = new unsigned char[fSize];
    cypherSize = encrypt(data, fSize, ciphertext);
    if (!cypherSize)
        return 3;
    if (!Create_File(ciphertext, cypherSize, outFile))
        return 4;
    return 1;
}
int CCrypter::readFile(unsigned char *&buffer, const char* path)
{
    int lenght = 0;
    OutputDebugString(path);
    FILE* input = fopen(path, "rb");
    if (!input) // input is always 0
        return 0;
    fseek(input, 0, SEEK_END);
    lenght = ftell(input);
    buffer = new unsigned char[lenght];
    printf("%d", buffer);
    ZeroMemory(buffer, lenght);
    rewind(input);
    if (!fread(buffer, 1, lenght, input))
        return 0;
    fclose(input);
    return lenght;
}
Just to clarify, I'm using the Multi-Byte Character Set.

I solved the issue. The problem was that I had opened the file earlier and never closed it, which is why I was receiving permission denied.
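For anyone hitting the same symptom: when fopen() returns NULL, the C runtime sets errno, so printing it usually tells you immediately whether the cause is a sharing violation, a missing file, or a malformed path. A minimal sketch (the path is just a placeholder):

#include <cstdio>
#include <cerrno>
#include <cstring>

int main()
{
    FILE* input = fopen("C:\\some\\file.dat", "rb");
    if (!input)
    {
        // perror prints a human-readable reason, e.g. "Permission denied"
        perror("fopen failed");
        printf("errno = %d (%s)\n", errno, strerror(errno));
        return 1;
    }
    fclose(input);
    return 0;
}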

Related

EOF sign in the middle of a textfile [duplicate]

I am writing a XOR encryption program which works fine during encryption, but during decryption the line
char ca2 = fgetc(f);
gets stuck at one point and no decryption takes place after that. My best guess about the problem is that (the encrypted file contains all sorts of characters) as soon as fgetc reaches an EOF mark, which can be present before the actual end of the file, it gets stuck there and stops reading the next characters.
Is this some kind of limitation of getc()? Here is my rubbish code:
int get_file_size(char filename[])
{
    FILE *p_file = NULL;
    p_file = fopen(filename, "rb");
    fseek(p_file, 0, SEEK_END);
    int size = ftell(p_file);
    fclose(p_file);
    return size;
}

int endec(char filename[], char psdw[])
{
    FILE *f;
    int hashed = 0, ed = 0;
    int inphash = inhash(psdw);
    inphash = inphash % 50;
    f = fopen(filename, "r");
    if (f == NULL)
        printf("failed");
    char temps[999999];
    long int crs = 0, j = 0;
    int filesz = get_file_size(filename);
    printf("file size = %d\n\n", filesz);
    while (1) {
        inphash = inphash + 2;
        char ca = (char)inphash;
        char ca2 = fgetc(f);
        printf("%c\n", ca2);
        if (crs >= filesz)
            break;
        temps[crs] = ca2 ^ ca;
        crs++;
    }
    fclose(f);
    printf("%d", strlen(temps));
    FILE *fp;
    fp = fopen(filename, "wt");
    for (j = 0; j < crs; j++) {
        putc(temps[j], fp);
        printf("%c", temps[j]);
    }
    fclose(fp);
}
Your problem is right here:
f=fopen(filename,"r");
You open the file for text reading, not for binary. Your file size function gets it right, but your decoder function does not.
The idiomatic way to read a file character by character using the C-style IO routines is like this:
f = fopen(filename, "rb");
if (!f)
    // handle error

int c; // NOTE: int, not char!
while ((c = fgetc(f)) != EOF)
{
    // do something with 'c'
}
This idiom does not require you to get the file size as a separate operation. You can rewrite your XOR "encryption" routine with a simple loop of the above form. It will be much clearer and more concise.
Your entire decoder function could be rewritten as follows: (minus the debug code)
int endec(char filename[], char psdw[])
{
    int inphash = inhash(psdw) % 50;
    char temp[999999]; // really, should be std::vector<char>
    FILE *f;

    if ((f = fopen(filename, "rb")) == NULL)
    {
        printf("opening for read failed\n");
        return -1;
    }

    size_t crs = 0;
    int c;
    while ((c = fgetc(f)) != EOF)
    {
        inphash += 2;
        temp[crs++] = (char)(inphash ^ c);
    }
    fclose(f);

    if ((f = fopen(filename, "wb")) == NULL) // binary, for the same reason as the read
    {
        printf("opening for write failed\n");
        return -1;
    }

    if (fwrite(temp, 1, crs, f) != crs) // fwrite returns the number of elements written
    {
        printf("short write\n");
        fclose(f);
        return -1;
    }

    fclose(f);
    return 0;
}
Not stellar error handling, but it is error handling.
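Following up on the comment in the rewritten code ("really, should be std::vector<char>"): here is a sketch of the same routine with the fixed-size buffer replaced by a std::vector<char>, so files larger than 999999 bytes no longer overflow. inhash() is still the asker's hashing helper, assumed declared elsewhere:

#include <cstdio>
#include <vector>

int inhash(char psdw[]); // the asker's helper, declared elsewhere

int endec_vec(char filename[], char psdw[])
{
    int inphash = inhash(psdw) % 50;
    std::vector<char> temp; // grows as needed, no fixed limit

    FILE *f = fopen(filename, "rb");
    if (!f)
        return -1;

    int c;
    while ((c = fgetc(f)) != EOF)
    {
        inphash += 2;
        temp.push_back((char)(inphash ^ c));
    }
    fclose(f);

    if ((f = fopen(filename, "wb")) == NULL)
        return -1;
    size_t written = fwrite(temp.data(), 1, temp.size(), f);
    fclose(f);
    return written == temp.size() ? 0 : -1;
}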

C++: Store read binary file into buffer

I'm trying to read a binary file and store it in a buffer. The problem is that the binary file contains multiple null characters, and they are not at the end; they appear before other binary data. So if I print the buffer, everything after the first '\0' is lost.
Example:
char * a = "this is a\0 test";
cout << a;
This will just output: this is a
Here's my real code. This function reads one character:
bool CStream::Read(int * _OutChar)
{
    if (!bInitialized)
        return false;
    int iReturn = 0;
    *_OutChar = fgetc(pFile);
    if (*_OutChar == EOF)
        return false;
    return true;
}
And this is how I use it:
char * SendData = new char[4096 + 1];
for (i = 0; i < 4096; i++)
{
    if (Stream.Read(&iChar))
        SendData[i] = iChar;
    else
        break;
}
I just want to mention that there is a standard way to read from a binary file into a buffer.
Using <cstdio>:
char buffer[BUFFERSIZE];
FILE * filp = fopen("filename.bin", "rb");
size_t bytes_read = fread(buffer, sizeof(char), BUFFERSIZE, filp);
Using <fstream>:
std::ifstream fin("filename.bin", ios::in | ios::binary );
fin.read(buffer, BUFFERSIZE);
What you do with the buffer afterwards is all up to you of course.
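One detail with the <fstream> version: read() may stop short of BUFFERSIZE at the end of the file, and std::istream::gcount() reports how many bytes the last read actually stored. A minimal sketch (filename.bin is a placeholder, as above):

#include <fstream>
#include <iostream>

int main()
{
    char buffer[4096];
    std::ifstream fin("filename.bin", std::ios::in | std::ios::binary);
    fin.read(buffer, sizeof buffer);
    std::streamsize got = fin.gcount(); // bytes actually placed in buffer
    std::cout << "read " << got << " bytes\n";
    return 0;
}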
Edit: Full example using <cstdio>
#include <cstdio>
#include <algorithm> // for std::swap

const int BUFFERSIZE = 4096;

int main() {
    const char * fname = "filename.bin";
    FILE* filp = fopen(fname, "rb");
    if (!filp) { printf("Error: could not open file %s\n", fname); return -1; }
    char * buffer = new char[BUFFERSIZE];
    size_t bytes; // declared outside the condition; fread returns size_t
    while ((bytes = fread(buffer, sizeof(char), BUFFERSIZE, filp)) > 0) {
        // Do something with the bytes, the first elements of buffer.
        // For example, reversing the data and forgetting about it afterwards!
        for (char *beg = buffer, *end = buffer + bytes - 1; beg < end; beg++, end--) {
            std::swap(*beg, *end);
        }
    }
    // Done and close.
    delete[] buffer;
    fclose(filp);
    return 0;
}
static std::vector<unsigned char> read_binary_file(const std::string filename)
{
    // binary mode is only for switching off newline translation
    std::ifstream file(filename, std::ios::binary);
    file.unsetf(std::ios::skipws);

    std::streampos file_size;
    file.seekg(0, std::ios::end);
    file_size = file.tellg();
    file.seekg(0, std::ios::beg);

    std::vector<unsigned char> vec;
    vec.reserve(file_size);
    vec.insert(vec.begin(),
               std::istream_iterator<unsigned char>(file),
               std::istream_iterator<unsigned char>());
    return vec;
}
and then
auto vec = read_binary_file(filename);
auto src = (char*) new char[vec.size()];
std::copy(vec.begin(), vec.end(), src);
The problem is definitely in the writing of your buffer, not the reading, since you read a byte at a time.
If you know the length of the data in your buffer, you could force cout to go on:
char *bf = "Hello\0 world";
cout << bf << endl;
cout << string(bf, 12) << endl;
This should give the following output:
Hello
Hello world
However, this is a workaround, as cout is intended for printable data. Be aware that the output of non-printable chars such as '\0' is system dependent.
Alternative solutions:
If you manipulate binary data, you should define ad-hoc data structures and printing. Here are some hints, with a quick draft of the general principles:
struct Mybuff { // special structure to manage buffers of binary data
    static const int maxsz = 512;
    int size;
    char buffer[maxsz];
    void set(char *src, int sz) // binary copy of data of a given length
    { size = (sz < maxsz) ? sz : maxsz; memcpy(buffer, src, size); } // clamp to maxsz so we never overflow
};
Then you could overload the output operator function:
ostream& operator<< (ostream& os, Mybuff &b)
{
    for (int i = 0; i < b.size; i++)
        os.put(isprint((unsigned char)b.buffer[i]) ? b.buffer[i] : '*'); // non-printables replaced with *
    return os;
}
And you could use it like this:
char *bf = "Hello\0 world";
Mybuff my;
my.set(bf, 13); // physical copy of memory
cout << my << endl; // special output
I believe your problem is not in reading the data, but rather in how you try to print it.
char * a = "this is a\0 test";
cout << a;
The example you show us prints a C-string. Since a C-string is a sequence of chars terminated by '\0', the printing function stops at the first null char. It has to: the only ways to know where a string ends are a special terminating character (like '\0' here) or an explicit length.
So, to print whole data, you must know the length of it and use a loop similar to the one you use for reading it.
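For example, std::ostream::write() takes an explicit length instead of scanning for a terminator, so embedded '\0' bytes pass straight through. A minimal sketch:

#include <iostream>

int main()
{
    const char data[] = "this is a\0 test"; // 15 payload bytes + implicit trailing '\0'
    std::cout.write(data, sizeof data - 1); // explicit length: the embedded '\0' is written too
    std::cout << '\n';
    return 0;
}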
Are you on Windows? If so, you need to execute _setmode(_fileno(stdout), _O_BINARY); and include <fcntl.h> and <io.h>.
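A minimal sketch of that call (Microsoft CRT only; it disables the '\n' to "\r\n" translation on stdout):

#include <fcntl.h>
#include <io.h>
#include <cstdio>

int main()
{
    _setmode(_fileno(stdout), _O_BINARY); // stdout now passes bytes through untranslated
    fwrite("a\0b\n", 1, 4, stdout);       // all four bytes go out unmodified
    return 0;
}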

stack check fail in sha-1 c++

I'm having a __stack_chk_fail in the main thread and I have no idea why this is happening. I got the code from this website:
http://www.packetizer.com/security/sha1/
I'm trying to add a function that computes the digest of a file using the example.
.h file
#include <stdio.h>
#include <string>
std::string digestFile( char *filename );
.cpp file
std::string SHA1::digestFile(char *filename)
{
    Reset();

    FILE *fp = NULL;
    if (!(fp = fopen(filename, "rb")))
    {
        printf("sha: unable to open file %s\n", filename);
        return NULL;
    }

    char c = fgetc(fp);
    while (!feof(fp))
    {
        Input(c);
        c = fgetc(fp);
    }
    fclose(fp);

    unsigned message_digest[5];
    if (!Result(message_digest))
    { printf("sha: could not compute message digest for %s\n", filename); }

    std::string hash;
    for (int i = 0; i < 5; i++)
    {
        char buffer[8];
        int count = sprintf(buffer, "%08x", message_digest[i]);
        if (count != 8)
        { printf("converting unsiged to char ERROR"); }
        hash.append(buffer);
    }
    return hash;
}
__stack_chk_fail occurs when you write past the end of a buffer on the stack (the compiler's stack-smashing check detects the overwrite).
It turns out you do:
char buffer[8];
int count = sprintf(buffer, "%08x", message_digest[i]);
C strings are NUL-terminated. That means that when sprintf writes 8 digits, it adds a 9th char, '\0'. But buffer only has space for 8 chars, so the 9th goes past the end of the buffer.
You need char buffer[9]. Or do it the C++ way with std::stringstream, which does not involve any fixed sizes and thus no risk of buffer overrun.
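A sketch of the std::stringstream route (the function name digest_to_hex is mine, not part of the SHA1 class): std::hex, std::setw and std::setfill reproduce the "%08x" formatting without any fixed-size buffer.

#include <iomanip>
#include <sstream>
#include <string>

std::string digest_to_hex(const unsigned message_digest[5])
{
    std::ostringstream oss;
    for (int i = 0; i < 5; i++)
    {
        // same formatting as "%08x": lowercase hex, zero-padded to 8 digits
        oss << std::hex << std::setw(8) << std::setfill('0') << message_digest[i];
    }
    return oss.str();
}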


Random Bytes added to an end of a buffer

I was making a re-creation of some of the System.IO functions from that class. When I set up a buffer and allocate n bytes, it reads the bytes into the buffer, but then random bytes appear at the end of the buffer.
For example:
My Main:
int main(int argc, char *args[])
{
    SetConsoleTitle(TEXT("Stream Test."));
    cout << "Press any Key to begin reading.";
    cin.get();

    const char* data = File::ReadAllBytes(args[1]);
    Stream* stream = new Stream(data);

    char* magic = new char[8];
    stream->Read(magic, 0, 8);
    magic[8] = '\0';
    cout << magic << endl << endl;

    delete[] data;
    cout << "Press any key to quit.";
    cin.get();
    return 0;
}
and here is my System::IO namespace + stream class:
namespace System
{
    namespace IO
    {
        class File
        {
        public:
            static char* ReadAllBytes(const char *name)
            {
                ifstream fl(name, ifstream::in | ifstream::binary);
                fl.seekg(0, ifstream::end);
                size_t len = fl.tellg();
                char* ret = new char[len + 1];
                ret[len] = '\0';
                fl.seekg(0);
                fl.read(ret, len);
                fl.close();
                return ret;
            }

            // not sure of this use yet.
            static size_t fileSize(const char* filename)
            {
                ifstream in(filename, ifstream::in | ifstream::binary);
                in.seekg(0, ifstream::end);
                return in.tellg();
            }
        };

        class Stream
        {
        public:
            const char *_buffer;
            __int64 _origin;
            __int64 _position;
            __int64 _length;
            __int64 _capacity;
            bool _expandable;
            bool _writable;
            bool _exposable;
            bool _isOpen;
            static const int MemStreamMaxLength = 2147483647;

            Stream()
            {
                InitializeInstanceFields();
            }

            Stream(const char *buffer)
            {
                _buffer = buffer;
                _length = strlen(_buffer);
                _capacity = _length;
                _position = 0;
                _origin = 0;
                _expandable = false;
                _writable = true;
                _exposable = true;
                _isOpen = true;
            }

            int ReadByte()
            {
                if (_position >= _length)
                    return -1;
                return _buffer[_position++];
            }

            void Read(char* &buffer, int offset, int length)
            {
                if ((_position + offset + length) <= _length)
                {
                    memcpy(buffer, _buffer + (_position + offset), length);
                    _position += length;
                }
            }

        private:
            void InitializeInstanceFields()
            {
                _origin = 0;
                _position = 0;
                _length = 0;
                _capacity = 0;
                _expandable = false;
                _writable = false;
                _exposable = false;
                _isOpen = false;
            }
        };
    }
}
This is what ends up happening (screenshot omitted): the magic bytes print, followed by random garbage characters.
Can anyone explain why this happens, how I can fix it, or anything else? I'm new to C++, so any explanation would help. Also, please don't criticize my code; I know it may be bad, outdated, deprecated, etc., but I'm open to learning and any helpful advice. :)
You can only use operator << (char *) on C-style strings, not arbitrary arrays of characters. How would you expect it to know how many characters to output?
I would guess the file was not opened correctly, and thus the magic buffer is not set at all, which leaves it with uninitialized junk data:
If the constructor is not successful in opening the file, the object is still created although no file is associated to the stream buffer and the stream's failbit is set (which can be checked with inherited member fail).
http://www.cplusplus.com/reference/fstream/ifstream/ifstream/
Try adding more error checking along the way (using cout), especially when opening the file and reading into the buffer. Perhaps set the magic buffer to zero or something recognizable that gets overwritten on success.
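A sketch of what that checking could look like in ReadAllBytes (the messages and the null return are placeholders, not the asker's design):

#include <cstddef>
#include <fstream>
#include <iostream>

char* ReadAllBytes(const char *name)
{
    std::ifstream fl(name, std::ifstream::in | std::ifstream::binary);
    if (!fl) // open failed: the failbit is set, as the quote above describes
    {
        std::cout << "could not open " << name << std::endl;
        return 0;
    }
    fl.seekg(0, std::ifstream::end);
    std::size_t len = fl.tellg();
    char* ret = new char[len + 1];
    ret[len] = '\0'; // only helps for text; binary data may contain '\0' earlier
    fl.seekg(0);
    fl.read(ret, len);
    if (!fl) // the read failed or came up short
        std::cout << "read only " << fl.gcount() << " of " << len << " bytes" << std::endl;
    return ret;
}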