fgets only reads the first line of the file - C++

I am trying to read a file from my .cpp file. I am using the C libraries, so don't be confused by that.
The problem is exactly what the title says: fgets can read the first line, but when it comes to the second line it can read neither the second line nor the rest of the file (the program exits as soon as a problem occurs).
Here is the relevant part of the code:
void read_input()
{
    int i = 0, N = 5;
    char str[STR_SIZE], line[STR_SIZE];
    FILE *fp;

    fp = fopen("out", "r");
    if (!fp)
    {
        fprintf(stderr, "error: file could not be opened\n");
        exit(1);
    }
    for (i = 0; i < 2; i++)
    {
        if (fgets(str, STR_SIZE, fp) == NULL)
        {
            fprintf(stderr, "error: failed at file reading\n");
            exit(1);
        }
        if (feof(fp))
        {
            fprintf(stderr, "error: not enough lines in file\n");
            exit(1);
        }
        if ((sscanf(str, "%s", line) != 1))
        {
            fprintf(stderr, "error: invalid file format\n");
            exit(1);
        }
        printf("%d\t%s\n", i, line);
        fclose(fp);
    }
}

I believe the problem is that you call fclose(fp); inside the loop. After the very first iteration fp has already been passed to fclose(), so any use of fp in a later iteration invokes undefined behavior, because the stream is no longer valid.
Solution: move the fclose(fp); outside the loop.

You are closing the file in the loop! Put the fclose call outside of the loop.
for (i = 0; i < 2; i++)
{
    ....
    printf("%d\t%s\n", i, line);
    fclose(fp); // <-- here, move this out of the loop.
}
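For completeness, here is a minimal corrected sketch of read_input() with fclose() moved after the loop. It assumes STR_SIZE is defined elsewhere, as in the original, and drops the feof() check because a NULL return from fgets() already signals that no more lines could be read:
/* requires <stdio.h> and <stdlib.h>, as in the original file */
void read_input()
{
    int i;
    char str[STR_SIZE], line[STR_SIZE];
    FILE *fp = fopen("out", "r");

    if (!fp)
    {
        fprintf(stderr, "error: file could not be opened\n");
        exit(1);
    }
    for (i = 0; i < 2; i++)
    {
        if (fgets(str, STR_SIZE, fp) == NULL)
        {
            fprintf(stderr, "error: failed at file reading\n");
            exit(1);
        }
        if (sscanf(str, "%s", line) != 1)
        {
            fprintf(stderr, "error: invalid file format\n");
            exit(1);
        }
        printf("%d\t%s\n", i, line);
    }
    fclose(fp); /* close once, after all lines have been read */
}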

Related

C++ remove() and rename() gives "Permission error"

I can't figure out what is happening in my program. It's a simple function that strips all the Windows '\r' characters from a file, writing the remaining chars to another file and then renaming it to replace the old file. Every time I execute the function, the rename() and remove() calls give me a "Permission error", even though all the file pointers are closed and the file is not open in any other program on my PC. Here's the code:
static bool correctFile(string fileName) {
    string name = fileName;
    FILE* test = fopen(fileName.c_str(), "rb");
    FILE *in, *out;
    char stringTest[1000];
    bool isWinFile = false;
    if (!test) {
        return false;
    }
    fread(stringTest, 1, 1000, test);
    fclose(test);
    for (size_t i = 0; i < strlen(stringTest) && !isWinFile; i++) {
        if (stringTest[i] == '\r') {
            isWinFile = true;
        }
    }
    if (isWinFile) {
        in = fopen(fileName.c_str(), "rb");
        string tempFile = name + ".temp";
        out = fopen(tempFile.c_str(), "wb");
        if (!in || !out) {
            return false;
        }
        char temp;
        while (fread(&temp, sizeof(temp), 1, in) > 0) {
            if (temp != '\r') {
                fwrite(&temp, sizeof(temp), 1, out);
            }
        }
        fclose(in);
        fclose(out);
        if (std::remove(fileName.c_str())) {
            std::cerr << "Error: " << strerror(errno);
            return false;
        }
        if (std::rename(tempFile.c_str(), fileName.c_str())) {
            return false;
        }
    }
    return true;
}
If you find an error in this, please tell me. Thanks.
I found out that, for some reason, the old file and the new file must not be in the same folder.
Disable the virus scanner and see if the problem persists. Some antivirus products block access to files for a short time after they have been written or modified; I know Kaspersky does this, for example.
I have an "antivirus" retry loop in some of my batch files because of this. So one (ugly) solution is to retry the rename/remove operations a couple of times.

EOF sign in the middle of a text file [duplicate]

I am writing an XOR encryption program which works fine during encryption, but during decryption the line
char ca2 = fgetc(f);
gets stuck at one point and no decryption takes place after that. My best guess about the problem (the encrypted file contains all sorts of characters) is that as soon as fgetc reaches an EOF mark, which can be present before the actual end of the file, it gets stuck there and stops reading the following characters.
Is this some kind of limitation of getc()? Here is my rubbish code:
int get_file_size(char filename[])
{
    FILE *p_file = NULL;
    p_file = fopen(filename, "rb");
    fseek(p_file, 0, SEEK_END);
    int size = ftell(p_file);
    fclose(p_file);
    return size;
}
int endec(char filename[], char psdw[])
{
    FILE *f;
    int hashed = 0, ed = 0;
    int inphash = inhash(psdw);
    inphash = inphash % 50;
    f = fopen(filename, "r");
    if (f == NULL)
        printf("failed");
    char temps[999999];
    long int crs = 0, j = 0;
    int filesz = get_file_size(filename);
    printf("file size = %d\n\n", filesz);
    while (1) {
        inphash = inphash + 2;
        char ca = (char)inphash;
        char ca2 = fgetc(f);
        printf("%c\n", ca2);
        if (crs >= filesz)
            break;
        temps[crs] = ca2 ^ ca;
        crs++;
    }
    fclose(f);
    printf("%d", strlen(temps));
    FILE *fp;
    fp = fopen(filename, "wt");
    for (j = 0; j < crs; j++) {
        putc(temps[j], fp);
        printf("%c", temps[j]);
    }
    fclose(fp);
}
Your problem is right here:
f=fopen(filename,"r");
You open the file for text reading, not for binary. Your file size function gets it right, but your decoder function does not.
The idiomatic way to read a file character by character using the C-style IO routines is like this:
f = fopen(filename, "rb");
if (!f)
// handle error
int c; // NOTE: int, not char!
while ( (c = fgetc(f)) != EOF )
{
// do something with 'c'
}
This idiom does not require you to get the file size as a separate operation. You can rewrite your XOR "encryption" routine with a simple loop of the above form. It will be much clearer and more concise.
Your entire decoder function could be rewritten as follows: (minus the debug code)
int endec(char filename[], char psdw[])
{
    int inphash = inhash(psdw) % 50;
    char temp[999999]; // really, should be std::vector<char>
    FILE *f;

    if ((f = fopen(filename, "rb")) == NULL)
    {
        printf("opening for read failed\n");
        return -1;
    }

    size_t crs = 0;
    int c;
    while ((c = fgetc(f)) != EOF)
    {
        inphash += 2;
        temp[crs++] = (char)(inphash ^ c);
    }
    fclose(f);

    if ((f = fopen(filename, "wb")) == NULL) // write in binary mode too, so the XORed bytes are not translated
    {
        printf("opening for write failed\n");
        return -1;
    }
    if (fwrite(temp, 1, crs, f) != crs) // fwrite returns the element count, so write crs elements of size 1
    {
        printf("short write\n");
        fclose(f);
        return -1;
    }
    fclose(f);
    return 0;
}
Not stellar error handling, but it is error handling.
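To see why the open mode matters, here is a small self-contained demonstration. The Windows-specific behavior is an assumption based on the Microsoft CRT, which treats a 0x1A (Ctrl-Z) byte as end-of-file in text mode; on POSIX systems both modes behave the same:
#include <stdio.h>

int main(void)
{
    /* Write a 5-byte file with a Ctrl-Z (0x1A) byte in the middle. */
    const char data[] = { 'A', 'B', 0x1A, 'C', 'D' };
    FILE *f = fopen("demo.bin", "wb");
    fwrite(data, 1, sizeof data, f);
    fclose(f);

    /* Read it back in text mode ("r") and in binary mode ("rb"). */
    const char *modes[] = { "r", "rb" };
    for (int m = 0; m < 2; m++)
    {
        int c, count = 0;
        f = fopen("demo.bin", modes[m]);
        while ((c = fgetc(f)) != EOF)
            count++;
        fclose(f);
        printf("mode \"%s\": read %d bytes\n", modes[m], count);
    }
    return 0;
}
On Windows the "r" run typically stops after 2 bytes while "rb" reads all 5; that early stop is exactly the "EOF mark in the middle of the file" described in the question.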

Debug Assertion Failed VS2010

I'm making a very simple program to read in from a text file and print the contents. When the program compiles and runs, I keep getting this "Debug Assertion Failed" message!
I've never seen it before and can't seem to find any solutions.
(It won't let me post an image because my rep isn't high enough!)
The code:
#include <stdio.h>
#include <stdlib.h>

int main()
{
    FILE *file = fopen("C:\\Users\Kyne\\Desktop\\AdvProgrammingAssignment\\employees.txt", "r");
    char c;
    do
    {
        c = fgetc(file);
        printf("%c", c);
    }
    while (c != EOF);
    fclose(file);
    return 0;
    printf("\n\n\n");
    system("pause");
}
Step through your code using the debugger to find the line that is causing the debug assertion, and check to see if the file is opened.
In the line
FILE *file = fopen("C:\\Users\Kyne\\Desktop\\AdvProgrammingAssignment\\employees.txt", "r");
it looks like you missed a '\' before 'Kyne' so it should be
FILE *file = fopen("C:\\Users\\Kyne\\Desktop\\AdvProgrammingAssignment\\employees.txt", "r");
There are other issues like calling return 0; before the end of the main block.
I don't see any check whether the file was opened properly. Also, I would check for the EOF mark before the first read - use while and feof() instead. Finally, these lines:
printf("\n\n\n");
system("pause");
will never get called, as you return 0 right after fclose() - move the return 0 to the end.
Try this:
int main()
{
    FILE *file = fopen("C:\\Users\\Kyne\\Desktop\\AdvProgrammingAssignment\\employees.txt", "r");
    if (!file)
    {
        printf("File could not be opened!\n");
        return -1;
    }
    while (!feof(file))
    {
        int c = fgetc(file);    /* int, so EOF can be distinguished from a valid character */
        if (c == EOF)           /* feof() only becomes true after a read has failed */
            break;
        printf("%c", c);
    }
    fclose(file);
    printf("\n\n\n");
    system("pause");
    return 0;
}
Most likely, your error originated in using a FILE* that was set to NULL - you have one backslash missing after \\Users, so the file was probably never opened and fopen() kept returning NULL.

Lingering open files resulting in "Too many open files"

I have code using boost to list directory contents, iterate through each file, and do some data processing stuff. The results are being printed to an output file ('histFile').
After ~2555 files have been processed, I get the error:
boost::filesystem::directory_iterator::construct: Too many open files: "/Users/.../.../.../directory_with_files"
My code is:
for (int i = 0; i < 10000; i++) {
    FILE *histFile;
    string outputFileName = "somename";
    bool ifRet = initFile(histFile, outputFileName.c_str(), "a"); // 1
    fclose(histFile); // 2
}
If I comment out the last two lines above ('1' and '2'), the code finishes fine. Thus it seems copies of 'histFile' are being left open, but I don't understand how! This is the operative part of the method:
bool initFile(FILE *&ofFile, const char *fileName, const char *openType, int overwriteOption) {
    if (overwriteOption < 0 || overwriteOption > 2) {
        fprintf(stderr, "ERROR: ToolBox - initFile() : unknown 'overwriteOption' (%d), setting to (0)!\n", overwriteOption);
    }

    // Read-Only
    if (openType == "r") {
        if (ofFile = fopen(fileName, "r")) { return true; }
        fprintf(stderr, "ERROR: Could not open file (%s)!\n", fileName);
        return false;
    }

    // Appending:
    if (openType == "a" || openType == "a+") {
        // Check if file already exists
        if (!fopen(fileName, "r")) {
            fprintf(stderr, "ERROR: (%s) File does not Exist, cannot append!\n", fileName);
            return false;
        }
        if (ofFile = fopen(fileName, openType)) { return true; }
    }

    // Writing:
    // if file already exists
    if (FILE *temp = fopen(fileName, "r")) {
        if (overwriteOption == 2) {
            fprintf(stderr, "ERROR: (%s) File Exists!\n", fileName);
            return false;
        }
        if (overwriteOption == 1) {
        }
        if (overwriteOption == 0) {
            char backupFileName[TB_CHARLIMIT], backupPrefix[TB_CHARLIMIT];
            strcpy(backupFileName, fileName); // copy filename
            // create a prefix w/ format '<YYYYMMDD>BACKUP_'
            DateTime now;
            sprintf(backupPrefix, "%s", now.getDateStr().c_str());
            strcat(backupPrefix, "BACKUP_");
            // add to copied filename, and move file
            strcpy(backupFileName, prependFileName(backupFileName, backupPrefix));
            moveFile(fileName, backupFileName);
        }
        fclose(temp);
    }

    if (ofFile = fopen(fileName, openType)) { return true; }

    // Default: Return error and false
    fprintf(stderr, "ERROR: Could not open file (%s)!\n", fileName);
    return false;
}
Am I doing something wrong with pointers/references?
Any help greatly appreciated!
You're leaking a handle in this bit of code when you're testing if the file exists already:
// Appending:
if (openType == "a" || openType == "a+") {
    // Check if file already exists
    if (!fopen(fileName, "r")) { // <-- the FILE* opened here is leaked
        fprintf(stderr, "ERROR: (%s) File does not Exist, cannot append!\n", fileName);
        return false;
    }
    if (ofFile = fopen(fileName, openType)) { return true; }
}
Is there really a reason to make that check? Why not just let the file be created if it doesn't already exist?
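If the existence check is kept, a leak-free version of that branch could look like the following sketch. openForAppend is a hypothetical helper name, not part of the original code, and it uses strcmp() instead of comparing openType against string literals with ==, which compares pointers rather than contents:
#include <cstdio>
#include <cstring>

// Open a file for appending only if it already exists, closing the probe
// handle so it is not leaked. Mirrors the append branch quoted above.
static bool openForAppend(FILE *&ofFile, const char *fileName, const char *openType) {
    if (strcmp(openType, "a") != 0 && strcmp(openType, "a+") != 0)
        return false;                     // not an append request
    FILE *probe = fopen(fileName, "r");   // probe only to check existence
    if (!probe) {
        fprintf(stderr, "ERROR: (%s) File does not Exist, cannot append!\n", fileName);
        return false;
    }
    fclose(probe);                        // close the probe so no handle is leaked
    return (ofFile = fopen(fileName, openType)) != NULL;
}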