Segmentation fault error - C programming - C++

I am receiving a Segmentation Fault error when I run this program. To summarize, the program reads in multiple data files (129 of them). Each file contains information about names: how many people were given a specific name in a specific year, along with the gender. For now, I am trying to store each name in a linked list. Whenever I read in more than about 4 data files, I get the Segmentation Fault error. I have written this program in Java, which was much simpler. If anybody could point me in the right direction I would appreciate it; I am almost certain this has to do with memory allocation, but I cannot seem to solve it myself. Thank you.
#include <iostream>
#include <fstream>
#include <stdio.h>
#include <string.h>
#include <cstdlib>
using namespace std;
//typedef struct Node NodeStruct;
struct Node {
    char *namePtr;
    Node *nextPtr;
};

int main() {
    // Declare variables
    Node *headPtr = NULL;
    Node *tempPtr;
    Node *currentPtr;

    // Variables for reading in a file
    FILE *filePtr;
    char fileName[20];
    int i;
    int nameLength;
    char inputLine[81];

    cout << "Reading from data files, please be patient...\n";

    // Loop through files
    for (i = 1880; i <= 2009; i++) {
        sprintf(fileName, "data/yob%d.txt", i);
        filePtr = fopen(fileName, "r"); // Open the file

        // Check to ensure file was opened
        if (filePtr == NULL) {
            cout << "Error opening input file...check location of data files\n";
            exit(-1); // Exit program
        } // End if statement

        while (fscanf(filePtr, "%s", inputLine) != EOF) {
            // Create a node
            tempPtr = (Node *) malloc(sizeof(Node));
            tempPtr->nextPtr = NULL;

            // Set the head pointer of first node
            if (headPtr == NULL) {
                headPtr = tempPtr;
                currentPtr = tempPtr;
            } // End if statement

            // Link the list
            currentPtr->nextPtr = tempPtr;
            currentPtr = currentPtr->nextPtr;

            // Create pointer variables
            char *startPtr = inputLine;
            char *endPtr = NULL;
            endPtr = strchr(inputLine, ','); // Point to end of name
            int length = endPtr - inputLine; // Calculate length

            // Create space for the name
            tempPtr->namePtr = (char *) malloc(sizeof(length + 1));
            strncpy(tempPtr->namePtr, startPtr, length); // Store pointer to name
            // cout << tempPtr->namePtr << endl;
        }
    } // End of for (i = 1880...

    cout << "Done reading from data files...\n";
} // End of main function

Surely
tempPtr->namePtr = (char *) malloc(sizeof(length + 1));
should be
tempPtr->namePtr = (char *) malloc(length + 1);
since you copy that many characters into the buffer. sizeof (length + 1) is just sizeof(int), typically four bytes, regardless of how long the name is. Not enough memory was being allocated, so the strncpy that followed was overwriting memory not belonging to you.
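A minimal sketch of the corrected allocation, using the variables already in your loop. One extra note from me: strncpy does not append a terminator when it copies exactly length characters, so terminate the buffer yourself before using it as a string:

int length = endPtr - inputLine;                // length of the name before the comma
tempPtr->namePtr = (char *) malloc(length + 1); // room for the name plus '\0'
strncpy(tempPtr->namePtr, startPtr, length);    // copy exactly 'length' characters
tempPtr->namePtr[length] = '\0';                // strncpy does not terminate here, so do it explicitly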

Rather than find your bug, let's try to teach you some practical lessons. Here are my C++ rules for C programmers:
1) Don't use pointers to track your data. The standard containers work just fine for that.
2) Don't use pointers to manage your strings. The standard string type works just fine for that.
3) Don't use pointers for anything else, until you need to learn how polymorphism works.
4) Don't use malloc at all. Ever.
5) Don't use new, hardly at all.
The neat thing about not using pointers (or arrays) is that you will never make pointer bugs. No more buffer overflows, no more segmentation faults. Joy!
Here is my translation of your program into idiomatic C++. Because I let std::list do all of the list management, all of the silly headPtr, nextPtr, etc., goes away. Because I let std::string do all of the string management, I don't need to malloc(strlen()) (or fix the bug and malloc(strlen() + 1)). Because I use the RAII idiom, I don't have to worry about closing my files.
It all just works.
#include <iostream>
#include <fstream>
#include <string>
#include <list>
#include <cstdlib>
#include <sstream>
using std::string;
using std::cout;
using std::list;
using std::ifstream;
using std::stringstream;
int main() {
    // Declare variables
    list<string> names;

    cout << "Reading from data files, please be patient...\n";

    // Loop through files
    for (int i = 1880; i <= 2009; i++) {
        stringstream fileName;
        fileName << "data/yob" << i << ".txt";

        ifstream filePtr(fileName.str().c_str());
        if (!filePtr.is_open()) {
            cout << "Error opening input file: " << fileName.str() << " ...check location of data files\n";
            exit(-1); // Exit program
        }

        string inputLine;
        while (filePtr >> inputLine) {
            names.push_back(inputLine);
        }
    } // End of for (i = 1880...

    cout << "Done reading from data files...\n";
} // End of main function

I know this one is already answered, but you should also be in the habit of closing your files; in your case, with something like fclose(filePtr). By the way, if you learn how to use std::ifstream you don't have to close at all: the file is closed automatically when the std::ifstream goes out of scope.
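Roughly like this, based on the loop in your question (just a sketch of where the call goes):

// Sketch: close each file before moving on to the next one in the for loop
while (fscanf(filePtr, "%s", inputLine) != EOF) {
    // ... build the list as before ...
}
fclose(filePtr); // release the handle; otherwise every file stays open until the program exits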

Related

Can't understand why I'm getting a segfault

I'm using very little memory on the stack, and I have no recursion, and all my memory access is on the stack. So why am I getting a segfault?
#include <iostream>
#include <string>
#include <stdio.h>
using namespace std;
int main(int argc, char *argv[]) {
    FILE *file = fopen("test.cpp", "r");

    struct item {
        char *type;
        int price;
        bool wanted;
    };

    item items[100];
    char *temp;

    if (file)
        cout << "works up to here" << endl;

    fscanf(file,
           "%s, %[for sale wanted], %d",
           items[0].type,
           temp,
           &items[0].price);
}
It prints out
works up to here
Segmentation fault (core dumped)
You are passing pointers to fscanf that are not initialized. You need to do something like this:
(if you are using C)
FILE* file = fopen(...);
char* str = malloc(N);
fscanf(file, "%s", str);
printf("Read %s\n", str);
free(str);
fclose(file);
(if you are actually using C++)
std::ifstream file(...);
std::string str;
file >> str;
std::cout << "Read " << str << std::endl;
The scanf() functions won't allocate any memory. From the looks of it, you are passing uninitialized pointers to fscanf() where the function expects arrays of sufficient size instead.
Most likely you'd use something like
items[0].type = new char[101]; // room for up to 100 characters plus '\0'
char temp[20];
if (3 == fscanf(file, "%100s, %19[for sale wanted], %d",
                items[0].type,
                temp,
                &items[0].price)) {
    // deal with a read item
}
else {
    // deal with an input error
}
(I'm not sufficiently familiar with fscanf() to be confident about the middle format specifier).
Did you check if the file pointer is not null?
This is from fopen reference doc:
If the file is successfully opened, the function returns a pointer to
a FILE object that can be used to identify the stream on future
operations.
Otherwise, a null pointer is returned.
And as Kevin mentioned, this is more like C than C++
I see a couple of problems.
You are not checking whether fopen() was successful.
You are trying to read into items[0].type, which has not been initialized to point to anything valid.
You will be better off using std::ifstream and std::string instead of using FILE* and char*.
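A rough sketch of that approach for the struct in your question. I'm assuming each record is three whitespace-separated fields (type, status word, price); adjust the parsing if your real input is comma-separated:

#include <fstream>
#include <iostream>
#include <string>
#include <vector>

struct item {
    std::string type; // std::string manages its own memory
    int price = 0;
    bool wanted = false;
};

int main() {
    std::ifstream file("test.cpp"); // same file name as in the question
    if (!file) {
        std::cerr << "could not open file\n";
        return 1;
    }
    std::vector<item> items;
    item it;
    std::string status;
    // Assumed record layout: type status price (whitespace separated)
    while (file >> it.type >> status >> it.price) {
        it.wanted = (status == "wanted");
        items.push_back(it);
    }
}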

Using strtok/strtok_r in a while loop in C++

I'm getting unexpected behavior from the strtok and strtok_r functions:
queue<string> tks;
char line[1024];
char *savePtr = 0;

while (true)
{
    //get input from user store in line
    tks.push(strtok_r(line, " \n", &savePtr)); //initial push only works right during first loop

    char *p = nullptr;
    for (...)
    {
        p = strtok_r(NULL, " \n", &savePtr);
        if (p == NULL)
        {
            break;
        }
        tks.push(p);
    }
    delete p;
    savePtr = NULL;
    //do stuff, clear out tks before looping again
}
I've tried using strtok and realized that during the second loop, the initial push is not occurring. I attempted to use the reentrant version strtok_r in order to control what the saved pointer is pointing to during the second loop by making sure it is null before looping again.
tks is only correctly populated during the first time through the loop - subsequent loops give varying results depending on the length of line
What am I missing here?
Just focusing on the inner loop and chopping off all of the stuff I don't see as necessary.
#include <iostream>
#include <queue>
#include <string>
#include <cstring>

using namespace std;

int main()
{
    std::queue<std::string> tks;
    while (true)
    {
        char line[1024];
        char *savePtr;
        char *p;

        cin.getline(line, sizeof(line));

        p = strtok_r(line, " \n", &savePtr); // initial read. contents of savePtr ignored
        while (p != NULL) // exit when no more data, which includes an empty line
        {
            tks.push(p); // got data, store it
            p = strtok_r(NULL, " \n", &savePtr); // get next token
        }
        // consume tks
    }
}
I prefer the while loop over the for loop used by Toby Speight in his answer because I think it is more transparent and easier to read. Your mileage may vary. By the time the compiler is done with it they will be identical.
There is no need to delete any memory. It is all statically allocated. There is no need to clear anything before the next round except for tks. savePtr will be reset by the first strtok_r.
There is a failure case if the user inputs more than 1024 characters on a line, but this will not crash. If this still doesn't work, look into how you're consuming tks. It's not posted so we can't troubleshoot that portion.
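If you want to guard against that case, one way (a sketch, not the only way) is to detect the failed getline and throw away the rest of the overlong line; this drops into the loop above and needs <limits>:

cin.getline(line, sizeof(line));
if (cin.fail() && !cin.eof()) {
    cin.clear();                                          // getline set failbit because the line didn't fit
    cin.ignore(numeric_limits<streamsize>::max(), '\n');  // discard the remainder of the overlong line
}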
I wholeheartedly recommend changing to a string-based solution if possible. This is a really simple, easy-to-write, but slower, one:
#include <iostream>
#include <queue>
#include <string>
#include <sstream>

int main()
{
    std::queue<std::string> tks;
    while (true)
    {
        std::string line;
        std::getline(std::cin, line);

        std::stringstream linestream(line);
        std::string word;
        // parse only on ' ', not on the usual all whitespace of >>
        while (std::getline(linestream, word, ' '))
        {
            tks.push(word);
        }
        // consume tks
    }
}
Your code wouldn't compile for me, so I fixed it:
#include <iostream>
#include <queue>
#include <string>
#include <cstring>

std::queue<std::string> tks;

int main() {
    char line[1024] = "one \ntwo \nthree\n";
    char *savePtr = 0;

    for (char *p = strtok_r(line, " \n", &savePtr); p;
         p = strtok_r(nullptr, " \n", &savePtr))
        tks.push(p);

    // Did we read it correctly?
    for (; tks.size() > 0; tks.pop())
        std::cout << ">" << tks.front() << "<" << std::endl;
}
This produces the expected output:
>one<
>two<
>three<
So your problem isn't with the code you posted.
If you have the option to use boost, try this out to tokenize a string, providing your own string and delimiters of course.
#include <vector>
#include <boost/algorithm/string.hpp>

int main()
{
    std::string str = "Any\nString\nYou want";
    std::vector<std::string> results;
    boost::split(results, str, boost::is_any_of("\n"));
}

Why is there a question mark at the end of the string?

Please help me figure out why there is a question mark in my string output. I have been working on this simple reverse-a-string exercise, and my code reverses the string correctly. Then I try to store the reversed string into an array and convert that array back into a string, and something weird happens: there is always a question mark at the end of the string. Please explain the reason for it, and how to get rid of this question mark. Here is my code. Thank you so much.
#include <iostream>
#include <string>
using namespace std;

int main()
{
    cout << "Please enter a string." << endl;
    string str;
    cin >> str;

    int i = 0;
    int length = str.length();
    char arr[length];
    //cout<<length;

    while (length != 0) {
        arr[i] = str.at(length-1);
        length--;
        cout << arr[i];
        ++i;
    }
    cout << endl;

    string stri(arr);
    cout << endl << stri << endl;
    cout << stri[4];
    return 0;
}
A string in C (or many other languages) needs to be terminated by a '\0'. After all, you don't know how large the memory your char* points to is. So you need char[length + 1]. Also, variable-length arrays aren't part of C++. You need to use new[]/delete[] or malloc()/free():
char *arr = new char[length + 1]; // enough space for string + '\0'
arr[length] = '\0';               // terminate the string with '\0'
while (length != 0) {
    // ...                        // reverse as before
}
cout << endl;

string stri(arr);                 // create the new string
delete[] arr;                     // don't forget to deallocate the memory
However, if you're dropping down to manual memory allocation, you're usually missing something from the standard library. And indeed, you could simply use the right constructor (it's (4), simplified below):
template <class InputIt>
string::string(InputIt first, InputIt last);
And luckily, std::string provides reverse iterators (which satisfy the InputIt requirement) that traverse the string backwards, via std::string::rbegin() and std::string::rend(). Now your code gets a lot easier:
#include <iostream>
#include <string>

int main()
{
    std::cout << "Please enter a string." << std::endl;
    std::string str;
    std::cin >> str;
    std::cout << std::endl;

    // Create the new reversed string directly:
    std::string reversed_string(str.rbegin(), str.rend());
    std::cout << reversed_string << std::endl;

    return 0;
}
char arr[length];
should be
char arr[length + 1];
EDIT: or rather, as Jonathan Potter points out, since length is not a compile-time constant, your code likely only compiles because variable-length arrays are permitted as an extension by your specific compiler (e.g. GNU C++):
char *arr = new char[length + 1];
(and delete [] arr; at some point)
to store the terminating '\0':
arr[length] = '\0';

Dynamic memory allocation while reading in a file

I am trying to use dynamic memory for this project. I am getting a seg fault but I cannot figure out what I am doing incorrectly. Can anyone point to where my mistake is? The file seems to read in correctly, but I'm assuming the fault is a rogue pointer. Help!
I am just trying to read in "heart two 2" through "spade ace 11" from a file, all words separated by a space. My program worked before I switched to dynamic memory.
#include <iostream>
#include <fstream>
#include <ctime>
#include <stdlib.h>
#include <string>
#include <cstring>
using namespace std;

//global constant(s)
const int maxCards = 52;

//Structs
struct card
{
    char *suit;
    char *rank;
    int cvalue;
    char location;
};

void readDeck(card* deckPtr);
void cardsInit(char *finNameP, card *deckPtr);

//program
int main()
{
    card *deckPtr = new card[52];
    char *finNameP = new char[13];
    strcpy(finNameP, "cardFile.txt");

    cardsInit(finNameP, deckPtr); // function i wrote that works
    readDeck(deckPtr);            // simply reads the deck from &deckPtr[0] -> &deck[51]

    delete [] finNameP;
}

void cardsInit(char *finNameP, card *deckPtr)
{
    //set up card file to be read in
    ifstream fin;
    cout << "Please enter file name...(cardFile.txt)" << endl;
    cin >> *finNameP;
    fin.open(finNameP);

    //create pointer and set initial value
    card *deckHome = deckPtr;
    for (int i = 0; i < 52; i++)
    {
        (*deckPtr).suit = new char[9];
        (*deckPtr).rank = new char[9];
        deckPtr++;
    }
    deckPtr = deckHome;

    //check if cardFile.txt opens correctly
    if (!fin.good())
    {
        cout << "Error with card file" << endl;
    }
    else
    {
        while (fin.good())
        {
            for (deckPtr = &deckPtr[0]; deckPtr < &deckPtr[maxCards]; deckPtr++)
            {
                fin >> (*deckPtr).suit;
                fin >> (*deckPtr).rank;
                fin >> (*deckPtr).cvalue;
            }
        }
    }
    fin.close();

    delete [] finNameP;
    delete [] (*deckPtr).suit;
    delete [] (*deckPtr).rank;
}
This is a really ancient way to program. Instead of using new, use std::string or std::vector<char>. Those also use dynamic memory but they make it much harder for you to accidentally cause memory allocation bugs.
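For example, the struct and the deck could look something like this (just a sketch of the idea, not a drop-in replacement for the rest of your code):

#include <string>
#include <vector>

struct card
{
    std::string suit; // no manual new[]/delete[] needed
    std::string rank;
    int cvalue;
    char location;
};

std::vector<card> deck(52); // the vector owns the 52 cards and frees them automatically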
The first problem comes here:
cin >> *finNameP;
Since finNameP has type char *, *finNameP has type char. So this instruction reads a single character. Then you go on to do fin.open(finNameP);, which causes undefined behaviour because there is no string in finNameP.
The simplest fix is to make finNameP a std::string. Note that doing cin >> finNameP (without changing the type) would compile; however, it is a bad idea because there is no buffer overflow protection. You could write cin >> setw(12) >> finNameP; but that is still substantially worse than using a string.
deckPtr < &deckPtr[maxCards] is always true, so the for loop runs forever.
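A sketch of a loop bound that actually terminates, comparing against a fixed end instead of against the moving deckPtr itself:

// Sketch: walk exactly maxCards cards; the end marker does not move with the cursor
for (card *p = deckPtr; p < deckPtr + maxCards; ++p)
{
    fin >> p->suit >> p->rank >> p->cvalue;
}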

Problems using pointers c++

#include <iostream>
#include <fstream>
#include <cstring>
#include <map>
using namespace std;

int main()
{
    cout << "Hello world!" << endl;

    //define a bool to determine if you're still in the file's header section or not
    bool header = true;

    //define the map as a multidimensional string array that stores up to 100 z-levels
    //and 50x50 tiles per z level
    char* worldmap[100][50][50];
    int zLevel = 0;

    //header declaration map
    map<string, char*> declarations;

    //create an input file stream object
    ifstream file;
    //open file
    file.open("map1.dmm");
    //validate file
    if (!file.good()) { return 1; }

    //begin reading the file
    while (!file.eof()) {
        //create a character array to write to
        char line[512];
        //read the file line by line; write each line to the character array defined above
        file.getline(line, 512);

        //header check
        if (header) {
            if (!line[0]) {
                header = false;
                break;
            } else {
                bool declaringKey = true;
                char* key = {};
                char* element = {};
                char* token[20] = {};
                token[0] = strtok(line, "\"()");

                for (unsigned int n = 0; n < 20; n++) {
                    if (n > 0) token[n] = strtok(NULL, "\"()");
                    //cout << token[0] << endl;
                    if (!token[n] || (token[n])[1] == '=') continue;

                    if (declaringKey) {
                        key = token[n];
                        declaringKey = false;
                    } else {
                        //cout << "pow" << endl;
                        element = token[n];
                        cout << "declarations[" << key << "] = " << element << endl;
                        declarations.emplace(key, element); //<-------------- problem line, i think
                        cout << declarations[key] << endl;
                    }
                }
                declaringKey = true;
            }
        } else {
            if (!line[0]) {
                zLevel++;
                continue;
            }
        }
    }
    //string test = "aa";
    return 0;
}
I'm trying to create a map loader that loads a simple map from a text file. I'm aware that there are other map loaders available but most of them do far more than I need them to. So far, this code only reads the header, which basically defines what each set of characters represents as a tile, for example: "aa" = "sand tile"
The problem is, when I'm emplacing the key/element into the declarations map, it seems to use the same element for all keys. I'm assuming that this is because by defining a character pointer it always points to the same data, and only serves the purpose of changing the value contained by that pointer, rather than allocating new data and keeping them separate.
But that raises another question: why does it emplace a different key with the same element, even though both are pointers? Anyways,
How can I make it so that all keys/elements are independent character arrays, rather than pointers to the exact same space carved out by the array?
EDIT: You can just assume the code works other than the fact that it stores the same element to all keys. Also, the element it stores is always the last one that's parsed out of the file's header section.
Just use a std::string for the value, too. This should solve your immediate problem.
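In your code that just means changing the map's mapped type; a minimal sketch:

#include <map>
#include <string>

std::map<std::string, std::string> declarations; // value is now a std::string too
// ...
declarations.emplace(key, element); // the map now stores its own copy of the characters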
That said, do not use stream.eof() to control a loop reading values! It does not work. Instead, always try to read from a stream and then verify if the read was successful, e.g.:
while (file.getline(line, sizeof(line))) {
// ...
}
Personally, I wouldn't read into a fixed-size buffer at all and would use a std::string instead:
for (std::string line; std::getline(file, line); ) {
// ...
}
From this point, I would also not use strtok() but rather the members of std::string or suitable algorithms. This way I also wouldn't be led astray into considering it a good idea to store pointers (not to mention that I can't deal with pointers and, thus, my programs don't use them).
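As a sketch of that idea for the header lines, the key/element pair can be pulled out with std::string operations instead of strtok. The exact quoting/parenthesis layout of your format is an assumption on my part, so adjust the delimiters to match your file:

#include <map>
#include <string>

// Sketch: extract key and element from a header line such as "aa" = (sand tile)
void parse_header_line(const std::string& line, std::map<std::string, std::string>& declarations)
{
    std::string::size_type k1 = line.find('"');          // opening quote of the key
    std::string::size_type k2 = line.find('"', k1 + 1);  // closing quote of the key
    std::string::size_type e1 = line.find('(', k2 + 1);  // opening paren of the element
    std::string::size_type e2 = line.find(')', e1 + 1);  // closing paren of the element
    if (k1 == std::string::npos || k2 == std::string::npos ||
        e1 == std::string::npos || e2 == std::string::npos)
        return; // not a declaration line
    std::string key = line.substr(k1 + 1, k2 - k1 - 1);
    std::string element = line.substr(e1 + 1, e2 - e1 - 1);
    declarations.emplace(key, element);
}

With std::getline feeding it line by line, key and element are independent strings, and the map keeps its own copies.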