I have a question about structures. I was writing some code, and after I added a structure the program always crashed, so I isolated a small part of it and found that struct STUDENT is the cause.
When I declare the array STUDENT students[MAX] in main, the program crashes, and I have no idea why. The program only runs when I:
1. Change subjects2 in STUDENT to not be an array. However, I do need to store more than one subject per student.
2. Declare STUDENT in main not as an array. I need an array, as I have to store a possibly large number of students.
Is there possibly something wrong with my declaration? I kindly ask for some help; thank you in advance.
#include <iostream>
using namespace std;

const int MAX = 100;

enum Grade {HDist, Dist, Credit, Pass, Fail};

struct assessment_Task
{
    char Title_Name[MAX];
    int Weight;
    int Mark;
    double A_Mark;
};

struct SUBJECT
{
    char subject_Code[MAX];
    char subject_Title[MAX];
    int No_Assess_Task;
    assessment_Task AT[MAX];
    int finalMark;
    Grade grade;
};

struct STUDENT
{
    char Name[MAX];
    int ID;
    char Subjects_Taken[2][50];
    SUBJECT subjects2[MAX];
};

int main()
{
    STUDENT students[MAX];
}
As John3136 has answered, you're most likely blowing up the stack, which is what caused the crash.
Arrays that large won't fit in automatic storage, and since you're using C++, I recommend the STL containers, which manage memory for you.
First, replace all your char[] arrays with std::string. It has many handy features in addition to memory management. Even if you really need a C-style string, you can call str.c_str() to get one.
Then, replace all those arrays with std::vector, the well-known dynamic array container in C++. It keeps its elements in dynamically allocated memory and won't blow up the stack the way your current code does.
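For example, the structs from the question might end up looking something like this (a sketch of the suggested refactor; the member names are kept from the original, and the explicit counters go away because the containers know their own size):

#include <string>
#include <vector>

enum Grade {HDist, Dist, Credit, Pass, Fail};

struct assessment_Task
{
    std::string Title_Name;              // grows on the heap as needed
    int Weight;
    int Mark;
    double A_Mark;
};

struct SUBJECT
{
    std::string subject_Code;
    std::string subject_Title;
    std::vector<assessment_Task> AT;     // replaces assessment_Task AT[MAX]; AT.size() replaces No_Assess_Task
    int finalMark;
    Grade grade;
};

struct STUDENT
{
    std::string Name;
    int ID;
    std::vector<std::string> Subjects_Taken;
    std::vector<SUBJECT> subjects2;      // replaces SUBJECT subjects2[MAX]
};

int main()
{
    std::vector<STUDENT> students(100);  // each STUDENT is now small, and the data lives on the heap
}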
Doing some rough calculations based on 32-bit ints, an 8-byte double, no padding etc - basically some simple assumptions:
assessment_Task is about 116 bytes (100 + 4 + 4 + 8)
SUBJECT is about 11,812 bytes (2 × 100 + 4 + 100 × 116 + 4 + 4)
STUDENT is about 1,181,404 bytes (over 1 MB)
Therefore your array of 100 students is over 100 MB.
So that's over 100 MB you are trying to put on the stack. Depending on the OS, your stack size is probably up to 8 MB or so (without special compiler options to raise it). To confirm, try making the array of students a smaller size (e.g. 1).
Basically you need to reduce MAX, or use a different MAX for different parts, to reduce the size. Obviously the best solution is using vectors or similar and dynamically allocating the structs.
See "Is there a limit of stack size of a process in linux" and "C/C++ maximum stack size of program" for more info on stack size.
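If you want to confirm the arithmetic on your own platform, a quick check is to print the sizes directly (a sketch; it assumes the struct definitions from the question are pasted above main):

#include <iostream>
// struct definitions from the question go here

int main()
{
    std::cout << sizeof(assessment_Task) << " bytes per task\n";    // roughly 116
    std::cout << sizeof(SUBJECT) << " bytes per subject\n";         // roughly 11812, plus padding
    std::cout << sizeof(STUDENT) << " bytes per student\n";         // over 1 MB
    std::cout << sizeof(STUDENT) * 100 << " bytes for students[MAX]\n";
}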
Related
I have some complex classes in my Xcode project (below is a generic example), and it seems I have hit some sort of data size limit.
The array sizes I need do not work; if I reduce the array sizes the code works (so there are no programming errors), but they are too small for what I planned.
Reading around the internet, I figured out it must be a problem with stack size, and most of the solutions say "convert your static arrays to dynamic arrays".
But (1) that is not so easy with multidimensional arrays (some have 5 to 10 dimensions, as they monitor multiple independent variables and every combination is possible),
and (2) most of the arrays are nested in several classes, which makes it even worse.
I have already thought of reducing the data:
int instead of long, with some intelligent transposition...
changing the resolution of c (0-100%) into steps of 10% (so [100] reduces to [10])
but on one hand this might jeopardize the overall results, and on the other the project is still at the start, so it will grow over the coming months... this array size problem will come back sooner or later...
Here I have generalized the code, showing a 4-dimensional array (2x 2D).
I guess most professional programs use arrays that are even bigger.
So there must be a way to make this work...
//.h
class StatisticTable
{
public:
    long Array1[100][50];
    long Array2[100][50];
    long Array3[100][140];
};

class Statistic
{
public:
    void WriteStatistic(short Parameter_a, short Parameter_b,
                        short Parameter_c, short Parameter_d);
    short ReadStatistic(short Parameter_a, short Parameter_b,
                        short Parameter_c, short Parameter_d);
private:
    StatisticTable Table[16][8];
};
//.cpp
void Statistic::WriteStatistic(short a, short b, short c, short d)
{
    for (int i = 0; i < d; i++) { Table[a][b].Array1[c][i]++; }
    for (int i = d; i < 50; i++) { Table[a][b].Array2[c][i]++; }
    //write some more stuff
    return;
}
Can you use heap allocation instead of stack allocation?
As suggested, using std::unique_ptr:
auto const ptr = std::make_unique<StatisticTable>(); // heap allocated, and deleted automatically when ptr goes out of scope
I.e.
auto obj = new StatisticTable(); // heap allocation: obj is a pointer to a new StatisticTable on the heap
// code
delete obj; // release heap allocated object
vs.
auto x = StatisticTable(); // stack allocation
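Building on that, the Statistic class itself could keep its big tables on the heap, so that a Statistic object stays cheap to create on the stack (a sketch under that assumption, not code from the original post):

#include <memory>
// StatisticTable as defined in the question

class Statistic
{
public:
    Statistic() : Table(std::make_unique<StatisticTable[]>(16 * 8)) {}
private:
    // 128 StatisticTable objects on the heap, freed automatically;
    // index with Table[a * 8 + b] instead of Table[a][b]
    std::unique_ptr<StatisticTable[]> Table;
};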
I have sometimes seen a segmentation fault during the initialization of an array with a huge size.
For example:
#include <iostream>
#include <string>
using namespace std;

int main()
{
    string h;
    cin >> h;
    int size = h.size();
    cout << size << endl;
    int arr[size][size];   // variable-length array on the stack (a compiler extension)
    cout << arr[0][0] << endl;
    arr[0][0] = 1;
    cout << arr[0][0] << endl;
    return 0;
}
When the user input is a small string, let's say "sample", the program works fine.
When the user input is a big string, with a size of, for example, more than 1500, a segmentation fault occurs during the initialization of the array int arr[size][size].
What can be the issue? Is there any problem with initializing the array like the one above?
I think you are running out of memory with those initializations, causing a stack overflow. I recommend allocating it on the heap or using a std::vector. See here: Segmentation fault on large array sizes
In standard C++ an array's size must be a compile-time constant, i.e. the value of your 'size' variable must be known at compile time (variable-length arrays like yours are a compiler extension).
If you want dynamic storage, use std::vector.
MSDN states that the default stack size on Windows is 1 MB. With 1500 elements in each dimension your array would take up 1500 * 1500 * 4 bytes = 9,000,000 bytes = 8.58 MiB. On Linux it depends on the distribution and settings (this states it to be 8 MB). So either:
1) If you know that there is a limit on the string length, increase the stack size accordingly with the /STACK linker flag on Windows, or as posted in this answer on Linux
2) Allocate the array on the heap - if you don't want to mess around with manual memory management, std::vector or std::unique_ptr can be used as a container
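For instance, the program from the question could be rewritten with one flat, heap-allocated block (a sketch; indexing [i][j] becomes [i * size + j]):

#include <iostream>
#include <string>
#include <vector>

int main()
{
    std::string h;
    std::cin >> h;
    const std::size_t size = h.size();
    std::vector<int> arr(size * size, 0);   // one contiguous block on the heap, zero-initialized
    arr[0 * size + 0] = 1;
    std::cout << arr[0] << '\n';
}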
I guess I'm still not understanding the limitations of C++ containers and arrays. According to this post and this one, it is impossible to store items of dynamic size in an STL vector.
However, with the following code I can dynamically resize an element of a vector, with the results one would expect if it were fine to have items of varying and changing size in a vector.
string test = "TEST";
vector<string> studentsV;
for (int i = 0; i < 5; ++i)
{
studentsV.push_back(test);
}
studentsV[2].resize(100);
for (string s : studentsV)
{
cout << s << "end" << endl;
}
Result:
TESTend
TESTend
TEST
end
TESTend
TESTend
I can resize the string element to any size, and it works fine. I can also do the same with a regular C-style array. So, what is the difference between the above posts and what I am doing, and can you give an example of what "dynamic item size" really means? Apparently I am not understanding it.
A std::string uses dynamic memory to increase the size of the string being stored. This is not what those articles are talking about.
What they mean is that sizeof(std::string) is constant. The actual object representing a std::string will always have the same size, but it may make additional allocations in another part of memory.
A std::vector is really just a friendly wrapper around a dynamically-sized array. The definition of an array in C or C++ is a contiguous block of memory where all elements are of equal size.
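You can see this directly (the printed value is implementation-dependent; 32 bytes is just what one common 64-bit standard library reports):

#include <iostream>
#include <string>

int main()
{
    std::string s = "TEST";
    std::cout << sizeof(s) << '\n';   // e.g. 32: the fixed size of the string object itself
    s.resize(100);                    // the contents grow on the heap...
    std::cout << sizeof(s) << '\n';   // ...but this prints the same number
}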
can you give an example of what "dynamic item size" really means, because apparently I am not understanding.
This is the core of your question.
Namely: if all C++ classes (even ones that manage dynamic memory as part of their implementations) have a fixed and known footprint size via sizeof()...just what sort of thing is it that you can't put in a std::vector?
Since something like a std::string and a std::bitset are classes of different sizes, you couldn't have a vector of [string string bitset string bitset string]. But the type system already wouldn't let you do that. So that can't be what they're talking about.
They're just saying there's no hook for supporting structures like this from the C world:
struct packetheader {
    int id;
    int filename_len;
};

struct packet {
    struct packetheader h;
    char filename[1];   // variable-length tail: the classic "struct hack"
};
You couldn't make a std::vector<packet> and expect to find some parameter to push_back letting you specify a per-item size. You'd lose any data you'd allocated outside of the structure boundary.
So to use something like that, you'd have to do std::vector<packet*> and store pointers.
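A sketch of what that looks like in practice (make_packet is a hypothetical helper, not part of the original answer):

#include <cstdlib>
#include <cstring>
#include <vector>
// struct packetheader and struct packet as defined above

packet* make_packet(int id, const char* name)
{
    std::size_t len = std::strlen(name);
    // one block holding the header plus the variable-length tail;
    // sizeof(packet) already includes filename[1], which holds the '\0'
    packet* p = static_cast<packet*>(std::malloc(sizeof(packet) + len));
    p->h.id = id;
    p->h.filename_len = static_cast<int>(len);
    std::memcpy(p->filename, name, len + 1);
    return p;
}

int main()
{
    std::vector<packet*> v;                 // the vector stores fixed-size pointers
    v.push_back(make_packet(1, "a.txt"));   // the variable-sized data lives elsewhere
    std::free(v.back());
}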
The size of std::string is not dynamic. std::string is probably implemented with a pointer to dynamically allocated memory. This makes sizeof(std::string) a fixed compile-time value, possibly different from the size of the actual string.
I have a simple class called tire. Now I want to dynamically allocate the number of tires for a vehicle when a vehicle object is created. For this, I create an array of tire-class objects with size equal to the number of tires. To check my code, I would like to print the number of objects in the tire-class array.
The question is: is there a function which can check how many elements are in my tire-class array? Can I use the sizeof() function?
Here is the code:
#include <cstdio>

// create a class for the tires:
class TireClass {
public:
    float * profileDepths;
};

// create a class for the vehicle
class vehicle {
public:
    int numberOfTires;
    TireClass * tires;
    int allocateTires();
};

// method to allocate the array of tire objects
int vehicle::allocateTires() {
    tires = new TireClass[numberOfTires];
    return 0;
}

// main function
int main() {
    vehicle audi;
    audi.numberOfTires = 4;
    audi.allocateTires();
    // check if the correct number of tires has been allocated
    printf("The car has %zu tires.", sizeof(audi.tires));
    // free space
    delete [] audi.tires;
    return 0;
}
No, there isn't one. Consider using std::vector, or just store the tire count in some other variable (maybe numberOfTires is already good enough?).
Well, what happens when you run the code? Does it change if you compile in 32 or 64 bit mode, if you have the facility?
What's happening is that you're asking the compiler to tell you the storage size (in bytes) needed to hold the tires variable. This variable has type TyreClass*, so the storage size is that needed for a data pointer: this might be anything, but today it will probably be 4 bytes for a 32-bit system, or 8 bytes for a 64-bit system.
Whilst it's possible to use sizeof to tell you the size of a statically allocated array, it's not possible for dynamic (heap) allocation. The sizeof operator (in C++, at least) works at compile time, whereas dynamically allocating memory is done when your programme runs.
Much better, for all sorts of reasons, would be to use a std::vector<TyreClass> to hold your tyres. You can then easily get the number of tyres stored, and don't have to worry about allocating or deallocating arrays yourself.
(EDIT: Gah, forgive me mixing up english/american spellings of tyre/tire. It's late and I'm tyred.)
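For example, the vehicle class from the question might look like this with a std::vector (a sketch):

#include <iostream>
#include <vector>

class TireClass {
public:
    float * profileDepths;
};

class vehicle {
public:
    std::vector<TireClass> tires;   // replaces numberOfTires, the raw pointer, and allocateTires()
};

int main() {
    vehicle audi;
    audi.tires.resize(4);
    // size() reports the element count, which sizeof cannot do for a heap array
    std::cout << "The car has " << audi.tires.size() << " tires.\n";
    return 0;
}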
I am writing some C++ code, and I need to declare a class having two elements:
class arr {
public:
    long num;
    string str;
};
Now I need to store almost 1,000,000 elements of this class (depending on user input, the number of class objects can vary in the range 1 <= n <= 1000000).
The objects are created dynamically, as:
#include <iostream>
#include <string>
using namespace std;

class arr {
public:
    long num;
    string str;
};

int main(){
    long n, j, i;
    cin >> n;
    arr a[n];   // variable-length array on the stack (a compiler extension)
    // ... rest of the programme
But if the value of n is larger than 100000, the program hangs, though it works fine for values less than 100000. What approach should I try to declare more than 100000 objects in one go? I tried solving the issue with the help of a 2D array, that is, dividing the array into two parts:
arr a[1000][1000];
but this approach is not working for me.
If anybody has any idea, please help me out with this. Thanks in advance.
Just use std::vector:
#include <iostream>
#include <vector>
// class arr as defined in the question

int main(){
    long n;
    std::cin >> n;
    std::vector<arr> a(n);   // elements live on the heap, not the stack
}
What approach should I try to declare more than 100000 objects?
Declare them on the heap, not on the stack. You're declaring 1000000 objects on the stack, and that's far too much. Be careful: there's a stack limit. See:
C/C++ maximum stack size of program
why is stack memory size so limited?
To declare them on the heap, use:
arr *a = new arr[n];   // remember to delete[] a; when finished
That's the same mechanism the vector uses to initialize its elements.
Here is some background on why allocating one too-large object on the stack fails with a segmentation fault, while the same amount of space can be allocated on the stack in many small chunks:
The mechanism is that the system sets up the stack with a small amount of memory allocated, guarded by an inaccessible memory region. Whenever the program accesses this inaccessible region, the hardware alerts the kernel; the kernel analyses the situation, deduces that the program needs more stack memory, grows the stack by a few memory pages, and tells the hardware to continue executing the program normally.
This works perfectly as long as the accessed address is close enough to the top of the stack that the kernel may safely assume the application indeed wanted to draw up a new stack frame, instead of accessing some random memory location due to a bug. In your case, the limit seems to be around 16 MiB.
You should be able to use much more than 16 MiB of stack memory if none of your stack frames is larger than 16 MiB, because then the limit is raised in many small steps, and the system won't think you just accessed a random location.
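A small experiment illustrating the difference (a sketch; the recursion depth is chosen to stay below a typical 8 MiB default limit, and exact behaviour varies by system):

#include <cstdio>

// Each call adds a small frame, so the kernel grows the stack a few pages
// at a time; a single huge frame would jump far past the guard region instead.
int grow(int depth)
{
    volatile char page[4096];             // roughly one page per frame
    page[0] = static_cast<char>(depth);   // touch it so it is not optimized away
    return depth == 0 ? 0 : grow(depth - 1) + page[0];
}

int main()
{
    std::printf("%d\n", grow(1000));      // ~4 MiB in many small steps: fine
    // char huge[32 * 1024 * 1024];       // one 32 MiB frame: likely to segfault
}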