How to declare 1000000 objects of a class in C++ - c++

I am writing a C++ program and need to declare a class with two elements:
class arr{
public:
    long num;
    string str;
};
Now I need to store almost 1,000,000 objects of this class (depending on user input, the number of objects can vary in the range 1 <= n <= 1000000).
The objects are created as follows:
#include <iostream>
#include <string>
using namespace std;
class arr{
public:
    long i;
    string str;
};
int main(){
    long n, j, i;
    cin >> n;
    arr a[n];  // variable-length array on the stack (a compiler extension, not standard C++)
    // ... rest of the program
But if the value of n is larger than 100000 the program hangs, while it works fine for values less than 100000. What approach should I try to declare more than 100000 objects in one go? I tried to solve the issue with a 2D array, i.e. dividing the array into two parts:
arr a[1000][1000];
but this approach is not working for me.
If anybody has any idea, please help me out.
Thanks in advance.

Just use std::vector:
#include <iostream>
#include <string>
#include <vector>

class arr{
public:
    long num;
    std::string str;
};

int main(){
    long n;
    std::cin >> n;
    std::vector<arr> a(n);  // elements are allocated on the heap, default-constructed
}

What approach should I try to declare more than 100000 objects?
Declare them on the heap, not on the stack. You are declaring 1,000,000 objects on the stack, which is far too much. Be careful: there is a stack limit:
C/C++ maximum stack size of program
Why is stack memory size so limited?
To declare them on the heap, use:
arr *a = new arr[n];  // remember to delete[] a; when done
That is the same mechanism std::vector uses to allocate its elements.

Here is some background information on why allocating one very large object on the stack fails with a segmentation fault, while the same amount of space may be allocated on the stack in many small chunks:
The mechanism is that the system sets up the stack with a small amount of memory allocated. This allocation is guarded by an inaccessible memory region. Whenever the program accesses this inaccessible region, the hardware alerts the kernel; the kernel analyses the situation, deduces that the program needs more stack memory, grows the stack by a few memory pages, and tells the hardware to continue executing the program normally.
This works perfectly as long as the accessed address is close enough to the top of the stack, so that the kernel may safely assume that the application indeed wanted to draw up a new stack frame, instead of accessing some random memory location due to a bug. In your case, the limit seems to be around 16 MiB.
You should be able to use much more than 16 MiB of memory if none of your stack frames is larger than 16 MiB, because then the limit will be raised in many small steps, and the system won't think that you just accessed a random location.

Related

Segmentation fault in below program

I was trying to solve the very basic problem SPOJ CANDY.
I am getting a segmentation fault when submitting the solution below, but in Visual Studio it works fine.
I also declared variables with size in mind (sum as long long int) because it can be large.
1) Is it due to the fact that I am declaring the array inside the while loop? Should I declare the array outside the while loop so that every test case uses the same array?
2) Every time the loop runs (for every test case) a new array is created; will that leak, or will the compiler automatically free the memory after every test case? (I know that with dynamic memory allocation we have to free memory explicitly.) Can you tell me in which scope I should declare the variables?
I have the above doubts because a segmentation fault relates to memory access.
#include <iostream>
using namespace std;
int main(){
    while(1){
        int n;
        int arr[10001];
        cin >> n;
        if(n == -1)
            break;
        long long int sum = 0;
        for(int i = 0; i < n; i++){
            int temp;
            cin >> temp;
            sum += temp;
            arr[i] = temp;
        }
        int mean = sum / n;
        if((sum % n) != 0){
            cout << -1 << endl;
            continue;
        }
        int count1 = 0;
        for(int i = 0; i < n; i++){
            if(arr[i] > mean){
                count1 += (arr[i] - mean);
            }
        }
        cout << count1 << endl;
    }
}
Your problem is probably due to the stack allocation of int arr[10001]. This is roughly a 40 kB allocation. Now, "allocation" is the wrong word: it essentially just computes the address of arr by doing something like int *arr = STACK_POINTER - 40004.
Unfortunately, it is common for the initially mapped stack to be as small as 12 kB. This means that the operating system maps 12 kB into memory and sets STACK_POINTER to the top of that memory (assuming the stack grows downward).
So the net effect is that your arr pointer now points beyond the allocated stack -- into unallocated memory -- and the first access throws a segmentation fault. Normally you could fix this by raising the stack limit with ulimit -s, but you do not have control over the judging platform.
You have two options:
use a heap allocation instead: int *arr = new int[10001];. This is not affected by the initial stack size. In a normal program you should clean this up with delete[] arr;, but for a short program like this it is not strictly necessary.
move the declaration of int arr[10001] to file scope. arr will then live in the BSS section, which is initially zeroed. This is also not affected by the initial stack size.

Structures crashing

I have a question about structures. I was writing some code, and after I added a structure the program always crashed, so I isolated a small part of it and found that struct STUDENT is the cause.
When I declare the array STUDENT students[MAX] in main the program crashes, and I have no idea why. The program only runs when I:
1. Change subjects2 in STUDENT to not be an array. However, I do need to store more than one subject per student.
2. Declare STUDENT in main not as an array. I need an array, as I have to store a possibly large number of students.
Is there possibly something wrong with my declaration? I kindly ask for some help, thank you in advance.
#include <iostream>
using namespace std;

const int MAX = 100;

enum Grade {HDist, Dist, Credit, Pass, Fail};

struct assessment_Task
{
    char Title_Name[MAX];
    int Weight;
    int Mark;
    double A_Mark;
};

struct SUBJECT
{
    char subject_Code[MAX];
    char subject_Title[MAX];
    int No_Assess_Task;
    assessment_Task AT[MAX];
    int finalMark;
    Grade grade;
};

struct STUDENT
{
    char Name[MAX];
    int ID;
    char Subjects_Taken[2][50];
    SUBJECT subjects2[MAX];
};

int main()
{
    STUDENT students[MAX];
}
As John3136 has answered, you are most likely blowing the stack, which causes the crash.
You can't usefully keep that many elements in fixed-size arrays, and since you are using C++, I recommend the STL containers, which manage memory for you.
First, replace all your char[] arrays with std::string. It has many handy features in addition to memory management; even if you really need a C-style string, you can call str.c_str() to get one.
Then, replace the fixed-size arrays with std::vector. This is the well-known dynamic array container in C++. It allocates its elements dynamically and won't blow the stack like your current code does.
Doing some rough calculations based on 32-bit ints, no padding, etc. - basically some simple assumptions:
assessment_Task is about 112 bytes
SUBJECT is about 11412 bytes
STUDENT is about 1141404 bytes (over 1 MB)
Therefore your array of 100 students is over 100 MB.
So that's over 100 MB you are trying to put on the stack. Depending on the OS, your stack size is probably up to 8 MB or so (without special compiler options to set it). To confirm, try making the array of students smaller (e.g. 1).
Basically you need to reduce MAX, or use a different MAX for different parts, to reduce the size. Obviously the best solution is using vectors or similar and dynamically allocating the structs.
See Is there a limit of stack size of a process in linux and C/C++ maximum stack size of program for more info on stack size.

XCode: Stack Size Limit on Multidimensional Array

I have some complex classes in my Xcode project (below is a generic example), and it seems I have hit some sort of data size limit.
The array sizes I need do not work; if I reduce the array sizes the code works (so there are no programming errors), but they are too small for what I planned.
Reading through the internet I figured out it must be a problem with stack size, and most of the solutions say "convert your static arrays to dynamic arrays".
But (1) that is not so easy with multidimensional arrays (some with 5 to 10 dimensions, as they monitor multiple independent variables and each combination is possible), and (2) most of the arrays are nested in several classes, making it even worse.
I already thought of reducing the data:
int instead of long, with some intelligent transposition...
change the resolution of c (0-100%) into steps of 10% (so [100] reduces to [10])
But on one hand this might jeopardize the overall results, and on the other the project is still at the start, so it will grow over the next months... this array size problem will come back sooner or later.
Below I generalized the code, showing a 4-dimensional array (2x 2D). I guess most professional programs use arrays that are even bigger, so there must be a way to make this work...
//.h
class StatisticTable
{
public:
    long Array1[100][50];
    long Array2[100][50];
    long Array3[100][140];
};

class Statistic
{
public:
    void WriteStatistic(short Parameter_a, short Parameter_b,
                        short Parameter_c, short Parameter_d);
    short ReadStatistic(short Parameter_a, short Parameter_b,
                        short Parameter_c, short Parameter_d);
private:
    StatisticTable Table[16][8];
};

//.cpp
void Statistic::WriteStatistic(short a, short b, short c, short d)
{
    for (int i = 0; i < d; i++)  { Table[a][b].Array1[c][i]++; }
    for (int i = d; i < 50; i++) { Table[a][b].Array2[c][i]++; }
    // write some more stuff
    return;
}
Can you use heap allocation instead of stack allocation?
As suggested, using std::unique_ptr:
auto table = std::make_unique<StatisticTable>(); // heap allocated, deleted automatically when table goes out of scope
This is equivalent to:
auto obj = new StatisticTable(); // heap allocation: returns a pointer to a new StatisticTable object on the heap
// code
delete obj; // release the heap-allocated object
vs.
auto x = StatisticTable(); // stack allocation

Segmentation fault during the initialization of array

I have sometimes seen a segmentation fault during the initialization of an array with a huge size.
For example:
#include <iostream>
#include <limits>
using namespace std;
int main()
{
    string h;
    cin >> h;
    int size = h.size();
    cout << size << endl;
    int arr[size][size];  // variable-length array (a compiler extension, not standard C++)
    cout << arr[0][0] << endl;
    arr[0][0] = 1;
    cout << arr[0][0] << endl;
    return 0;
}
When the user input is a small string, say "sample", the program works fine.
When the user input is a big string, for example with size > 1500, a segmentation fault occurs during the initialization of the array int arr[size][size];.
What can be the issue? Is there any problem with declaring the array like the one above?
I think you are running out of memory with those initializations, causing a stack overflow. I recommend allocating it on the heap or using a std::vector. See here: Segmentation fault on large array sizes
Note that in standard C++ an array's size must be a compile-time constant, i.e. the value of your 'size' variable must be known at compile time (variable-length arrays are a compiler extension).
If you want dynamic storage, use std::vector.
MSDN states that the default stack size on Windows is 1 MB; with 1500 elements in each dimension your array would take up 1500 * 1500 * 4 bytes = 9,000,000 bytes = 8.58 MB. I am not sure about Linux (this states it to be 8 MB) - I guess it depends on the compiler and distribution. So either:
1) If you know that there is a limit for the string length, increase the stack size accordingly with the /STACK linker flag on Windows, or as posted in this answer on Linux.
2) Allocate the array on the heap - if you don't want to mess around with memory allocations, std::vector or std::unique_ptr can be used as a container.

C++ StackOverflowException initializing struct over 63992

"Process is terminated due to StackOverflowException" is the error I receive when I run the code below. If I change 63993 to 63992 or smaller there are no errors. I would like to initialize the structure to 100,000 or larger.
#include <Windows.h>
#include <vector>
using namespace std;

struct Point
{
    double x;
    double y;
};

int main()
{
    Point dxF4struct[63993]; // 63992 or smaller runs fine; above, stack overflow
    Point dxF4point;
    vector<Point> dxF4storage;
    for (int i = 0; i < 1000; i++) {
        dxF4point.x = i; // arbitrary values
        dxF4point.y = i;
        dxF4storage.push_back(dxF4point);
    }
    for (int i = 0; i < dxF4storage.size(); i++) {
        dxF4struct[i].x = dxF4storage.at(i).x;
        dxF4struct[i].y = dxF4storage.at(i).y;
    }
    Sleep(2000);
    return 0;
}
You are simply running out of stack space - it is not infinite, so you have to take care not to run out.
Three obvious choices:
Use std::vector<Point>.
Use a global variable.
Use dynamic allocation - e.g. Point *dxF4struct = new Point[64000]. Don't forget to call delete[] dxF4struct; at the end.
I listed the above in the order I consider preferable.
[Technically, before someone else points it out: yes, you can increase the stack size, but that just moves the problem up a level. If you keep putting large structures on the stack, you will run out of stack eventually no matter how large you make it.]
Increase the stack size. On Linux, you can use ulimit to query and set the stack size. On Windows, the stack size is part of the executable and can be set during compilation.
If you do not want to change the stack size, allocate the array on the heap using the new operator.
Well, you're getting a stack overflow, so the allocated stack is too small for that much data. You could tell your compiler to allocate more stack space for your executable, but just allocating it on the heap (std::vector, which you are already using) is what I would recommend.
Point dxF4struct[63993]; // if < 63992, runs fine, over, stack overflow
On that line, you are allocating all your Point structs on the stack. I am not sure of the exact stack size, but the default is around 1 MB. Since your struct is 16 bytes and you are allocating 63,993 of them, 16 bytes * 63,993 is right at the 1 MB mark, which (together with the rest of the stack frame) causes a stack overflow (funny, posting about a stack overflow on Stack Overflow...).
So you can either tell your environment to allocate more stack space, or allocate the objects on the heap.
If you allocate your Point array on the heap, you should be able to allocate 100,000 easily (assuming this isn't running on some embedded processor with less than 1 MB of memory):
Point *dxF4struct = new Point[63993];
As a commenter wrote, it is important to know that if you "new" memory on the heap, it is your responsibility to "delete" it. Since this uses the array new[], you need the corresponding array delete[] operator. Modern C++ has smart pointers (e.g. std::unique_ptr<Point[]>) that help manage the lifetime of the array.