C++: read a multidimensional array as an integer only for comparison

UPDATE: Considerations: this is on an embedded platform. I am memory-limited and don't want to pull in any extra libraries to perform things like hashing.
I am running a program that uses a multi dimensional array.
uint8_t array[8][32];
This array is constantly updating, and I need to run a comparison to determine whether there is a repetition within the last 3 steps.
It occurred to me that the array is just memory, so rather than iterate through the array and create a temporary comparison array (wasting memory and processor time), why not approach this almost like making a hash of the array? I don't need anything cryptographically unique, so why not just read the memory of the whole array as an integer?
Thoughts? How would I go about addressing the array to read it in as an integer, which I could store in another simple array for a much easier comparison?
unsigned long arrayTemp[3];

Related

Point to a 3-dimensional array in C++

I have searched on Google but only found answers for single-dimension arrays.
I have a 3-dimensional array storing data to be later processed by a function.
My array looks like this: levelcode[400][20][100]. It stores all the info that the decode-level function needs. I get a stack overflow error immediately.
But how can I point to the entire array to get its values (or how do I pass down the entire array)?
I know I could technically call the function for each existing parameter, but I think it would be faster and look better if the entire array were passed down or accessed through a pointer of some sort.
How can I accomplish this?
I suggest you use a std::vector. It is basically a self-managed, growable array. It stores its data dynamically (on the heap), so you will be using the full system memory instead of the small amount the program is given for automatic objects (the stack). With your levelcode[400][20][100] you have 800,000 elements. If the array is of type int, you would more than likely need 3.2 MB of space for it. Typically this is larger than the space provided to the program and will cause a stack overflow.
I would suggest you use a single-dimension vector and then use math to fake the 3 dimensions. This also makes the data more cache-friendly, as nested vectors do not have to keep each dimension located right next to the others the way a multi-dimensional array does.
So instead of having a
std::vector<std::vector<std::vector<some_type>>> name{DIM1, std::vector<std::vector<some_type>>{DIM2, std::vector<some_type>{DIM3}}};
and using it like
name[x][y][z]
We could have a
std::vector<some_type> name{DIM1 * DIM2 * DIM3};
and then you can access the elements with
name[x*DIM2*DIM3 + y*DIM3 + z]

Is dynamic memory deletion also possible in arrays?

Suppose I've declared a character array and take a string from the user as follows:
char s[100000];
std::cin >> s;
Now say the user has entered the string "Program". My character array will be as follows:
'P''r''o''g''r''a''m''\0'......(99992 remaining indices with no/junk values)
Is there a way to free the space occupied by those 99992 indices? Similarly, if I have an integer array of size 100000 and I'm using only the first 10 indices at run time, is there a way to resize the array while my program is running? I know we can use vectors for this purpose, but is it somehow possible with arrays? For integer arrays, I know we can allocate them dynamically and choose the size as per our requirement, but say I have an array of 10 integers as follows:
1 2 3 4 5 6 7 8 9 10
Now I want to use only the first 9 indices and want to, in a sense, delete the 10th index. In other words, along with dynamic allocation, is dynamic deletion also possible with arrays?
EDIT:
I know this is possible using the STL, but I want to know if we can do the same thing with plain arrays.
No.
If you have arrays defined with a fixed size, you cannot release part of those arrays at run-time. Use a dynamically allocated array of some sort — probably a string or vector<int> for your two example arrays respectively, though a vector<char> might also work sufficiently well for you.
When you write:
char s[100000];
You are telling the compiler to reserve 100000 bytes on the program stack.
However, when you reserve memory dynamically:
char *s = new char[100000];
You are asking the system to reserve 100000 bytes on the heap, so you can handle that memory as you want and even tell the system to free it as a resource.
You can't free memory on the stack until your local context finishes, for example when you exit the function where you declared char s[100000].
Check this question:
What and where are the stack and heap?
std::string is implemented using dynamic memory allocation on the heap, and that is why it allows you to reduce its size.
That is not possible.
You may wrap your user-input capturing in a subroutine that allocates stack space and then allocates heap memory at the actual required length.
You are confused over when to use static allocation and when to use dynamic allocation.
Static allocation is used when the maximum number of items is known in advance, at compile-time.
Dynamic allocation is used when the number of items is unknown until run-time.
There exists no other case than the two above. You cannot mix them and it wouldn't make sense to do so.
The only case where you should allocate a static array char s[100000]; is the case where you know, at some point, that there will be 100000 items that the program needs to handle.
You designed your program to handle the worst case of 100000 items. It must still be able to handle that many. If the program needs to have an array of variable, unknown size, you should have used dynamic allocation.
If we ignore that C++ exists, then what you would have done in C is this:
char* s = malloc(sizeof(*s) * 100000);
...
s = realloc(s, some_strlength);
Please note that huge arrays allocated on the stack are bad practice on many operating systems. So you might have to declare the 100000-byte array on the heap anyway, even though you won't resize it, simply because there is likely not enough stack space in your process for large, bulky variables like that.
(Also, because of the way C++ is designed, std::string and std::vector etc are always implemented with dynamic memory internally, even if you only use them with one fixed size.)

Dynamical initialization of memory at a given memory address

OK, this might seem odd, but please bear with me; I'm just a beginner. Over the past few days I have been trying to develop a general-purpose hash function for maintaining an associative array with a hash table, using all the best parts of hash functions like RS, JS, ELF, etc. to reduce hash collisions. But now the problem is that even to avoid an appreciable amount of collisions, I will have to use unsigned long values with at least 6 digits.
Let's just assume I need to map names of students to their marks, so I maintain an integer array for the marks.
Now back to my question.
The idea I thought of was to use these values as the few lower-order bits of an actual memory address and then dynamically allocate memory large enough to store an integer for the marks obtained. This process is repeated for each new value added.
Now, assuming I somehow managed to avoid all memory locations that would be reserved by the OS:
Is there any viable way in C++ to dynamically allocate memory at an address we choose, rather than letting the new operator pick the location and return a pointer to it? (I'm using gcc.)
It is platform-dependent. On Unix systems you might try mmap(); the Windows equivalent is VirtualAlloc(). But there is no guarantee, since the address might already be in use.

C++: Dynamically growing 2d array

I have the following situation solved with a vector, but one of my older colleagues told me in a discussion that it would be much faster with an array.
I calculate lots (and I mean lots!) of 12-dimensional vectors from lots of audio files and have to store them for processing. I really need all those vectors before I can start my calculation. However, I cannot predict how many audio files there are, and I cannot predict how many vectors are extracted from each one. Therefore I need a structure to hold the vectors dynamically.
So I create a new double array for each vector and push it onto a vector.
I now want to test whether my colleague is right that the calculation can be sped up by also using an array instead of a vector for storage.
vector<double*>* Features = new vector<double*>();
double* feature = new double[12];
// adding elements
Features->push_back(feature);
As far as I know, to create a 2D array dynamically I need to know the row count up front.
double** container = new double*[rows];
container[0] = new double[12];
// and so on..
I only know rows after processing all the audio files, and I don't want to process them twice.
Does anyone have an idea how to solve this and append to such an array, or is it just not possible that way, so I should either use a vector or create my own structure (which I assume may be slower than a vector)?
Unless you have strong reasons not to, I would suggest something like this:
std::vector<std::array<double, 12>> Features;
You get all the memory locality you could want, and all the automagic memory management you need.
You can certainly do this, but it would be much better to do it with std::vector. For dynamic growth of a 2D array, you would have to perform all of these steps:
Create a temporary 2D Array
Allocate memory to it.
Allocate memory to its each component array.
Copy data into its component arrays.
Delete each component array of the original 2D Array.
Delete the 2D Array.
Take new Input.
Add new item to the temporary 2D array.
Create the original 2D Array and allocate memory to it.
Allocate memory to its component arrays.
Copy temporary data into it again.
After doing all this at each step, it is hard to believe that arrays would be any faster. Use std::vector; the answers above explain why.
Using a vector makes the problem easier because growing the data is automatic. Unfortunately, because of how vectors grow, a vector may not be the best solution, given the number of reallocations required for a large data set. On the other hand, if you set the initial size of the vector quite large but only need a small number of 12-element arrays, you have wasted a large amount of memory. If there is some way to guess the required size, you might use that guess to allocate the arrays or to set the vector's initial size.
If you are only going to run the calculation over the data once or twice, then maybe you should consider a map or list. For large data sets these two structures create a memory layout that matches your exact needs and bypass the extra time spent growing arrays. On the other hand, calculations with these data structures will be slower.
I hope these thoughts add some alternative solutions to this discussion.

How to read an array in a binary file in C++

Currently I read arrays in C++ with ifstream, read, and reinterpret_cast, looping over the values. Is it possible to load, for example, an unsigned int array from a binary file in one go, without a loop?
Thank you very much
Yes, simply pass the address of the first element of the array, and the size of the array in bytes:
// Allocate, for example, 47 ints
std::vector<int> numbers(47);
// Read in as many ints as 'numbers' has room for.
inFile.read(reinterpret_cast<char*>(numbers.data()), numbers.size()*sizeof(numbers[0]));
Note: I almost never use raw arrays. If I need a sequence that looks like an array, I use std::vector. If you must use an array, the syntax is very similar.
The ability to read and write binary images is non-portable. You may not be able to re-read the data on another machine, or even on the same machine with a different compiler. But, you have that problem already, with the solution that you are using now.