Strange C++ 2D array initialization [duplicate] - c++

#include <iostream>
using namespace std;

int main() {
    int rows = 10;
    int cols = 9;
    int opt[rows][cols] = {0};
    for (int i = 0; i < rows; ++i) {
        for (int j = 0; j < cols; ++j) {
            std::cout << opt[i][j] << " ";
        }
        std::cout << "\n";
    }
    return 0;
}
Output:
0 32767 1887606704 10943 232234400 32767 1874154647 10943 -1
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0
I'm using gcc 6.3 on https://www.codechef.com/ide
I'm expecting the first row to be all zeros. Shouldn't that be the case?
EDIT: I tested with const variables for rows and cols, and then the array was initialized to all zeroes. I feel this should produce a compile error instead of exhibiting this incorrect (and potentially dangerous) behavior.

If we look at the gcc 4.9 release notes, it looks like gcc added support for initializing VLAs with the expectation that VLAs would be supported in a future version of C++:
G++ supports C++1y variable length arrays. G++ has supported GNU/C99-style VLAs for a long time, but now additionally supports initializers and lambda capture by reference. In C++1y mode G++ will complain about VLA uses that are not permitted by the draft standard, such as forming a pointer to VLA type or applying sizeof to a VLA variable. Note that it now appears that VLAs will not be part of C++14, but will be part of a separate document and then perhaps C++17.
We can see live that gcc before 4.9 complains that we can't initialize a VLA:
error: variable-sized object 'opt' may not be initialized
int opt[rows][cols] = {0};
^
but in 4.9.1 and later it stops complaining, and those versions do not have the bug we see in more recent releases.
So this looks like a regression.
Note that clang refuses to allow initialization of a VLA (which it supports as an extension); see a live example. This makes sense, since C99 does not allow initialization of VLAs:
The type of the entity to be initialized shall be an array of unknown size or an object type that is not a variable length array type.
The gcc bug report Bug 69517 - SEGV on a VLA with excess initializer elements has a comment that provides some background on this feature:
(In reply to Jakub Jelinek from comment #16)
The bug here is in G++ accepting a VLA initializer with more elements than there is room for in the VLA, and then trashing the stack at runtime with the extra elements. It is a regression with respect to GCC 4.9.3 which implements C++ VLAs as specified in n3639 (http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3639.html). This is documented in GCC 4.9 changes (https://gcc.gnu.org/gcc-4.9/changes.html) which highlights the feature using the following example:
void f(int n) {
    int a[n] = { 1, 2, 3 }; // throws std::bad_array_length if n < 3
    ...
VLAs were subsequently removed from C++, and also partially (but not completely) removed from G++, which causes C++ programs developed and tested with G++ 4.9 to break when ported to a later version.
C++ VLAs will be safer to use with the patch referenced in comment #9. The patch had to be reverted from GCC 6.0 because it caused problems in Java. Java has since been removed and I plan/hope to resubmit the patch for GCC 8. (I wanted to do it for GCC 7 but didn't get to it.)

This appears to be a GCC bug, and the desired behavior is most likely that this shouldn't compile. C99 supports variable-length arrays but refuses to initialize them: a C initializer needs a complete type at compile time, and the type of a variable-length array can't be complete at compile time.
In GCC, C++ gets variable-length arrays as an extension of its C99 support, so the behavior of variable-length array initialization in C++ isn't established by any standard. Clang refuses to initialize a variable-length array even in C++.
Note that even = {0} is technically somewhat dangerous (if it worked at all): if rows or cols is 0, the single initializer would overflow the array. std::memset is probably your best option.

I posted this question to understand what's wrong with my code or with gcc. But this is how I would do it in C++: use vectors instead of arrays when you need variable-length arrays.
#include <iostream>
#include <vector>

int main() {
    int rows = 10;
    int cols = 9;
    std::vector<std::vector<int>> opt(rows, std::vector<int>(cols, 0));
    for (int i = 0; i < rows; ++i) {
        for (int j = 0; j < cols; ++j) {
            std::cout << opt[i][j] << " ";
        }
        std::cout << "\n";
    }
    return 0;
}
Output:
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0

Related

Why memset works to initialize 2D array in C++ with zero but failed to set when declared like this `array[n][k+1]={0}`? [duplicate]

initializing 2d c-array with 0 not correctly in C++ [duplicate]

Incorrect values when initializing a 2D array to 0 in gcc

Strange numbers when array is not initialized in C++ [duplicate]

This question already has answers here:
What happens to a declared, uninitialized variable in C? Does it have a value?
(9 answers)
Closed 7 years ago.
EDIT: ^^^ the "duplicate" doesn't mention arrays at all
EDIT2: Hold on, that's in C, not C++ — isn't there a difference between the two languages?!
This question has been bugging me for some time lately. Google search revealed nothing.
So I have this snippet of example C++ code:
int factors[100]; /* note this is not initialized */
int number = /* less than 100 */ 10;
for (int i = 0; i < number; i++) {
    factors[i] = 1;
}
for (int i = 0; i < 100; i++) {
    std::cout << factors[i] << std::endl;
}
The output is:
1
1
1
1
1
1
1
1
1
1
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
0
1640775680
32767
114023525
624860211
174064279
236792104
-1027703263
587262357
1599638600
32767
17
0
1
0
6778984
1
1640935824
32767
1599638352
32767
1640780406
32767
1599638384
32767
1599638384
32767
1
0
1599638408
32767
6778880
1
1640776264
32767
1599638424
32767
0
0
0
0
0
0
0
0
0
0
Why isn't it either just ten 1s, or ten 1s followed by ninety 0s? And why are there so many seemingly random numbers (maybe related to powers of 2?)? I think it may have something to do with memory allocation, but I'm just a beginner and haven't gotten into this stuff yet.
If you have the declaration
int factors[100]; /* note this is not initialized */
there are two situations:
When declared as a global (file-scope) variable, the entire array is zero-initialized before your program starts.
When declared as a local (function-scope) variable, the array is not initialized and contains unpredictable values.
Uninitialized arrays are filled with garbage values: whatever happened to be at that memory location before the program asked for it, since the memory itself has always existed. The output is often 0 because operating systems typically hand out zeroed pages of fresh memory, but neither C nor C++ guarantees this, hence the varied output.
That's just the thing: if you don't initialize your arrays, C++ does not guarantee they will be blank.

conway game error with 2d array manipulation

So I'm working on the Game of Life, and so far I have come up with this: http://ideone.com/QG4tsS . I'm not sure if I am on the right track. I have a function putting out random values to test my code, but nothing seems to happen. I suspect my problem lies in the following code:
int sum = 0;
for (int k = (i - 1); k <= (i + 1); k++) {
    for (int l = (j - 1); l <= (j + 1); l++) {
        sum += currentGen[k][l];
    }
}
return sum;
My result is a 2D array with all 0's, but shouldn't I start to see some changes and patterns forming? I get a single 1 and the rest are 0.
Output
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 1
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
0 0 0 0
I provide this answer based on the code you posted at http://ideone.com/QG4tsS . You really should consider adding that code to your original question, so that future readers who find this on StackOverflow have the full context.
Your RandomCells function only sets cells to 1 when they meet the RANDOM threshold; it doesn't clear them to 0 otherwise. Once you fix that, you'll be all set, i.e.:
void RandomCells(int currentGen[][CELLY]) {
    for (int i = 0; i < CELLX; i++) {
        for (int j = 0; j < CELLY; j++) {
            if (rand() % 100 + 1 < RANDOM) {
                currentGen[i][j] = 1;
            } else {
                currentGen[i][j] = 0;
            }
        }
    }
}
Without that else clause, I was seeing initial generations that looked like this:
0 0 4196155 1
1813657216 1 4197653 0
-870503576 1 4197584 1
Clearly, most of those cells were non-zero, and so Conway's Life algorithm would map them to 0 in the next generation because of "crowding".
The reason currentGen was filled with such 'random' values is that it was declared as an automatic variable in main. Automatic variables are not initialized to any particular value; you need to initialize them yourself, either by modifying your algorithm (as I did above) or by adding explicit code to initialize the structure.
This differs from file-scope variables, which C and C++ define as initialized-to-zero on program start if they don't have initializers or default constructors. (Pedants will point out that even that has caveats.)
Once you make the required fixes, to truly see Conway's Life, you'll need to set CELLX and CELLY to larger values...