I'm trying to code something in C on my Mac, and I ran into a Segmentation Fault: 11. I've located the problem to be the declaration of a double array of 5 million elements. For instance, the following code gives a Segmentation Fault:
int main() {
    double vals[5000000];   /* ~40 MB of automatic (stack) storage */
    return 0;
}
My first question is: is 8*5000000 bytes = 40 MB too large? I also tried to run this on another machine (Linux), and there it ran smoothly. So the second question is: what determines the memory available to the application? Does it have to do with the available RAM on the machine (my Mac has 16 GB, the Linux machine has 62 GB)? Or does it have something to do with the compiler options (I'm using gcc without any options on both machines, but different versions)?
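For reference, here is a minimal sketch of how one might check the per-process stack size limit that the OS/shell imposes, which is the usual suspect for large local arrays; it assumes a POSIX system (both macOS and Linux qualify) and uses the standard getrlimit call with RLIMIT_STACK, the same limit that ulimit -s reports. The default soft limit is commonly around 8 MB, but the exact values will differ between the two machines:

#include <stdio.h>
#include <sys/resource.h>

int main(void) {
    struct rlimit rl;
    /* RLIMIT_STACK is the per-process stack size limit */
    if (getrlimit(RLIMIT_STACK, &rl) == 0) {
        printf("soft stack limit: %llu bytes\n", (unsigned long long)rl.rlim_cur);
        printf("hard stack limit: %llu bytes\n", (unsigned long long)rl.rlim_max);
    }
    return 0;
}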
Edit: Okay, so I've changed the test code to the following, because in the actual code the variable is not unused:
#include <stdlib.h>
#include <stdio.h>

int main() {
    double vals[5000000];
    vals[0] = 500;
    printf("%lf\n", vals[0]);
    return 0;
}
Also, I compile without any options or optimization flags: gcc test.c.
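For comparison, here is a heap-based variant of the same test, a minimal sketch assuming malloc is acceptable in the real code; a heap allocation of this size is limited by available RAM/virtual memory rather than by the stack limit, so it is one way to confirm that the stack is what's being exhausted:

#include <stdlib.h>
#include <stdio.h>

int main() {
    /* put the 5 million doubles (~40 MB) on the heap instead of the stack */
    double *vals = malloc(5000000 * sizeof *vals);
    if (vals == NULL) {
        fprintf(stderr, "malloc failed\n");
        return 1;
    }
    vals[0] = 500;
    printf("%lf\n", vals[0]);
    free(vals);
    return 0;
}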
(I'm also wondering if the downvoter actually realized I'm asking more than "oh, why am I getting a segmentation fault?" like all the other questions out there.)