perl - Why does Git.pm on Cygwin complain about 'Out of memory during "large" request'?

I'm getting this error while doing a git svn rebase in Cygwin:

Out of memory during "large" request for 268439552 bytes, total sbrk() is 140652544 bytes at /usr/lib/perl5/site_perl/Git.pm line 898, <GEN1> line 3.

268439552 bytes is 256MB. Cygwin's maximum memory size is set to 1024MB, so I'm guessing that it has a different maximum memory size for Perl?

How can I increase the maximum memory size that perl programs can use?

Update: This is where the error occurs (in Git.pm):

while (1) {
   my $bytesLeft = $size - $bytesRead;
   last unless $bytesLeft;

   my $bytesToRead = $bytesLeft < 1024 ? $bytesLeft : 1024;
   my $read = read($in, $blob, $bytesToRead, $bytesRead); # line 898
   unless (defined($read)) {
      $self->_close_cat_blob();
      throw Error::Simple("in pipe went bad");
   }

   $bytesRead += $read;
}

I've added a print before line 898 to print out $bytesToRead and $bytesRead; the result was 1024 for $bytesToRead and 134220800 for $bytesRead, so it's reading 1024 bytes at a time and has already read 128MB. Perl's read function must be running out of memory and trying to request double its buffer size... is there a way to specify how much memory to request? Or is that implementation dependent?
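
To make the pattern concrete, here is a hypothetical stand-alone version of that loop ('bigfile' is just a placeholder name for a large test file): the 4-argument read with a growing offset appends every 1024-byte chunk to one scalar, so the whole file has to fit in a single buffer that Perl keeps reallocating.

#!/usr/bin/perl
use strict;
use warnings;

# Sketch only: mimic the Git.pm loop by appending each 1024-byte chunk
# to ONE ever-growing scalar via the offset argument of read().
open(my $in, '<', 'bigfile') or die "can't open bigfile: $!\n";
binmode($in);

my $blob      = '';
my $bytesRead = 0;
while (1) {
   my $read = read($in, $blob, 1024, $bytesRead);   # append at offset $bytesRead
   die "read failed: $!\n" unless defined $read;
   last if $read == 0;
   $bytesRead += $read;
   printf "scalar now holds %d bytes\n", length($blob)
      if $bytesRead % (64 * 1024 * 1024) == 0;      # report every 64MB
}
close($in);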

Update 2: While testing memory allocation in Cygwin, this C program's output was 1536MB:

#include <stdio.h>
#include <stdlib.h>

int main() {
   unsigned int bit = 0x40000000, sum = 0;
   char *x;

   /* try successively smaller allocations and add up the ones that succeed */
   while (bit > 4096) {
      x = malloc(bit);
      if (x)
         sum += bit;
      bit >>= 1;
   }
   printf("%08x bytes (%.1fMb)\n", sum, sum / 1024.0 / 1024.0);
   return 0;
}

This Perl program, however, crashed when the file size was greater than 384MB (but succeeded when it was smaller):

open(F, "<400") or die("can't read
");
$size = -s "400";

$read = read(F, $s, $size);

The error is similar:

Out of memory during "large" request for 536875008 bytes, total sbrk() is 217088 bytes at mem.pl line 6.
See Question&Answers more detail:os

与恶龙缠斗过久,自身亦成为恶龙;凝视深渊过久,深渊将回以凝视…
Welcome To Ask or Share your Answers For Others

1 Reply

This problem has been solved in the latest version of msysgit by Gregor Uhlenheuer, and a patch is available. The problem is that in Git.pm the file is read in one go; the solution is to read it in small chunks, as sketched below. I'm not sure whether the fix has made it into any released versions, but it is easy to apply locally.
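
The gist of the chunked, stream-like approach is roughly this (a minimal sketch of the idea, not the actual patch; the file names are placeholders, and the real code in cat_blob() reads from a pipe and writes to whatever output it was given):

#!/usr/bin/perl
use strict;
use warnings;

# Sketch only: copy a large file in 1024-byte chunks so the data never
# has to sit in memory as one giant scalar. 'bigblob' and 'copy.out'
# are placeholder names for illustration.
open(my $in,  '<', 'bigblob')  or die "can't open bigblob: $!\n";
open(my $out, '>', 'copy.out') or die "can't open copy.out: $!\n";
binmode($in);
binmode($out);

my $size      = -s 'bigblob';
my $bytesLeft = $size;
while ($bytesLeft > 0) {
   my $bytesToRead = $bytesLeft < 1024 ? $bytesLeft : 1024;
   my $read = read($in, my $chunk, $bytesToRead);   # small reused buffer, no offset
   die "in pipe went bad\n" unless defined $read;
   last if $read == 0;
   print {$out} $chunk or die "couldn't write: $!\n";
   $bytesLeft -= $read;
}
close($out) or die "close failed: $!\n";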

You need to change C:\Program Files\Git\lib\perl5\site_perl\Git.pm (a change of about 8 lines). Make sure you back it up first.

For the details of what to do, see Git.pm: Use stream-like writing in cat_blob().

The original discussion is Problems with larger files "Out of memory".

