
This question is a fork of this SO question.

Here is the MCVE version:

$ PS1='Parent-$ '
Parent-$ type seq
seq is /usr/bin/seq
Parent-$ bash
$ PS1='Child-$ '
Child-$ for i in $(seq 1000000000); do echo $i; done
bash: xrealloc: .././subst.c:5273: cannot allocate 18446744071562067968 bytes (4299235328 bytes allocated)
Parent-$ seq: write error: Broken pipe

(I have changed PS1 in the parent and child shells just to differentiate them easily.)

Essentially, the child bash hits an out-of-memory error while processing the seq command substitution with a very large count.

This issue is obviously because the $(seq ...) substitution is evaluated first and its output is then used as the input word list for the for loop.

My question is, however: why did it not hit the MAX_ARG_STRLEN limit? Or is it indeed hitting that limit? But if that were the case, the failure should not be an OOM inside bash... right?

One possible reason: bash first evaluates $(...) and keeps the result in memory. Only after that evaluation completes does it form the command line for the actual command - the for... part. But before the first step completes, it hits the OOM error.
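That understanding can be checked at a small scale. A minimal sketch (with the count shrunk so it actually runs) showing that the substitution's entire output is buffered before anything else happens:

```shell
# bash reads ALL of the substitution's stdout into memory before any
# word splitting or command execution takes place.
out=$(seq 1 4)      # the entire output of seq is captured here
echo "${#out}"      # 7: the characters "1\n2\n3\n4" (trailing newline stripped)
```

With seq 1000000000, this capture step alone is what exhausts memory, long before any argument list is built.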

Let me know if this understanding is right.

anishsane
  • AFAIK, the explanation in your penultimate paragraph is correct. MAX_ARG_STRLEN isn't checked until the argument list is actually formed, which as you can see is too late. – rici Jan 12 '15 at 07:29

1 Answer


MAX_ARG_STRLEN only applies when calling out to external programs. "for" is part of bash's syntax, so bash processes it directly. And yes, bash will run the seq command and capture its entire stdout before performing the substitution into the "for" statement.

dataless
  • 1,719
  • /bin/echo $(seq 1000000000) >/dev/null also gave the same crash... – anishsane Jan 12 '15 at 06:11
  • I doubt bash checks MAX_ARG_STRLEN at all; that limit is enforced by the kernel when bash calls exec(). But in order for bash to call exec it would have to allocate the argument list first, and that runs you out of memory. – dataless Jan 12 '15 at 23:54
  • While it might be possible to enhance bash to stop reading after MAX_ARG_STRLEN and give up sooner, that might not be desirable, since those compile-time constants sometimes differ from the run-time limits, and it's probably best to just do what the user asked for and let the operating system stop you if it isn't right. – dataless Jan 13 '15 at 00:00
  • btw, a quick way to fix your original script is '''seq 1000000000 | while read i; do echo $i; done''' – dataless Jan 13 '15 at 00:03
  • ^^ Yes, I am aware of that... while read i; do echo $i; done < <(seq 1000000000) is preferred way... – anishsane Jan 13 '15 at 05:34
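For reference, the streaming fix from the comments at a small scale: seq writes into a pipe and the loop reads one line at a time, so memory stays bounded regardless of the count.

```shell
# Pipe version: constant memory. Caveat: the while loop runs in a
# subshell, so variables it sets are lost afterwards - which is why
# the process-substitution form < <(seq ...) is often preferred in bash.
seq 1 5 | while read -r i; do
  echo "$i"
done
```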