When I pipe large-ish output through unbuffer, all I end up getting is an endless stream of the bell character (U+0007).

See here: a 1631-character string of arbitrary content piped through unbuffer at the end of the pipeline. All I get back is the bell character.

echo "third messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird message messagethird messagethird messagethird messagethird messagethird messagethird" |  unbuffer -p sed 's/$/kangaroo/'

It seems 1023 characters is the limit of what I can pipe to unbuffer before the output turns into nothing but bell characters. If so, how can I chunk messages before piping? I guess I need to formulate some kind of loop that reads 1023 bytes at a time as part of the pipeline, but I'm not sure how.
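Something along these lines is what I have in mind, but I haven't verified it and I'm not sure it's the right approach; it's just a sketch. It uses bash's read -N to pull stdin in 1023-character chunks and re-emits each chunk on its own line, which means it inserts newlines that weren't in the original data (some_command is just a placeholder for whatever actually produces the output):

chunk() {
  local buf
  # Read up to 1023 characters per iteration; the || keeps a short
  # final chunk that hits EOF before filling the buffer.
  while IFS= read -r -N 1023 buf || [ -n "$buf" ]; do
    printf '%s\n' "$buf"
  done
}

some_command | chunk | unbuffer -p sed 's/$/kangaroo/'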

Edit: apparently I can pipe more than 1023 characters as long as there's a newline at least every 1023 characters. That doesn't really help my situation, though, since I can't control where (or whether) newlines appear in the output.
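Unless, I suppose, I insert the newlines myself. If it really is about line length, I assume something like fold would sidestep the limit by breaking the stream into lines of at most 1023 bytes, though again that alters the data, and sed would then append kangaroo to every folded chunk rather than to each original line (some_command is again a placeholder):

some_command | fold -b -w 1023 | unbuffer -p sed 's/$/kangaroo/'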

  • Is it strings of 1023 characters, or lines of 1023 characters (plus newline)? Could you try again with a line break in the middle and [edit] the results in? – Michael Homer May 27 '19 at 01:12
  • The echo output is unpredictable; I can't control whether it has newlines or is just one 30,000-character output on a single line. That being said, you have correctly observed that it is lines. It seems that as long as there's a newline every 1023 characters, it has no problem. Why would that be? – temporary_user_name May 27 '19 at 01:23
  • I've looked at the unbuffer source code and I can't see why this should happen - it doesn't for me on this machine, but it does on a local macOS machine with identical unbuffer code. getconf LINE_MAX says 2048 on both, so it sounds like it shouldn't be that the line is too long to be considered part of a text file, but it could be an internal fixed-size buffer inside Tcl or perhaps the TTY subsystem. I don't have a solution, but hopefully that helps someone track it down. – Michael Homer May 27 '19 at 03:35
  • Wow, thanks for going to that effort. Appreciate the insight! – temporary_user_name May 27 '19 at 03:37
  • See "Why does unbuffer -p mangle its input?". I'd suggest closing this one as a duplicate, as the other one is more generic and covers more ground. – Stéphane Chazelas Oct 29 '23 at 20:29

0 Answers