
I have a command that outputs thousands of lines. I actually need these lines in order to see the progress of the command. But I do not need to see more than the 10 most recent lines.

How can I do that?

I already tried using a pipe and tail. But it did not work. I just got the last ten lines after the original command finished:

whatever command that has too much output | tail -f

It is important that the console does not get cleared or anything like that, because some important information gets printed right before the command with the lengthy output.
An example:

echo "Very important information. MUST BE VISIBLE!"

# This gives me about 10,000 lines of output pretty fast!
# This output should be shrunk down to the most recent 10
tar -cvf "Bckup.tar" "folder to backup/"

# More code

I hope this clears it up.

EDIT:

The problem with multitail is that it takes up the entire screen, so it doesn't work if I have more than just one output (which I do: multiple commands with important information run before this one, and I need to use it multiple times).

Something like running this in screen with just 10 lines to display would be perfect. (I know that does not work).

Imagine there is some output from other commands. Then I call the command with the lengthy output. That output stays in a region that can only display about 10 lines. After the command finishes, more output goes to the console, and it should appear right beneath, like normal output.

Output
Important Information
Other commands output
----------------------------------------
some lines for the tar command





----------------------------------------
More output
....
....

(Just without the dashed lines.)

Thomas
BrainStone
  • I don't quite understand. Doesn't your terminal scroll with the command output so you always see the most recent lines anyway? – slhck Aug 25 '13 at 19:54
  • @slhck Yes, that's true. But I don't see the lines that were printed before the command with the lengthy output. –  Aug 25 '13 at 19:58
  • What command or tool produces that "very important information"? Is it a literal echo or a more complicated command? – Joseph R. Aug 25 '13 at 23:12
  • how about whatever command that has too much output | tee file-with-thousands-of-lines | tail then less file-with-thousands-of-lines – Skaperen May 05 '15 at 10:01
  • You can do what was asked using split-screens with tmux or screen. None of the suggested answers actually addressed the question. – Thomas Dickey Oct 01 '16 at 01:34

5 Answers


There is an escape sequence for ECMA-48 (vt100-ish) terminals that restricts scrolling to a subset of lines:

CSI top ; bottom r

Here's a demonstration

# Start with the screen clean and cursor on line 1
clear

# print the non-scrolling banner line
echo Hello world

# set up the scrolling region to lines 2 through 11 and position the
# cursor at line 2
echo -n '^[[2;11r^[[2H'

# Run a command that will produce more than 10 lines, with a slight delay
# so it doesn't happen so fast you can't see it scrolling
perl -MTime::HiRes=sleep -le 'for(1..100) { print; sleep 0.05; }'

# Restore normal scrolling and put cursor on line 12
echo -n '^[[r^[[12H'

Note: All instances of ^[ in the above must be actual escape characters in the script. In vi, for example, they are entered in insert mode with Ctrl-V followed by Esc.
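If entering literal escape characters is inconvenient, printf understands \033, so the same sequences can be generated without any special editing. A sketch wrapping them in two shell functions (the function names are my own, not standard commands):

```shell
#!/bin/sh
# Limit scrolling to rows $1 through $2 (DECSTBM, the CSI top;bottom r
# sequence described above) and move the cursor to row $1.
set_scroll_region() {
    printf '\033[%d;%dr\033[%dH' "$1" "$2" "$1"
}

# Reset the scroll region to the full screen (bare CSI r) and move the
# cursor to row $1.
reset_scroll_region() {
    printf '\033[r\033[%dH' "$1"
}
```

The demonstration then becomes `set_scroll_region 2 11`, running the noisy command, and finally `reset_scroll_region 12`.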

  • This looks pretty awesome! I'm going to try this later. Is it possible to do this without knowing how many lines have already been displayed? And how do I write them in something like a text editor on Windows? Just copy and paste from this snippet? – BrainStone Aug 26 '13 at 02:04
  • Hence my suggestion to use curses+perl (to keep it as simple as possible AND to attempt to make it terminal-unspecific - by utilizing curses to do the work of that...) NOTE, it may work on Wumpus's terminal, but may not work on yours - or if the command is run from a console, or Gnome Terminal, or via PuTTY and so forth. – Drav Sloan Aug 26 '13 at 02:05
  • Just tested it! If you can get it to work without knowing the number of lines, this would be awesome! – BrainStone Aug 26 '13 at 02:16
  • 1
    You can use echo -e and \e as well, which is supported in answers here. E.g. echo -en '\e[2;11r\e[2H' – Michael Mrozek Aug 26 '13 at 04:54

Multitail is a tail -f on steroids. Its abilities include splitting the screen and showing multiple files or commands.

If the important lines come from a file:

tar -cvf "Bckup.tar" "folder to backup/" | multitail important.txt -j

If the important lines are the output of a command:

tar -cvf "Bckup.tar" "folder to backup/" | multitail -l 'show-stuff --important' -j
  • This is pretty good. Not exactly what I want, but better than every other answer. This doesn't really work out since I want to use this multiple times and I still want to be able to see other stuff outputted before (more than 1 line...). – BrainStone Aug 26 '13 at 01:55

If I understand correctly BrainStone is asking for a "scrolling window".

You can achieve that with Curses and perl:

#! /usr/bin/perl

use strict;
use warnings;
use Curses;

$SIG{INT} = sub { endwin; exit(0); };

initscr();

my $w = newwin(11, COLS(), 10, 1);
$w->scrollok(1);

my $a=0;
while(<>) {
    $a++ if($a < 10);
    $w->addstr($a, 0, $_);
    $w->refresh();
}
endwin;

This will create a "scrolling window" at line 10, 10 lines long, which will display the output of a piped command. You will have to alter it to suit your needs, but the basic premise is there. You might want to handle all your shell commands inside perl and have it all in one script. Or you could, for example, use the command arguments to pass the Very important information. MUST BE VISIBLE! string, or maybe a file name that contains the important information, and display that before the scrolling window is created.

I've done a more in depth version on pastebin, which handles a few command options (such as size of scroll window, number of lines in the terminal to offset the window and a "header file" to display before the output). Details of the command options and some additional comments are provided.

Example

Using the code above as is: put it into a file named scroller.pl and set the file's permissions so it's executable:

$ chmod +x scroller.pl

You can then use it to show the output from a long running command like so:

$ seq 10000 | scroller.pl

[screenshot of scroller.pl running]

This version will then exit when the output being piped into scroller.pl is complete, and return you to your normal shell setup.

Drav Sloan
  • I have basically no idea how to use perl. So this does not look like an option to me. And I'd also like to keep things simple. – BrainStone Aug 26 '13 at 01:52
  • You can place that code in a file, called say "scroller.pl", chmod it with chmod 750 scroller.pl and then you can do tar -zcvf mybackup.tar backupdir | scroller.pl - you don't necessarily have to learn perl to have to use it :) It's just another "tool" you can use in your scripts. – Drav Sloan Aug 26 '13 at 02:02
  • Still I'd like to keep it in one file. If nothing works I will use your solution. It's not that i don't like your solution, it just does not really fit my needs. – BrainStone Aug 26 '13 at 02:08
  • @BrainStone - how is this any more complicated than Wumpus' solution? This is much more portable and will work by dumping the code in a single file and making it executable. It's about as easy as it gets... – slm Aug 26 '13 at 04:06
  • @DravSloan - you might want to show exactly where in the code to make the modifications. This is one of the stronger answers here, that would help to make it stand out. – slm Aug 26 '13 at 04:11
  • @slm I've pastebin'ed it as it ended up rather large after I started playing with it - edited answer to make reference to it :) – Drav Sloan Aug 26 '13 at 05:49
  • It is more complicated because it requires another file. This simply does not fit my needs. If I can't get Wumpus's solution to work I will use this one. – BrainStone Aug 26 '13 at 10:27
  • There is another problem: I don't have perl on the machine. Since this is a remote server and I am just a normal user, I cannot install it. – BrainStone Aug 27 '13 at 03:22

Here's a simple bash script you can use to monitor the output of your command (the command in this example is rpm -qa) line by line with a manually defined sleep timer. It also appends the output to a temp file for reference after the script completes. As for cycling through a 10-line buffer, I'm unable to think of a solution at this point.

#!/bin/bash

syntax="rpm -qa"
echo "Using: $syntax"
# Read the command's output into an array, one element per line
# (an unquoted $(...) would split on every whitespace, not per line)
mapfile -t lines < <($syntax)
for line in "${lines[@]}"
 do
  echo "$line" >> /tmp/temp.file
  # \033[0K clears to end of line; \r moves back so the next line overwrites
  echo -ne "OUTPUT: $line\033[0K\r"
  sleep 0.2
done
echo


one.time
  • I do not want to control the number of lines. I want it to work. I just get the output after the original output finished. –  Aug 25 '13 at 19:43
  • I updated the answer to reflect your more specific clarification. You can tail stdout as reflected in my updated answer.

    Example:

    commands >> text.txt | tail -f -n 10 text.txt

    –  Aug 25 '13 at 20:01
  • I also had this idea, but this won't work either! tail -f will not terminate automatically. –  Aug 25 '13 at 20:04
  • 3
    -1. There is no point in using -f if you cat since the output will never change, -f is for monitoring files/text streams. I guess you meant tail -f -n 35 /var/log/messages but that still shows new lines as they come in without deleting the previous ones. Also, commands >> text.txt | something_else makes no sense, you should use && or ; to separate commands, | is for piping to another command and will not work if you are sending the output of the left operand to a file. – terdon Aug 25 '13 at 20:07
  • @terdon. Thanks for pointing out my mistakes. I updated the answer to reflect the correct syntax after testing it, though this doesn't provide a solution to the question. –  Aug 25 '13 at 20:25
  • @terdon: It appears so! Thanks again for the constructive criticism. –  Aug 25 '13 at 21:56
  • @BrainStone: I updated my answer with a simple bash script that you can use to evaluate your output line-by-line without garbled text. –  Aug 25 '13 at 22:12

If you always want to see just the last N lines, I don't think you can avoid using a temporary file. Something like this:

while true; do sleep 1; date +%s >> temp_file; done

Then, in another terminal, run

while true; do clear; tail -n 10 temp_file;sleep 1; done

That last command will i) clear the terminal, ii) print the last 10 lines of the temp file and iii) wait for one second. The result will be a continuously updating dump of the current last 10 lines of the temp file.
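The two commands can also be combined into a single script: start the producer in the background and run the clear-and-tail loop until it exits. A sketch, assuming a bash-like shell (the `producer` function here is a stand-in for the real long-running command):

```shell
#!/bin/bash
# Stand-in for the long-running command; replace with the real one.
producer() { for i in $(seq 1 30); do date +%s; sleep 0.05; done; }

producer > temp_file &   # run in the background, logging to the temp file
pid=$!

# Redraw the last 10 lines until the background job finishes
while kill -0 "$pid" 2>/dev/null; do
    clear
    tail -n 10 temp_file
    sleep 0.2
done
wait "$pid"
```

Note this still clears the screen on every redraw, so it shares the limitation discussed above.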

As far as I know, there is no way of doing this while keeping the original command in sight in a single terminal.


I can't seem to get this to work for multiple lines, but this works for single lines:

#!/bin/bash

echo "Very important information. MUST BE VISIBLE!"
# Start the noisy command in the background, logging to a temp file
tar -cvf "Bckup.tar" "folder to backup/" >tmp_file 2>&1 &
pid=$!
# Overwrite a single status line until tar finishes
while kill -0 "$pid" 2>/dev/null; do
    output=$(tail -n 1 tmp_file)
    echo -ne "$output\r"
done
echo

It does result in artifacts when the current line is shorter than the last one so it really is not perfect but you might be able to improve on it.

terdon
  • 1
  • This isn't a good solution either. The command is called during a script, so I can't just open another terminal or even clear the terminal. That would make the whole thing pointless since I do care what happened before the command. –  Aug 25 '13 at 20:21
  • @BrainStone please edit your question to explain your requirements. I am still not clear on why you can't just let tail -f scroll freely. Unless you specifically state what you need (clarify your answer to slhck's comment, maybe add an example) we cannot give you a satisfactory answer. If it is a script that is printing all these lines, you could do something clever with \r. – terdon Aug 25 '13 at 20:23
  • I updated my question and provided an example –  Aug 25 '13 at 20:41