
I want a script that processes piped console output, the way grep does.
I want to run it like: cat file.txt | my_script

I tried the script below, but it doesn't print anything.

#!/bin/bash
line=$@
echo $line 
Fisher
  • I don't quite understand what it is you want to do. Do you want to implement grep purely as a shell script? Your script, if you invoke it like you show, does output something: an empty line. – Kusalananda Jul 15 '21 at 08:48
  • How to receive previous command piping output? – Fisher Jul 15 '21 at 09:20
  • Depends on what you want. One can use read to read from stdin. If you want to use the data in another command, which reads from stdin, you can simply call it. – ibuprofen Jul 15 '21 at 09:43
  • In the script: cat /dev/stdin or grep -o '.' /dev/stdin. Do not forget to specify the path to the script: ./my_script – nezabudka Jul 15 '21 at 09:58
  • why? why not just use grep in your script? – cas Jul 15 '21 at 10:50
  • @cas I need to analyze grep result. – Fisher Jul 27 '21 at 09:44
  • @Fisher so pipe the output of grep into a shell function or redirect it into a while read loop. Better yet, why not use awk? Any kind of analysis you might want to do in shell is going to run much faster in awk and be many times easier to write the code for. perl would be good too. shell is pretty much the worst choice when it comes to processing text (it is very good at orchestrating the execution of other programs, though, that's precisely what shell is for). BTW, see Why is using a shell loop to process text considered bad practice? – cas Jul 27 '21 at 10:59
  • @cas Thanks for the suggestion. I want to grep pattern A and pattern B; when pattern A matches, get number x and check if x larger than N; when pattern B matches, get number y and check if y larger than M. Seems not easy to do this with awk, even with perl, it might be too long and not easy to maintain. Also the text I'm processing is not huge. For my usage shell is good enough. – Fisher Jul 27 '21 at 14:50
  • That sounds trivial to do in either awk or perl. e.g. in awk, something like awk '/A/ { x=$1; (if x > N) {do something}}; /B/ { y=$1; if (y > M) {do something else}}' (assuming that both x & y's values come from field 1 of the input). I suggest posting another question asking how to do what you want in awk or perl. The better you can describe what you want to do, with a representative sample of the input and desired output, the better answer you'll get (and it'll probably turn out to be a lot simpler & easier than you thought it would be). Partial code or pseudo-code is good, too. – cas Jul 27 '21 at 18:20
  • @cas Tried your method, but got error. echo "123M abc" | awk '/^[0-9]M/ { x=$1; (if x > 100) {print}}' awk: cmd. line:1: /^[0-9]M/ { x=$1; (if x > 100) {print}} awk: cmd. line:1: ^ syntax error awk: cmd. line:1: /^[0-9]*M/ { x=$1; (if x > 100) {print}} awk: cmd. line:1: ^ syntax error – Fisher Jul 27 '21 at 19:29
  • Sorry, I made a typo with the first if. Write it as if (x > 100), as in the /B/ example. – cas Jul 28 '21 at 03:23
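For reference, a corrected and runnable version of that one-liner (a minimal sketch: the sample input and the threshold 100 come from the comment above, and the + 0 forces a numeric comparison so the trailing M is ignored):

echo "123M abc" | awk '/^[0-9]+M/ { x = $1 + 0; if (x > 100) print "x =", x }'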

2 Answers


You are trying to read from the command-line arguments ($@), while you should be reading from stdin: the pipe attaches the first command's stdout to the second command's stdin. A simple way to do what you want in bash is the read built-in, processing the input line by line as in the example below.

#!/bin/bash
while read line
do
  echo $line
done

Of course you can do whatever you want instead of echo.
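A quick usage sketch (assuming the script is saved as my_script, as in the question): make it executable and give its path when piping into it, as noted in the comments.

chmod +x my_script
cat file.txt | ./my_script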

glemco
    A single cat as the body of the script would be safer, if you just want to pass the output through. Your code would remove flanking whitespace, multiple whitespace characters between words, and it would potentially also expand filename globbing patterns if these were fed into the script. Additionally, echo may interpret certain escape sequences, like \t and \n etc. – Kusalananda Jul 15 '21 at 09:54

Same idea as @glemco's answer, but this version should be safe for special characters (excluding NUL bytes):

#!/bin/bash
while IFS= read -r line
do
  printf '%s\n' "$line"
done
  • IFS= prevents trimming of leading and trailing whitespace
  • -r prevents backslash escapes from being processed
  • The double quotes around "$line" prevent glob expansion and stop whitespace sequences from being collapsed into a single space
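If the goal is the per-pattern threshold check described in the comments above, the same loop can branch on the line's content. This is only a hypothetical sketch: the A: and B: prefixes and the limits 100 and 200 are made-up placeholders, not taken from the question.

#!/bin/bash
while IFS= read -r line
do
  case $line in
    A:*) x=${line#A:}        # number after the "A:" prefix
         [ "$x" -gt 100 ] && printf 'A over limit: %s\n' "$x" ;;
    B:*) y=${line#B:}        # number after the "B:" prefix
         [ "$y" -gt 200 ] && printf 'B over limit: %s\n' "$y" ;;
  esac
done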
user000001