
I have a script which works like this for one file:

./script 0001g.log > output

For two or more files, it works like this:

./script 0001g.log 0002g.log 0003g.log > output

The script takes one special number from each input file and puts it in one output file.

My question: I have 1000 input files, so how can I write a loop to run my script over all of them?

Jeff Schaller
alloppp
  • You're going to have to express many more of the details of what you're trying to accomplish in hopes of getting any assistance. To start with an obvious one, do the "log" files already exist? And if so, why doesn't this solve all your problems?: ./script *.log > output – Theophrastus Jun 21 '16 at 00:14
  • @Theophrastus the only reason I see not to use your solution is that *.log may be expanded over the ARG_MAX limit causing an Argument list too long error. That being said, your approach is the best solution unless OP really faces this issue. – Jedi Jun 21 '16 at 00:44
  • 2
    if it's just ARG_MAX holding you back, then 'find' should solve your problems (or find combined with xargs). something akin to: find . -type f -name '*.log' -exec ./script {} ; – Theophrastus Jun 21 '16 at 00:50
  • 1
    In regards to: The script take one special number from each input file and put it in one output file, I have to wonder what your script actually does. From the current description it sounds like you could be using a grep one-liner instead of a script. – Wildcard Jun 21 '16 at 03:03

3 Answers


If

./script 0001g.log 0002g.log 0003g.log > output

is equivalent to

./script 0001g.log > output
./script 0002g.log >> output
./script 0003g.log >> output

then you can use a loop, or generate the file names with seq:

./script $(seq -f '%04gg.log' 10) > output
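The explicit loop equivalent would look like this (a sketch, assuming the 0001g.log, 0002g.log, ... naming from the question and that each call's result should be appended to the same output file):

```shell
# Run the script once per file, appending each result to "output".
# seq -f '%04gg.log' generates the zero-padded file names.
for f in $(seq -f '%04gg.log' 10); do
    ./script "$f" >> output
done
```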
sam
Bob

If you like, you can put the files into a directory:

/opt/location/source
    /0001g.log
    /0002g.log
    /0003g.log

Then in your bash script you can try the following:

#!/bin/bash

# store path to files
SOURCE="/opt/location/source/"

# loop through files; use a glob rather than parsing ls output,
# so names with spaces are handled correctly
for FILE in "$SOURCE"*g.log; do
    # do what you want with the specific file
    echo "$FILE"
done
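To actually collect the numbers rather than just print the file names, the echo line can be replaced with a call to the script (a sketch; ./script and the output file name are placeholders taken from the question):

```shell
#!/bin/bash

# store path to files
SOURCE="/opt/location/source/"

# run the script on each file and append its result to "output"
for FILE in "$SOURCE"*g.log; do
    ./script "$FILE" >> output
done
```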

You have a few possible solutions:

Simply

$ ./script *g.log >output

... and hope that *g.log doesn't expand to something that makes the command line too long. This is not very robust.

If your script doesn't depend on the number of files given to it, i.e., if the result for each input file can simply be appended to the previous results, then this is another solution:

$ find ./ -type f -name "*g.log" | xargs ./script >output
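If any of the file names might contain spaces or other unusual characters, a null-delimited variant is more robust (assuming GNU find and xargs, which support -print0 and -0):

```shell
# Pass file names NUL-delimited so whitespace in names is safe.
find . -type f -name '*g.log' -print0 | xargs -0 ./script > output
```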

A third solution would be to move the loop into the script itself:

for f in *g.log; do
  # old code using "$f" as file name
done

This does not have the problem with the command-line length restriction, since the glob is expanded by the shell's for loop inside the script rather than passed as arguments to a command.

The invocation of the script would now be

$ ./script >output
Kusalananda