My users sometimes get alerts with a timestamp (from another system, via email) but no hostname. I want to give them a function that can compare that timestamp against the logs on my system, which is laid out like this:
/home/admin/assets/sagLogs/
This directory contains 705 directories, one named after each host, with names like:
bes-t1sg0120, art-t1sg8479, mmo-t1sg0132, ..., ...
and each directory contains log files named for their creation time:
06-10-19-13-57-46 05-29-19-08-25-50 05-24-19-16-52-02 05-22-19-16-42-52 05-22-19-10-33-06
The format is month-day-year-hour-minute-second. I want to be able to search all of the logs, regardless of hostname, for those whose timestamp is within 5 minutes of the input time.
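For illustration, here is how one of those names breaks down into its fields (plain bash, just decoding; the filename is taken from the list above):

# Illustrative decode of one filename (month-day-year-hour-minute-second)
f="06-10-19-13-57-46"
IFS='-' read -r mon day yr hr min sec <<< "$f"
echo "20${yr}-${mon}-${day} ${hr}:${min}:${sec}"   # prints 2019-06-10 13:57:46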
I've handled the format:
#Times[0] = DAY
#Times[1] = MONTH
#Times[2] = YEAR
#Times[3] = HOUR
#Times[4] = MINUTE
# Split the Day:Month:Year:Hour:Minute input on ":"
IFS=':' read -r -a Times <<< "$1"
# Reorder to match the filename layout: Month-Day-Year-Hour-Minute
stamp="${Times[1]}-${Times[0]}-${Times[2]}-${Times[3]}-${Times[4]}"
So the input comes in the form Day:Month:Year:Hour:Minute; if $1 is 06:05:19:12:30 (May 6th 2019 at 12:30 PM), then $stamp is 05-06-19-12-30.
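To make the 5-minute check numeric instead of string-based, the same fields can be turned into epoch seconds (a sketch that builds on the read above; it assumes GNU date is available and that the two-digit year means 20xx):

# Epoch seconds for the input time (Day:Month:Year:Hour:Minute), assuming GNU date
target=$(date -d "20${Times[2]}-${Times[1]}-${Times[0]} ${Times[3]}:${Times[4]}:00" +%s)
# The window I care about is the 5 minutes leading up to the input time
window_start=$(( target - 300 ))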
So I want to find everything between 05-06-19-12-25 and 05-06-19-12-30, located in one of those ~/assets/sagLogs/* directories.
How can I efficiently grab the files that are within 5 minutes of my stamp? I tried loading every filename into an array while awk'ing out the last portion, but that was totally wrong and the speed was bad. Let me know what you think. Bonus points if you can show me how to get the absolute file path while we're at it!
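For concreteness, this is roughly the brute-force shape I am trying to improve on (an illustrative sketch, not the exact code I ran; it assumes GNU date, that target holds the input time in epoch seconds as in the snippet above, and that readlink -f is available for the absolute paths):

# Illustrative brute force: check every log file under every host directory
cd /home/admin/assets/sagLogs || exit 1
matches=()
for f in */??-??-??-??-??-??; do
    [[ -e $f ]] || continue                       # skip if the glob matched nothing
    IFS='-' read -r mon day yr hr min sec <<< "${f#*/}"
    ts=$(date -d "20${yr}-${mon}-${day} ${hr}:${min}:${sec}" +%s)
    if (( ts >= target - 300 && ts <= target )); then
        matches+=( "$(readlink -f "$f")" )        # absolute path of the match
    fi
done
printf '%s\n' "${matches[@]}"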
ls -1 */{05-06-19-12-2[6-9],05-06-19-12-3[0-1]}
- but then I started struggling with how to programmatically (in a nice shell-scripting way, not by using 300 kB of Java ...) determine what pattern to use... and well... maybe the ls globbing will help you :) – ivanivan Jun 13 '19 at 20:13
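For what it's worth, one way such a pattern could be built programmatically rather than by hand (a sketch only; it assumes GNU date and reuses the target epoch value from the earlier snippet):

# Generate the six minute-prefixes covering target-5min .. target, then glob on them
prefixes=()
for off in 300 240 180 120 60 0; do
    prefixes+=( "$(date -d "@$(( target - off ))" +%m-%d-%y-%H-%M)" )
done
for p in "${prefixes[@]}"; do
    ls -1d */"$p"-* 2>/dev/null    # matches hostdir/MM-DD-YY-HH-MM-SS for that minute
done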