I have written a very simple script that backs up my files and databases daily. This is the script:
mongodump --db DB
name=$( date +%F_%H_%M_%S )
tar zvcf "/backup/$name-DB-db.tar.gz" dump/
rm -rf dump
name=$( date +%F_%H_%M_%S )
tar zvcf "/backup/$name-FILES-files.tar.gz" /home/NAME/
It makes two tar.gz files each day, so after a week there will be 14 tar.gz files.
Suppose ls -lh shows the following output:
2021-06-19_16_02_00-FILES-db.tar.gz
2021-06-19_17_02_00-FILES-db.tar.gz
2021-06-19_16_02_00-FILES-files.tar.gz
2021-06-19_17_02_00-FILES-files.tar.gz
2021-06-19_16_02_05-DB-db.tar.gz
2021-06-19_17_02_05-DB-db.tar.gz
2021-06-19_16_02_08-DB-files.tar.gz
2021-06-19_17_02_08-DB-files.tar.gz
What I want is:
If more than one *FILES-db.tar.gz file exists, remove the old ones and keep only the newest; otherwise, skip.
If more than one *DB-db.tar.gz file exists, remove the old ones and keep only the newest; otherwise, skip.
The result of ls -lh after that should be something like this:
2021-06-19_17_02_00-FILES-db.tar.gz
2021-06-19_17_02_00-FILES-files.tar.gz
2021-06-19_17_02_05-DB-db.tar.gz
2021-06-19_17_02_08-DB-files.tar.gz
What function or commands should I use in my bash script?
I think it's best to run this cleanup at the end of my bash script, after the backup is done, but if there are better approaches or solutions, I'm eager to hear them.
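One way to do this in plain bash is a small helper that expands each glob into an array and deletes everything except the last element. This is only a sketch: the `prune_old` function name is made up here, and it relies on the fact that the `%F_%H_%M_%S` timestamp prefix makes lexical (glob) order identical to chronological order.

```shell
#!/usr/bin/env bash
shopt -s nullglob   # unmatched globs expand to nothing instead of themselves

# prune_old PATTERN -- keep only the newest file matching PATTERN.
# Assumes the YYYY-MM-DD_HH_MM_SS filename prefix, so that glob
# (lexical) order is the same as chronological order.
prune_old() {
    local files=( $1 )                        # glob expands sorted, oldest first
    (( ${#files[@]} > 1 )) || return 0        # zero or one match: nothing to remove
    rm -f -- "${files[@]:0:${#files[@]}-1}"   # delete all but the last (newest)
}

prune_old '/backup/*FILES-db.tar.gz'
prune_old '/backup/*DB-db.tar.gz'
```

Calling `prune_old` once per pattern at the end of the backup script matches the "check, remove old ones, keep the last one" behavior described above.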
Update 1
/etc/logrotate.d/NAME-files:
/backup/*NAME-files.tar.gz {
daily
missingok
rotate 1
notifempty
}
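To debug why the config in Update 1 does nothing, logrotate's standard `-d` (debug/dry-run) and `-f` (force) flags are useful. This is only a debugging sketch, assuming logrotate is installed and the config path from the update; note that logrotate keeps per-file state (commonly under /var/lib/logrotate), so a plain manual run may skip files it believes were already rotated today.

```shell
# Dry run: print what logrotate *would* do with this config, touching nothing
logrotate -d /etc/logrotate.d/NAME-files

# Force a rotation now, ignoring the "daily" schedule in the state file
logrotate -f /etc/logrotate.d/NAME-files
```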
You could use zsh, where you can use glob qualifiers to select a range of files by modification time - see for example remove oldest files – steeldriver Jun 19 '21 at 14:47

I created a logrotate file, and one of my files is as shown in the edited question. I ran the script again, but it did not remove anything. I also ran logrotate /etc/logrotate.d/NAME-files, but nothing was removed :( – Saeed Jun 19 '21 at 19:16

logrotate is a better idea, since zsh is not installed and I would have to install it (I know it's very simple, but I'll consider that way as plan B) – Saeed Jun 19 '21 at 19:17