
Sorry, maybe I didn't phrase it well enough. By export I didn't mean moving it to an external device, but exporting the directory and file names recursively to a text file. I need the entire tree structure of one huge directory. More precisely, I need to delete every backup file in this huge directory. That's why I want to export every directory and file name to a txt file, so I can search for "back up" and delete them more easily than by manually searching through every directory.

Edd
  • 3
  • What is your understanding of "export"? – Romeo Ninov Jan 06 '20 at 14:13
  • Please update your question to a meaningful question instead of referring to the title of your question. – Lambert Jan 06 '20 at 14:19
  • Sorry, maybe I didn't phrase it well enough. By export I didn't mean move it to an external device, but to export directory and file names recursively in a text file. I would need the entire tree structure of one huge directory. More precisely, I need to delete every back up file in this huge directory. That's why I need to export every directory and file in a txt file, so I can search by "back up" and be able to delete them easier than manually searching in every directory. – Edd Jan 06 '20 at 14:25
  • So do you mean you want to get a list of the names of all the files and directories? Or maybe a backup (tarball) of the content of the files and directories? It really isn't clear what it is you actually want. It might help if you included a sample session of you using this big list. – Chris Davies Jan 08 '20 at 20:49

1 Answer


It would appear that the find command, which is available on Solaris, should do the job.

Assuming you are in the directory you want to "catalogue", issuing the command

user@host$ find . > directory_list.txt

would recursively list all files and directories below the current directory and write the output to the file directory_list.txt.

The find command can also search by file name pattern and execute a command such as rm on each match, which accomplishes the automatic removal of backup files you are after, as long as their names follow a predictable pattern. This also avoids parsing the output of ls or find in any script you may write to automate the process, which is usually a bad idea. Examples of such uses can be found on this site (e.g. here), but note that you may have to use -exec rm -f '{}' \; instead of -delete, depending on your find version.
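For illustration only: assuming the backup files end in a suffix such as .bak (a hypothetical pattern; substitute whatever your backup files actually use), you could preview the matches first and then remove them:

user@host$ find . -type f -name '*.bak'

user@host$ find . -type f -name '*.bak' -exec rm -f '{}' \;

The first invocation is a dry run that only prints the matching names; run the second only once you are satisfied the list contains nothing you want to keep.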

AdminBee
  • 22,803
  • Thank you man. It worked! – Edd Jan 06 '20 at 14:42
  • yeah, the problem is there's no pattern. there are "backup" files and also files containing a date, which is also a backup, but the ones containing a date again don't have the same pattern. some are written like this yyyy/mm/dd, others are yyyymmdd, others ddmmyyyy, or dd/mm/yyyy, so there's no way i can just find a pattern and delete them all at once. – Edd Jan 06 '20 at 14:57
  • Ok, that's unfortunate. If the number of possible name variations is limited, it is still possible to apply more than one search pattern criteria, but if there are too many possibilities, that kind of automation indeed is not possible. – AdminBee Jan 06 '20 at 15:05
  • What you can do with your full file list is to scan it with awk to produce a carefully selected sub-list. This is more flexible and manageable than using multiple patterns in find. You could build up several such sublists using different patterns in each awk version, or put several patterns in the one awk to construct one composite list. Then you can review each list quickly -- the purpose being to avoid deleting more than you intended. Finally, you just write a one-liner shell script to read names and rm each one. – Paul_Pedant Jan 06 '20 at 16:07
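As a minimal sketch of the approach Paul_Pedant describes above, assuming the full listing is in directory_list.txt and that, hypothetically, names containing "back up" mark the candidates (adjust the pattern to your naming, or run several awk passes with different patterns and review each sub-list):

user@host$ awk '/back up/' directory_list.txt > candidates.txt

Review candidates.txt by hand to make sure it contains only files you really want gone, then delete each listed name with a simple read loop (in a POSIX shell, this also copes with spaces in the names):

user@host$ while IFS= read -r name; do rm -f "$name"; done < candidates.txt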