I normally use machine A, and I make backups of A onto a fileserver B. Sooner or later, I will lose machine A for one reason or another. Its hard drive wears out, or it gets hit by lightning, or some salesperson convinces me that it's an embarrassing piece of obsolete junk, or an overclocking experiment goes horribly wrong, or it suffers a "glitter-related event", etc.
Let's assume that the replacement computer C is totally different from computer A -- different mass storage interface, a processor from a different company, different screen resolution, etc.
Is there an easy way to make a list of all the software currently installed on A before disaster strikes, in a way that makes it easy to install the same software on the blank hard drives of computer C? Or better yet, makes it easy to install the latest versions of each piece of software, and the specific sub-version optimized for this particular machine C?
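To make the question concrete, here is the kind of thing I have in mind, assuming (which may well not hold for every setup) that A runs a Debian-based distribution; `pkglist.txt` is just a name I made up:

```shell
# On machine A (assuming a Debian-based system): save the list of
# package selections to a file that gets backed up to fileserver B.
dpkg --get-selections > pkglist.txt

# Later, on the new machine C, feed the list back to the package
# manager, which fetches the current versions built for C's
# architecture (commented out here since it modifies the system):
#   sudo dpkg --set-selections < pkglist.txt
#   sudo apt-get dselect-upgrade
```

But that only works if A and C run the same distribution family, which is part of what I am asking about.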
If I have plenty of space on B, it seems easiest to copy everything from A to B. If I do that, what is a good way of separating the files I want to copy from B to C from the files I don't? I don't want to copy binary files that I can easily re-download (and possibly re-compile) as needed, and that probably wouldn't work on machine C anyway. Or is it better in the long run to avoid backing up such easily-obtained, machine-specific binary files onto B in the first place? And is there a good way to reduce the chances that viruses and trojans get passed on to C and re-activated?
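To illustrate the dividing line I mean, here is a sketch with made-up paths and exclude patterns: copy my actual data while skipping compiled binaries and caches that I could regenerate on C.

```shell
# Made-up example: stage a copy of a home directory while excluding
# machine-specific binaries and caches that are easy to re-fetch.
mkdir -p home-a dest
echo "my notes" > home-a/notes.txt   # data I do want to keep
touch home-a/program.o               # compiled object file I can rebuild

# -a preserves permissions and timestamps; each --exclude names a
# pattern of files I would rather re-download or re-compile on C.
rsync -a --exclude='*.o' --exclude='.cache/' home-a/ dest/
```

In real use the destination would presumably be something like `backup-server-b:/backups/machine-a/` over ssh, but picking the right exclude list is exactly the part I am unsure about.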
When I customize software or write fresh software of my own, what is a good way to make sure those tweaks -- cron and anacron tasks, for example -- get backed up, transferred to the new machine, and installed?
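For example, this is the sort of thing I imagine needing to capture, sketched with made-up paths:

```shell
# Made-up example: gather per-user tweaks into a staging directory
# that is included in the regular backup to fileserver B.
mkdir -p staging
crontab -l > staging/crontab.txt 2>/dev/null || true   # per-user cron jobs
cp ~/.bashrc ~/.vimrc staging/ 2>/dev/null || true     # shell/editor tweaks

# On the new machine C, the cron jobs could then be restored with:
#   crontab staging/crontab.txt
```

But hand-maintaining a list like this seems fragile; I suspect there is a more systematic approach.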
What can I do to make my transition to some new computer C safe and smooth?
(This question expands on a sub-question of "Incremental system backup and restore w/ rsync or rdiff-backup issues" that I thought was particularly important).