
I have to download 20 GB of data over FTP. Can I do this in parallel (GNU parallel, perhaps)? Here are some of the links to the data:

 ftp://ftp.sra.ebi.ac.uk/vol1/fastq/ERR030/ERR030893/ERR030893.fastq.gz 
 ftp://ftp.sra.ebi.ac.uk/vol1/fastq/ERR030/ERR030885/ERR030885_1.fastq.gz
 ftp://ftp.sra.ebi.ac.uk/vol1/fastq/ERR030/ERR030885/ERR030885_2.fastq.gz
 ftp://ftp.sra.ebi.ac.uk/vol1/fastq/ERR030/ERR030894/ERR030894.fastq.gz
 ftp://ftp.sra.ebi.ac.uk/vol1/fastq/ERR030/ERR030886/ERR030886_1.fastq.gz
 ftp://ftp.sra.ebi.ac.uk/vol1/fastq/ERR030/ERR030886/ERR030886_2.fastq.gz

Thanks,


1 Answer


While I don't know about GNU parallel, I do know that an excellent tool for downloading in parallel is aria2c.

Here's an excerpt from the FTP/HTTP section of its manual:

  • -s, --split=N

    Download a file using N connections. If more than N URIs are given, first N URIs are used and remaining URIs are used for backup. If less than N URIs are given, those URIs are used more than once so that N connections total are made simultaneously. The number of connections to the same host is restricted by the --max-connection-per-server option. See also the --min-split-size option.

    Default: 5
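
For example, assuming the links from the question are saved one per line in a file named urls.txt (the filename is just an example), a command along these lines should download several files at once while also splitting each file across multiple connections:

    # urls.txt holds the FTP links, one per line
    # -i   read the download list from that file
    # -j3  fetch up to 3 files at the same time
    # -x4  allow up to 4 connections to the same server
    # -s4  split each file across up to 4 connections
    aria2c -i urls.txt -j3 -x4 -s4

Whether the splitting actually speeds things up depends on the server; if the EBI FTP host limits connections per client, drop or lower -x and -s.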
