I am running Mac OS 10.11.6 El Capitan. There is a link I would like to download programmatically:
https://dev.mysql.com/get/Downloads/MySQL-5.7/mysql-5.7.16-osx10.11-x86_64.dmg
If I paste this URL into any browser (e.g. Safari) the download works perfectly.
However, if I try to download the same URL from the command line using curl, it doesn't work; the result is an empty file:
$ ls -lA
$ curl -O https://dev.mysql.com/get/Downloads/MySQL-5.7/mysql-5.7.16-osx10.11-x86_64.dmg
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
$ ls -lA
total 0
-rw-r--r-- 1 myname staff 0 Nov 7 14:07 mysql-5.7.16-osx10.11-x86_64.dmg
$
Of course I can get the file through the browser, but I would like to understand why the curl command above doesn't work.
Why can't curl download this file, when it is evidently present on the server and can be accessed and downloaded correctly through a graphical web browser?
For example, if the server has backend DDoS protection, such protection software commonly checks for viable browser headers, such as a proper User-Agent. Additionally, some browser downloads may succeed because session cookies (i.e. if you're logged in) are present only in that browser. – Joseph Nov 08 '16 at 10:15

Try curl -v for "verbose". It will print to standard error various info about the connection, request, and response. In this case, you would see that the response includes HTTP 302 Found (a redirect code) and a Location header with the URL to go to. Then you could man curl to find out how to tell it to follow redirects. – Nathan Long Nov 08 '16 at 17:01

curl --header "Authorization: Basic 12345" -o indexexport.zip http://localhost:8080/indexexport – Andrei Jul 20 '22 at 14:15
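Putting the comments together: the likely cause is that dev.mysql.com answers this URL with an HTTP 302 redirect, and by default curl saves the (empty) redirect response body instead of following the Location header. A minimal sketch of the diagnosis and the fix, using the URL from the question (the server's behavior may have changed since 2016):

```shell
# Inspect the response headers with a HEAD request (-I); verbose mode (-v)
# shows the same information. A redirecting server answers with
# "302 Found" and a Location: header pointing at the real file.
curl -sI https://dev.mysql.com/get/Downloads/MySQL-5.7/mysql-5.7.16-osx10.11-x86_64.dmg

# -L (--location) tells curl to follow that redirect;
# -O saves the file under its remote name, as in the question.
curl -L -O https://dev.mysql.com/get/Downloads/MySQL-5.7/mysql-5.7.16-osx10.11-x86_64.dmg
```

If the server also filters on browser-like headers, as Joseph's comment suggests, adding a User-Agent string with -A "Mozilla/5.0" may help as well.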