I answered this in another question: Downloading large folder from google drive
I wrote a Python script using the PyDrive library that recursively retrieves the sub-folders and files inside a parent folder. Using the file IDs, it then generates a bash script of wget commands.
Step 1
I used the PyDrive library. To use it, complete the setup instructions described in this link.
Step 2
Now, create a Python script or notebook in the same working directory where you saved the “client_secrets.json” file. I have attached the notebook below.
https://gist.github.com/immuntasir/73b8e8eef7e6c9066aaf2432bebf7db0
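The notebook's approach can be sketched as below. This is a minimal sketch, not the notebook itself: the folder ID is a placeholder, and the `uc?export=download` URL form is an assumption about how the wget lines are built. The script-generation helper is pure Python; the Drive traversal in `main()` needs the PyDrive setup from Step 1 and is not run automatically.

```python
def list_folder_recursive(drive, folder_id):
    """Yield (file_id, title) for every non-folder file under folder_id,
    descending into sub-folders recursively."""
    query = "'%s' in parents and trashed=false" % folder_id
    for f in drive.ListFile({'q': query}).GetList():
        if f['mimeType'] == 'application/vnd.google-apps.folder':
            yield from list_folder_recursive(drive, f['id'])
        else:
            yield f['id'], f['title']

def make_wget_script(files):
    """Turn (file_id, title) pairs into a bash script of wget commands.
    Assumes the public uc?export=download URL form works for these files."""
    lines = ['#!/bin/bash']
    for file_id, title in files:
        lines.append(
            'wget --no-check-certificate '
            '"https://drive.google.com/uc?export=download&id=%s" -O "%s"'
            % (file_id, title))
    return '\n'.join(lines) + '\n'

def main():
    # Requires the PyDrive authentication setup from Step 1;
    # call this yourself once client_secrets.json is in place.
    from pydrive.auth import GoogleAuth
    from pydrive.drive import GoogleDrive
    gauth = GoogleAuth()
    gauth.LocalWebserverAuth()  # opens a browser for OAuth consent
    drive = GoogleDrive(gauth)
    files = list_folder_recursive(drive, 'YOUR_FOLDER_ID')  # placeholder ID
    with open('script.sh', 'w') as fh:
        fh.write(make_wget_script(files))
```

The recursion keys off the `application/vnd.google-apps.folder` MIME type, which is how Drive marks folders; everything else is treated as a downloadable file.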
Step 3
Using scp, copy the “script.sh” to the remote server.
scp ~/path/script.sh username@ip:path
Step 4
Log in to the remote server, navigate to the path, and make the script executable using the following command.
chmod +x script.sh
Run the script and voila!
./script.sh
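The last two steps can be tried end to end with a stand-in script; the `script.sh` below is a dummy echoing a message, standing in for the generated file.

```shell
# Stand-in for the generated script.sh, just to demo Step 4.
printf '#!/bin/sh\necho "download complete"\n' > script.sh
chmod +x script.sh    # +x grants execute permission, which is all the script needs
./script.sh           # prints "download complete"
```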
I also wrote a tutorial, which can be found here: https://medium.com/@immuntasir/recursively-download-all-the-contents-of-a-google-drive-folder-using-python-wget-and-a-bash-script-d8f2c6b105d5
Hope this helps!