Hi, since we increasingly use Google Drive as storage for our files (as well as for backups of our websites), Linux users working from the console can use the wget command with a little trick to download those files directly to their virtual or dedicated server.
Files smaller than 100MB: just replace FILEID (you can find it in your file's share URL) and FILENAME with the desired values:
wget --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O FILENAME
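If you are unsure which part of the share URL is the FILEID, it is the segment after /d/ in a typical Drive link. A minimal sketch (the URL and ID below are made-up examples):

```shell
# Hypothetical share URL; the FILEID is the path segment after /d/
url='https://drive.google.com/file/d/1aBcD3fGhIjK/view?usp=sharing'
fileid=$(echo "$url" | sed -rn 's#.*/d/([^/]+)/.*#\1#p')
echo "$fileid"   # prints 1aBcD3fGhIjK
```

You can then paste that value in place of FILEID in the wget command above.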
Files bigger than 100MB: the same goes here (just change FILEID and FILENAME). For large files Google returns a confirmation page instead of the file itself, so this command first extracts the confirm token and then reuses it together with the session cookies:
wget --load-cookies /tmp/cookies.txt "https://docs.google.com/uc?export=download&confirm=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')&id=FILEID" -O FILENAME && rm -f /tmp/cookies.txt
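If you download large files often, the one-liner above can be wrapped in a small helper function. This is only a sketch under the same assumptions as the command above (the function name gdrive_download is my own; mktemp is used so parallel downloads don't clobber each other's cookie file):

```shell
# Hypothetical wrapper around the two-step wget trick shown above.
# Usage: gdrive_download FILEID FILENAME
gdrive_download() {
    if [ "$#" -ne 2 ]; then
        echo "Usage: gdrive_download FILEID FILENAME" >&2
        return 1
    fi
    fileid="$1"
    filename="$2"
    cookies=$(mktemp)
    # First request saves the session cookie and captures the confirm token
    confirm=$(wget --quiet --save-cookies "$cookies" --keep-session-cookies \
        --no-check-certificate \
        "https://docs.google.com/uc?export=download&id=${fileid}" -O- \
        | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1/p')
    # Second request replays the cookie and token to fetch the actual file
    wget --load-cookies "$cookies" --no-check-certificate \
        "https://docs.google.com/uc?export=download&confirm=${confirm}&id=${fileid}" \
        -O "$filename"
    rm -f "$cookies"
}
```

Then a download is just `gdrive_download FILEID backup.tar.gz`.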
Best Regards, Nicolas 😉