How to Move Files From Server To Server With SSH & Wget
Downloading files from the internet can be a pleasant experience when the right conditions are met, such as a high-speed connection and support for resuming downloads.
Unfortunately, that isn't always the case: the experience can be ruined by slow servers and download links with expiry dates/times, i.e. if the download isn't completed within a specified number of minutes, hours, or days, the link expires and you have to start all over again from the beginning.
In most cases, your best option is to transfer the files you wish to download from the source server to another server that supports resumable downloads and has no expiry limits. The files then remain permanently available for you to download at your own convenient time and pace.
If you're looking for a way to transfer large files from one server to another for future downloads, or you're moving your files from one web host to another, this tutorial is for you, so enjoy.
NOTE: You must have your own web server to make the most of this. I highly recommend Hostgator, as I've been with them for over 2 years now without any problems whatsoever. You can even get your 1st month for just 1 cent.
Contact your web host and ask them to activate SSH on your server.
Download and install PuTTY on your computer and run the application. Enter your IP address and set the Port to 2222 as shown in the image below. Select SSH and click Open to begin your session.
NOTE: The IP address is your server's IP. If your web hosting control panel is cPanel, you can find it near the bottom corner of the left sidebar. For security reasons, I've blanked mine out.
As shown below, you'll be prompted for the login name and password of your FTP account. Enter them and you'll be logged into the root directory of your server, from where you'll navigate to the folder where you'd like to store your downloads.
Opening folders via the command prompt is simple: use cd directory_name. In my case I'm storing all my downloads in public_html, so I used cd public_html.
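That navigation step can be sketched like this; "downloads" below is a hypothetical folder name (the article itself uses public_html), and mkdir -p is just a safeguard in case the folder doesn't exist yet:

```shell
# Make sure the target folder exists, then step into it
# ("downloads" is a hypothetical name; the article uses public_html)
mkdir -p downloads
cd downloads
pwd    # confirm you are inside the right folder
```

Running pwd after every cd is a cheap habit that saves you from starting a large download into the wrong directory.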
Once there, copy the direct link of the file you'd like to download, type wget followed by a space, right-click to paste the link, then hit Enter.
Example: wget http://google.com/google.pdf
The download will start immediately. Because the transfer runs directly between two servers, it is usually fast: depending on the speed of the server you are downloading from, a 100MB file can finish in 30 seconds to 1 minute.
HOW TO DOWNLOAD MULTIPLE FILES WITH WGET
If you have a lot of files to move from one server to another, simply paste all the links into a text file, upload it to your download folder, and then use the following command.
NOTE: Make sure the links are arranged 1 link per line.
Example: wget -i name_of_file.txt
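A minimal sketch of the list-file approach; the file name follows the article's example, and the URLs are placeholders, not real links:

```shell
# Build the list file: exactly one direct link per line
# (placeholder URLs -- substitute your own download links)
printf '%s\n' \
  'http://example.com/file1.zip' \
  'http://example.com/file2.zip' > name_of_file.txt

# Sanity-check the line count before starting
wc -l < name_of_file.txt

# Then fetch every link in the file, one after another:
# wget -i name_of_file.txt   (commented out here: needs live URLs)
```

wget works through the list from top to bottom, so putting the most important files first means they finish even if the session is cut short.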
HOW TO RESUME A DOWNLOAD WITH WGET VIA SSH
In the event that something happens and the file transfer process is interrupted, and the file hosting server supports resuming, you can resume your download with the following command. The -c flag is short for --continue.
Example: wget -c http://google.com/google.pdf
If you'd like to resume a multiple-file download, use the following command.
Example: wget -c -i name_of_file.txt
Please note that most web hosting companies do not allow their servers to be used for online file storage, so be sure to download your files and delete them as soon as you can.
Folks on Hostgator hosting risk losing their eligibility for backups once they pass a certain file-size threshold, so tread carefully.
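The cleanup step can be sketched like this; the file names are placeholders, and -f keeps rm from complaining if a file is already gone:

```shell
# Remove downloaded files from the server once you've fetched them
# locally (placeholder names -- adjust to your own files)
rm -f google.pdf            # delete a single file
rm -f name_of_file.txt      # remove the download list too
```

Double-check with ls afterwards before assuming the space has been freed, since a typo in the file name makes rm -f fail silently.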
Feel free to ask me any questions if you lose your way.