Copy large files between servers

We ran into an issue while moving part of our database to another server far, far away, for performance and stability reasons. SCP was transferring at 2 MB/s even though the network supported 10 MB/s, so we cancelled the transfer and looked for alternatives.

What alternatives can you use?

First, you can speed up SCP by turning on compression, which has a slight effect:

scp -C -o 'IPQoS throughput' -o 'CompressionLevel 9' -c arcfour file user@remote:/path/to/destination/
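Note that recent OpenSSH releases have removed the old arcfour cipher, so the command above may be rejected on newer systems. As a sketch, you can list the ciphers your build actually supports and swap in a fast one such as aes128-ctr (the host and paths here are placeholders):

ssh -Q cipher
scp -C -o 'IPQoS throughput' -c aes128-ctr file user@remote:/path/to/destination/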

However, there seemed to be an issue once the transfer passed the five-minute mark: it just got slower and slower due to network latency. Rsync was not an option either, as we needed parallel transfers.

If both of your servers are in the same data center, either rsync or scp will do just fine (see the sketch below). In our case, however, we came up with Axel! It is quite different from the others, but it does the job perfectly, and it works especially well when the data centers are far apart.
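For the same-data-center case, a typical rsync invocation (a sketch; the host and paths are placeholders) looks like:

rsync -avz --partial --progress largefile.zip user@remote:/path/to/destination/

The --partial flag keeps incomplete files, so an interrupted transfer can pick up where it left off.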

Axel is a parallel HTTP downloader that supports resuming and can download as fast as the available bandwidth allows. Our transfer time was cut down to 30 minutes at 10 MB/s (the fastest our network supports).

A simple command such as:

axel -a -v -n 8 http://domain.com/somerandomfolder/largefile.zip

can do the job and use the maximum capacity of your connection. Be aware, however, that once you place a file under an HTTP address it becomes accessible to the whole world, so make sure the URL is kept secret and delete the file from the server as soon as the download is finished.
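This workflow assumes the file is already reachable over HTTP on the source server. If it is not, one quick option (a sketch, assuming Python 3 is available on the source server; the port and directory names are placeholders, and the directory name should be hard to guess) is to serve the folder temporarily:

cd /srv/transfer/somerandomfolder
python3 -m http.server 8080

The axel URL would then include the port, e.g. http://domain.com:8080/largefile.zip. When the download completes, stop the server, and it is worth comparing checksums on both ends before deleting the remote copy:

sha256sum largefile.zip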

You might need to install Axel first:

yum install axel        # RHEL/CentOS
apt-get install axel    # Debian/Ubuntu
