Transfer Files on Multiple Remote Servers

I haven’t had time to write tutorials for a long time. Sometimes I am too busy, but most of the time I was just too lazy to write. Today I will share a simple bash script that copies a folder with thousands of files from a remote Linux server to my Windows laptop. On Windows, the usual approach is to run a batch file (.bat file), but I am not familiar with batch files. There are lots of ways to execute a Linux bash script on Windows; I am using “Git Bash”, which is installed when I install the Git client on Windows.

Copy Folder from Remote Linux to Windows

The simplest way to copy a folder from a remote Linux server to Windows is with “scp”, which is included with Git Bash (the PuTTY suite ships an equivalent tool called pscp). For example, to download the folder “a” from the remote Linux server to Windows with “scp”, I can run this:

scp -r root@10.0.0.1:/root/a ./
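
If you installed the PuTTY suite instead of Git Bash, the equivalent tool is pscp, and its syntax mirrors scp:

pscp -r root@10.0.0.1:/root/a ./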

This command works well when the folder is small. However, I ran into a big problem today when downloading a folder with 10k files inside over a very poor network. It took around 3 hours to download around 1k files, and then the transfer broke with a message that the connection was reset by the remote server. If I repeat the above command, it starts from the beginning, with the same risk of disconnection. That is what led me to write a script to download big folders containing thousands of files from a remote Linux server to my local Windows or Linux machine.

Download Big Folder from Remote Linux to Local

Basically, the script still takes advantage of the ssh and scp commands. Instead of downloading the whole folder at once, it downloads the files one by one from the remote server, checking first whether each file already exists locally. Here is the example code:

#!/bin/sh
# sub-folder of the remote home directory to download
folder=a

# make sure the local destination folder exists
mkdir -p "./$folder"

# list the remote files and fetch each one that is missing locally
# (this simple loop assumes file names without spaces)
for file in $(ssh root@remote-linux-ip "cd ~/$folder; ls")
do
	if [ ! -f "./$folder/$file" ]; then
		scp "root@remote-linux-ip:~/$folder/$file" "./$folder/$file"
	else
		echo "$file exists"
	fi
done
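
To try it, save the script (for example as download.sh, a name I picked for illustration) and run it from Git Bash. If the connection drops, just run it again; the files already downloaded are skipped, so the transfer effectively resumes:

sh download.sh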

Best Script to SSH Download Thousands of Files from a Remote Server to Local

To improve the script, I added several features. The latest version allows us to set the remote path, the local path, and the sub-folders to download. I also added some checks before downloading. Here is the final version of the source code:

#!/bin/bash
# remote server ip
ip=192.168.0.1

# remote path to copy from
remotePath=/root/data

# local path to copy to
localPath=.

# sub-folders to copy from the remote path to the local path
folderList=(
    folder1
    folder2
    )

# check that the local parent folder exists
if [ ! -d "$localPath" ]; then
    echo "target folder $localPath does not exist!"
    exit 1
fi

# loop over the folder list
for folder in "${folderList[@]}"
do
    # create the local folder if it does not exist
    if [ ! -d "$localPath/$folder" ]; then
        echo "$localPath/$folder does not exist, mkdir"
        mkdir -p "$localPath/$folder"
    fi

    # check that the remote folder exists
    exist=$(ssh root@$ip "[ -d '$remotePath/$folder' ] && echo 1 || echo 0")
    if [ "$exist" -eq "1" ]; then
        echo "remote folder $remotePath/$folder exists, start to copy"

        exist=0
        notexist=0
        # list the remote files and fetch each one that is missing locally
        # (assumes file names without spaces)
        for file in $(ssh root@$ip "cd $remotePath/$folder; ls")
        do
            if [ ! -f "$localPath/$folder/$file" ]; then
                notexist=$(($notexist + 1))
                scp "root@$ip:$remotePath/$folder/$file" "$localPath/$folder/$file"
            else
                exist=$(($exist + 1))
            fi
        done

        echo "$notexist new files"
        echo "$exist old files"
    else
        echo "remote folder $remotePath/$folder does not exist"
    fi
done

Before using this script, please make sure that you can ssh to the remote server without a password. If you don't know how, feel free to ask in the comments.
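
A minimal sketch of the usual key-based setup, assuming OpenSSH on both machines (if ssh-copy-id is not available, append the public key to ~/.ssh/authorized_keys on the remote server by hand):

# generate a key pair locally if you do not have one yet
ssh-keygen -t rsa
# copy the public key to the remote server's authorized_keys
ssh-copy-id root@192.168.0.1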

Copy Large File by SCP

Sometimes we need to copy a very large file from one server to another, for example during a database migration. The db files are often a few hundred gigabytes in total, and a single file can be 20-30G. Transferring such a file to a server that is not on the same local network can take several hours. An internet connection can be unpredictable at times, and a sudden drop while transferring a large file is frustrating.

One way to solve this is to set up a web server or FTP server on the old server, so you can download the files from it to the new server. Most HTTP/FTP download tools support resuming an interrupted download.
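
As a quick sketch, assuming Python 3 is available on the old server (old-server-ip below is just a placeholder for its address), you can serve the file over HTTP and download it with resume support:

# on the old server: serve the current directory over HTTP on port 8000
python3 -m http.server 8000
# on the new server: -c continues a partially downloaded file after a drop
wget -c http://old-server-ip:8000/package.tar.gz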

Another way to transfer a big file from one server to another is to split the file into small blocks. Usually I use tar to pack everything into one big file and then split it into 1G blocks. The reason to use tar is that the archive is verified when I extract it from the package, so if some of the block files were damaged in transit I will find out and can resend them from the original server. Here are the commands to create a tar package, compress it, and split it into small blocks:

# create a gzip-compressed tar package of the files
tar -zcvf package.tar.gz files
# split it into 1G blocks with 3-digit numeric suffixes
split -d -a 3 -b 1G package.tar.gz package.tar.gz-

The above commands create a package.tar.gz file compressed by gzip, then split it into 1G blocks named package.tar.gz-000, package.tar.gz-001, and so on. After sending all the blocks to the target server, you can use the following commands to reassemble and extract the package:

# concatenate the blocks back into one archive, then extract it
cat package.tar.gz-* > package.tar.gz
tar -zxvf package.tar.gz
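
If extraction fails, you can locate the damaged block by comparing checksums between the two servers; a minimal sketch with md5sum:

# on the source server: record a checksum for every block
md5sum package.tar.gz-* > package.md5
# on the target server: verify the received blocks and resend any that fail
md5sum -c package.md5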
