How to Transfer Large Number of Files to DigitalOcean Spaces with rclone

Jian Jye
3 min read · Mar 8, 2021
Photo by Sam Moqadam on Unsplash

Say you have a few thousand files, totalling a few gigabytes, that you need to upload to DigitalOcean Spaces from another Space, your local machine, or your VPS. And you are wondering what’s the fastest and easiest way to do so.

Previously, I would mount DigitalOcean Spaces onto my VPS using s3fs and start copying files over. But recently, when I tried that on close to 5,000 files… it was just too slow.

Then I came across rclone. The speed difference is night and day when compared to s3fs. So here’s a guide on how to set it up on Ubuntu.

Step 1. Installation

The rclone versions carried by most package managers tend to lag behind, so the recommended way to install it is via the official install script.

So let’s fetch the install script and execute it.

$ curl https://rclone.org/install.sh | sudo bash
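To confirm the installation succeeded, you can check that rclone is on your PATH (the exact version string will differ on your machine):

```shell
# Verify rclone is installed and reachable from the shell
$ rclone version
```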

Step 2. Generate Access Key for DigitalOcean Spaces

Go to https://cloud.digitalocean.com/account/api/tokens and generate a new Spaces access key if you do not have one. We’ll need it to authenticate ourselves later.

Remember to write down the key and secret once they are generated.

Step 3. Configure rclone

Open up ~/.config/rclone/rclone.conf (creating it if it does not exist) and paste the following contents into it:

[<config-name>]
type = s3
env_auth = false
access_key_id = <key>
secret_access_key = <secret>
endpoint = sgp1.digitaloceanspaces.com
acl = private

Here <config-name> is any name you want to assign to this config. It does not have to correspond to your DigitalOcean settings. Note that the endpoint above points at the sgp1 (Singapore) region; replace it with your Space’s region if yours is different.

Then <key> and <secret> are the credentials we obtained from Step 2 just now.
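Before moving on, you can ask rclone to list the remotes it knows about; if the config file was picked up, your <config-name> should appear, suffixed with a colon:

```shell
# List all remotes rclone has loaded from its config file
$ rclone listremotes
```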

Step 4. Start Transferring!

First, we check that our config is working by listing the files in our Space. Note that <space-name> is your Space’s name as it appears in your DigitalOcean settings.

$ rclone ls <config-name>:<space-name>

If everything works properly, you should see the files from your Space listed here.
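If the Space is empty, ls prints nothing, which can look like a failure. A quick sanity check either way is rclone’s size command, which prints the total object count and size for the remote:

```shell
# Print the number of objects in the Space and their combined size
$ rclone size <config-name>:<space-name>
```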

Now let’s start copying some files over:

$ rclone copy <local-folder> <config-name>:<space-name>

or

$ rclone copy <local-folder> <config-name>:<space-name>/sample-folder

Something to note here is that rclone copies the contents of <local-folder>/* rather than the directory itself.

This means that if you have a file at ~/example/test.txt and you execute rclone copy ~/example <config-name>:<space-name>, you will see test.txt in your Space’s root directory rather than an example folder.

So if you wanted to keep example as the folder, you should execute rclone copy ~/example <config-name>:<space-name>/example instead.
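For larger transfers, two flags are worth knowing: -P (short for --progress) shows live transfer statistics, and --transfers controls how many files are uploaded in parallel (the default is 4). For example:

```shell
# Copy ~/example into the Space under /example, 8 files at a time,
# with a live progress display
$ rclone copy ~/example <config-name>:<space-name>/example -P --transfers 8
```

Raising --transfers helps most when you have many small files; for a handful of large files, the default is usually fine.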

Bonus: Syncing Files Instead of Copying All

Now after transferring all the files, you might be tempted to tinker a little bit on DigitalOcean Spaces.

That might leave extra files in the Space that you want cleaned up, or files deleted from the Space that you want re-uploaded.

Rclone comes with a nice command, sync, that can do just that! It detects the differences between your local folder and the destination folder in your Space, then changes the destination so that it matches the source.

This means that files that exist only at the destination will be deleted, and files missing from the destination will be re-uploaded.

To do that, just run this command (the -i flag makes rclone ask for confirmation before each change):

$ rclone sync -i <local-folder> <config-name>:<space-name>/<destination-folder>
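Since sync deletes files on the destination, it is worth doing a trial run first. The --dry-run flag reports what sync would change without actually touching anything:

```shell
# Preview the sync without modifying the Space
$ rclone sync --dry-run <local-folder> <config-name>:<space-name>/<destination-folder>
```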


Jian Jye

I write articles about Laravel, PHP, and web development.