Mount DigitalOcean Spaces with Linux
Introducing DigitalOcean Spaces
DigitalOcean Spaces is a new product from DigitalOcean that offers S3-compatible, flexible storage for your data – and it’s much simpler to configure than Amazon’s S3 or the other solutions I know.
Because uploading everything through a web-based interface is really tedious, I tried to mount the storage on my Ubuntu Linux machine. Here’s the result, which may help you succeed faster than I did.
Setup
First, you have to create a new “Spaces space” at DigitalOcean. After doing that, use the API menu entry in your DigitalOcean account to create (or retrieve) the API credentials for your new space. Then create a file named .passwd-s3fs in your home directory, set its permissions to 0600 (read and write access for the owner only) and enter the following content (similar to this tutorial):
digitalOceanSpaceName:key:secret
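Put together, the credentials setup can be sketched like this – the three placeholder values are, of course, your own Space name, access key and secret:

```shell
# Write the credentials file; replace the placeholders with your
# Space's name, access key and secret from the DigitalOcean API page.
echo "digitalOceanSpaceName:key:secret" > "$HOME/.passwd-s3fs"

# s3fs refuses credential files that other users can read,
# so restrict access to the owner (0600).
chmod 600 "$HOME/.passwd-s3fs"
```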
Now, install s3fs-fuse on your system. On Ubuntu you can simply run sudo apt-get install s3fs.
After successful setup, use this command to mount your space:
s3fs digitalOceanSpaceName /your/mount/path -o url=https://nyc3.digitaloceanspaces.com
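If you want the Space to come back automatically after a reboot, an /etc/fstab entry along these lines should work. This is just a sketch using the same placeholder names as above; adjust the passwd_file path to your user, and note that allow_other additionally requires user_allow_other to be enabled in /etc/fuse.conf:

```
digitalOceanSpaceName /your/mount/path fuse.s3fs _netdev,allow_other,url=https://nyc3.digitaloceanspaces.com,passwd_file=/home/youruser/.passwd-s3fs 0 0
```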
6 COMMENTS
This is looking really nice! Do you have some benchmarks on how fast this is? Is Spaces using WebDAV internally?
No, sorry, I have no benchmarks. What do you mean by WebDAV internally? Spaces uses an Amazon S3-compatible protocol for (client/external) communication.
Hi Matthias, thanks for sharing 🙂
I’m trying to test read/write performance of a Space mounted on a Linux droplet’s filesystem via s3fs.
Performance is pretty low and “laggy”, and we can’t understand why. We tried playing with the s3fs cache and parallel connections with no success. Do you have any hint?
@Feed, What is laggy & low to you? How many milliseconds to save a file hello.txt with the text “hello world”?
I’d expect this to take 100? 500? ms depending on the round-trip time. Is that maybe what you mean by laggy & low performance? Repeat that 100 times to transfer a directory with 100 files and it’ll take almost a minute … instead of being instant on a local disk.
I agree with you. I am trying to migrate my static folder with big data (around 10 GB). Uploading is pretty slow. After 5 minutes, only 2 MB had been uploaded. It is terrible.
Please help me.
I want to run chown:
chown -R www-data:www-data /var/www/abc
But I get this error:
Transport endpoint is not connected