
Technology Blog

Multiple django storage backends with django-storages


Recently here at iscape, we found the need to add remote storage capabilities for one of our clients. At first glance, this sounds easy enough: pip install django-storages, add the right credentials, and boom! You're done.

That was not the case for this particular situation, though. You see, we didn't want remote storage for the entire site, only for certain apps. django-storages was not built with multiple storage backends in mind, so we had to figure out a way around this.

 

First off, let's look at an example of what you would do if you were using remote storage for your whole site.

# models.py
from django.db import models

class GovernmentSecret(models.Model):
    secret_file = models.ImageField()

# settings.py
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
AWS_SECRET_ACCESS_KEY = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
AWS_STORAGE_BUCKET_NAME = 'classified'

This works fine. However, it makes S3 the global storage solution, so all of your files will be stored on S3. But what if you had other models on your site and still wanted to store those files locally (or on a different cloud storage provider)?

 

Let's look at how we would go about that:

# models.py
from django.db import models
from storages.backends.s3boto import S3BotoStorage

class GovernmentSecret(models.Model):
    secret_file = models.ImageField(storage=S3BotoStorage())

Here we're a bit closer. We're creating an instance of S3BotoStorage and passing it as the storage object for the file field. There is still a problem with this, though: the storage object still points to the global storage location. What we need to do is instantiate the storage class with different parameters.
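Conceptually, the reason this works is that django-storages-style backends treat keyword arguments as per-instance overrides of their class-level (i.e., global) settings. The stand-in below illustrates just that mechanism; it is not the real S3BotoStorage, only a simplified sketch of the pattern.

```python
# A simplified stand-in for how django-storages-style backends let
# per-instance keyword arguments override class-level (global) settings.
# This is NOT the real S3BotoStorage -- just an illustration of the pattern.

class DemoStorage:
    # "Global" defaults, analogous to values read from settings.py
    bucket_name = 'classified'
    default_acl = 'private'

    def __init__(self, **overrides):
        # Each keyword overrides the matching class attribute on this
        # instance only; other instances keep the global defaults.
        for name, value in overrides.items():
            if not hasattr(self, name):
                raise TypeError('unknown setting: %s' % name)
            setattr(self, name, value)

site_storage = DemoStorage()                         # global settings
app_storage = DemoStorage(bucket_name='app-bucket')  # per-app override

print(site_storage.bucket_name)  # classified
print(app_storage.bucket_name)   # app-bucket
```

Because each override lives on the instance, two models can hold two differently configured storages at the same time, which is exactly what we need.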

 

Here's how we would do that:

# models.py
from django.db import models
from storages.backends.s3boto import S3BotoStorage

class GovernmentSecret(models.Model):
    secret_file = models.ImageField(
        storage=S3BotoStorage(
            bucket_name='osf-dev-physicians',
            access_key='xxxxxxxxxxxxxxxxxxxxxxxxxxx',
            secret_key='xxxxxxxxxxxxxxxxxxxxxxxxxxx',
            default_acl=None
        )
    )

And that should do it. Following this pattern, we can successfully upload files to the remote storage of our choice.
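One refinement worth noting (our own suggestion, not part of the example above): rather than hardcoding keys in models.py, read them from the environment when building the backend's keyword arguments. A minimal, framework-free sketch, with hypothetical environment-variable names:

```python
import os

# Hedged sketch: the environment-variable names are hypothetical.
# Build the keyword arguments for a per-app storage backend from the
# environment instead of committing literal keys in models.py.
def secret_storage_settings():
    return {
        'bucket_name': os.environ.get('SECRETS_BUCKET_NAME', 'classified'),
        'access_key': os.environ.get('SECRETS_AWS_ACCESS_KEY_ID', ''),
        'secret_key': os.environ.get('SECRETS_AWS_SECRET_ACCESS_KEY', ''),
        'default_acl': None,
    }
```

You would then unpack these into the backend, e.g. S3BotoStorage(**secret_storage_settings()), keeping credentials out of source control.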

