How to Use Amazon S3 Storage for Select Apps

The django-storages module documentation assumes that you’re setting your storage configuration site-wide. This is how you configure it for a specific application.

At Imaginary Landscape, we often need to add remote file storage for our clients. At first glance, this sounds easy enough: "pip install django-storages", add the right credentials, and boom, you're done!

However, this was not the case for one particular situation we encountered. We didn't want remote storage for the entire site, only for certain apps. The django-storages module documentation assumes that you’re setting your storage configuration site-wide, so we had to figure out a way to get around this.

First, let's look at an example of what you would do in the case of using remote storage for your whole site.

# models.py
from django.db import models

class GovernmentSecret(models.Model):
    secret_file = models.ImageField()

# settings.py
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
AWS_SECRET_ACCESS_KEY = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
AWS_STORAGE_BUCKET_NAME = 'classified'

This works fine. However, it makes S3 the global storage backend, so files from every app on the site will be uploaded to S3. But what if you have other models whose files you still want to store locally, or on a different cloud storage provider?
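For example (a minimal sketch; the PressPhoto model and the another_app name are hypothetical, not from our project), with DEFAULT_FILE_STORAGE set as above, a file field in a completely unrelated app ends up on S3 too:

# another_app/models.py
from django.db import models

class PressPhoto(models.Model):
    # With DEFAULT_FILE_STORAGE pointed at S3BotoStorage, this field's
    # files are uploaded to the "classified" bucket as well.
    photo = models.ImageField(upload_to='press/')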

Let's look at how we would go about that:

# models.py
from django.db import models
from storages.backends.s3boto import S3BotoStorage

class GovernmentSecret(models.Model):
    secret_file = models.ImageField(storage=S3BotoStorage())

Now we are a bit closer. We're creating an instance of S3BotoStorage and passing it as the storage argument of the field. However, there is still a problem: this storage instance picks up the global AWS settings, so it points at the same bucket and credentials as everything else. What we need to do is instantiate the storage class with different parameters.

Here's how we do that:

# models.py
from django.db import models
from storages.backends.s3boto import S3BotoStorage

storage_backend = S3BotoStorage(
    bucket_name='my_bucket_name',
    access_key='xxxxxxxxxxxxxxxxxxxxxxxxxxx',
    secret_key='xxxxxxxxxxxxxxxxxxxxxxxxxxx',
    default_acl=None,
)
class GovernmentSecret(models.Model):
    secret_file = models.ImageField(
        storage=storage_backend,
    )

And that should do it. Following this pattern, we can successfully upload any files to the remote storage of our choice on a case-by-case basis.
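If you'd rather not hardcode credentials in models.py, the same pattern works with per-app values read from your settings module. This is just a sketch; the SECRETS_* setting names below are ones we've made up for illustration:

# settings.py
SECRETS_BUCKET_NAME = 'my_bucket_name'
SECRETS_ACCESS_KEY = 'xxxxxxxxxxxxxxxxxxxxxxxxxxx'
SECRETS_SECRET_KEY = 'xxxxxxxxxxxxxxxxxxxxxxxxxxx'

# models.py
from django.conf import settings
from django.db import models
from storages.backends.s3boto import S3BotoStorage

# Storage instance configured from per-app settings rather than
# the site-wide AWS_* values.
storage_backend = S3BotoStorage(
    bucket_name=settings.SECRETS_BUCKET_NAME,
    access_key=settings.SECRETS_ACCESS_KEY,
    secret_key=settings.SECRETS_SECRET_KEY,
    default_acl=None,
)

class GovernmentSecret(models.Model):
    secret_file = models.ImageField(storage=storage_backend)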
