The software development ecosystem moves fast, and it may seem like older frameworks such as Django or Rails are becoming obsolete, but that is a huge underestimation! Django is one of the few frameworks I like to reach for first, mainly because it contains everything you need out of the box, with just a few simple configurations.

Let’s say you want to build a simple admin panel on top of an existing DB with already predefined models. Even if the project is written in Node.js or Go, I always use Django to spin up a basic admin panel, because there is no UI coding needed, just a simple DB connection and some Python configuration.
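For example (a minimal sketch, assuming the database connection is already set up in settings.py and the Django app is called api), the built-in inspectdb command can generate model classes from an existing database, and a superuser gives you access to the admin panel right away:

# Generate Django models from the tables of an existing database
python manage.py inspectdb > api/models.py

# Create an admin user and start the panel locally
python manage.py createsuperuser
python manage.py runserver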

On the other hand, Django gives a lot of flexibility and simplicity for building API endpoints, especially since Django Graphene came out; we built our full API with Django and GraphQL at https://treescale.com. Code maintenance is just a lot easier and more flexible.

Docker with Django?

Generally, the Python ecosystem has adopted Virtualenv for running Python applications: when you want to have multiple Python versions, or multiple versions of a specific library, you have to run them under different virtual environments. With Docker, things got a bit easier, because a Docker container is a completely separate environment and file system with all of its libraries, and because of that the Python ecosystem was one of the first adopters of containerized applications.
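For reference, a typical virtual environment workflow looks something like this (a minimal sketch using Python's built-in venv module):

python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt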

Running Django inside Docker is a real game changer when it comes to real DevOps deployments or ease of resource management. So, like it or not, running Django inside a Docker container is way better than trying to configure it with Virtualenv and managing code transfers separately.

Getting Started

A Django application itself is just a well-structured Python project with a single entry point (manage.py). The rest of the functionality, like the Django framework itself, is stored inside the Python packages directory, and usually, when you build Python projects following best practices, you keep a requirements.txt file to manage packages and their versions with pip.

Most probably, your requirements.txt file looks similar to this:

django==3.0.6
graphene-django==2.10.0
psycopg2==2.8.5
django-graphql-jwt==0.3.1
requests==2.23.0
django-cors-headers==3.2.1

This is what we have in one of our Django projects at TreeScale, and it is a very simple one, but the key part here is to keep specific package versions pinned; otherwise, you can mess up your Docker build and project execution in general. Sometimes packages get updated with breaking changes, and while you won't see any issues in your local development environment, it crashes when it gets to the Docker build or to production. I’ve been there, and believe me, it is not fun at all!

A key benefit of keeping this requirements.txt file is being able to automate the installation process in the Dockerfile; otherwise, you will have a hard time remembering which packages your project actually uses when writing the installation steps for the Docker build.
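If you haven't pinned your versions yet, pip can generate this file from your current environment (a quick sketch; review the output before committing it, since it lists every installed package):

pip freeze > requirements.txt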

A very simple Dockerfile looks something like this:

FROM python:3-alpine
ADD . /api
WORKDIR /api

# You will need this if you need PostgreSQL, otherwise just skip this
RUN apk update && apk add postgresql-dev gcc python3-dev musl-dev libffi-dev
RUN pip install -r requirements.txt

EXPOSE 8000

CMD ["./manage.py", "runserver"]

This will build a basic Docker image with ./manage.py runserver as the default startup command for running the Django development server (bound to 0.0.0.0 so the mapped port actually works). Keep on reading, this is definitely not a production version!
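To try it out, the build and run commands look something like this (the image tag django-api is just an example name):

docker build -t django-api .
docker run -p 8000:8000 django-api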

UWSGI For production

Running Django in production can be challenging if you don’t know the tools that are specifically designed for scalable Python server execution. In fact, one of the best Python application servers is UWSGI, which powers large services like Pinterest and Dropbox!

UWSGI is basically a concurrent Python execution server: it handles generic HTTP requests and, using the OS’s native event-based networking, keeps connections alive while Django takes the time to respond to each request. It is very similar to Node.js’s request-handling principle, which makes it possible to handle even a few thousand requests per second with your Django application.

A simple Dockerfile with UWSGI looks like this:

FROM python:3-alpine
ADD . /api
WORKDIR /api

# You will need this if you need PostgreSQL, otherwise just skip this
RUN apk update && apk add postgresql-dev gcc python3-dev musl-dev libffi-dev
RUN pip install -r requirements.txt
# Installing uwsgi server
RUN pip install uwsgi

EXPOSE 8000

# This is not the best way to do it, see below!
CMD uwsgi --http "0.0.0.0:8000" --module api.wsgi --master --processes 4 --threads 2

As you can see, you can run your Django server by just specifying the application module, like api.wsgi if your application name is api! There are also specific configuration options, like how many processes to spin up for parallel execution and how many threads to run per process.

Usually I don’t like writing this kind of long configuration command inside the Dockerfile itself, so I make a file called runner.sh in the root directory and add everything I need to start up the actual Django application, like running predefined migrations or anything else I have to do.

#!/usr/bin/env sh

# Getting static files for Admin panel hosting!
./manage.py collectstatic --noinput
uwsgi --http "0.0.0.0:${PORT}" --module api.wsgi --master --processes 4 --threads 2

Take a look at how I’m specifying the actual PORT for the Django server as an environment variable, so that I can control it from the Dockerfile or from the Docker execution environment.
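One small thing to keep in mind: runner.sh has to be executable, otherwise the container will fail to start the script. Making it executable once before building the image (or with an equivalent RUN chmod step in the Dockerfile) is enough:

chmod +x runner.sh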

So the final Dockerfile will look like this

FROM python:3-alpine

ADD . /api
WORKDIR /api

# You will need this if you need PostgreSQL, otherwise just skip this
RUN apk update && apk add postgresql-dev gcc python3-dev musl-dev libffi-dev
RUN pip install uwsgi
RUN pip install -r requirements.txt

ENV PORT=8000
EXPOSE 8000

# Runner script here
CMD ["/api/runner.sh"]

After building this Docker image, you will get a very simple and efficient Django server setup for running in production and handling a lot of load!
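Building and running it is the same as before, except that now the port can be controlled through the PORT environment variable (again, django-api is just an example tag):

docker build -t django-api .
docker run -e PORT=8000 -p 8000:8000 django-api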

What’s next?

This is just the Django server configuration, but you are also going to have other microservices communicating with your Django server. This is where Docker configurations come in handy! The way I like to set things up is with Docker Compose services that configure everything together: for example, this Django API server plus a PostgreSQL Docker instance inside the same networking group as linked containers, so that the database is not exposed publicly from the server, but Django can still communicate with the PostgreSQL instance over a basic networking link. Even when you are running a Kubernetes cluster, you can do almost the same thing using Kubernetes Services… Anyway! I could talk a lot about the advantages of having a Docker container setup for almost any application, but the point of this article was to show the simplest Docker image configuration capable of running almost all Django versions that support WSGI servers.
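As a rough sketch of that setup, a docker-compose.yml could look like this (the service names, credentials, and the DATABASE_URL variable are assumptions for illustration, not the actual TreeScale configuration):

version: "3"
services:
  api:
    build: .
    environment:
      - PORT=8000
      # Hypothetical variable; read it in settings.py to configure the DB connection
      - DATABASE_URL=postgres://api:secret@db:5432/api
    ports:
      - "8000:8000"
    depends_on:
      - db
  db:
    image: postgres:12-alpine
    environment:
      - POSTGRES_USER=api
      - POSTGRES_PASSWORD=secret
      - POSTGRES_DB=api
    # No ports section here, so PostgreSQL stays private inside the Compose network

Note that only the api service publishes a port; the db service is reachable from the api container by its service name (db), but not from outside.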

Stay tuned, and don’t forget to follow me on Twitter and Medium at @tigranbs 🤟