Dockerized Developer Environment for Django Applications

Mohammed MIKOU
March 9th, 2020 · 3 min read

In this article we will cover the main concepts behind dockerizing an application. We will also illustrate steps to deploy and run the application in a development environment.

What is Docker?

Docker is a tool designed to facilitate the creation, deployment and execution of applications using containers. Containers allow a developer to package up an application with all of the parts it needs, such as libraries and other dependencies, and ship it all out as one package. Thanks to the container, we can rest assured that the application will run on any other machine.

In a way, Docker is a bit like a virtual machine. But unlike a virtual machine, rather than creating a whole virtual operating system, Docker lets applications use the same Linux kernel as the system they are running on, and only requires them to be shipped with the things not already present on the host. This gives a significant performance boost and reduces the size of the application.

VM vs. Docker

There are numerous alternatives to Docker that also rely on the principle of containers, but in this article we will cover only the most widely used one in the software world: Docker.

Setup Docker

After installing Docker, we give it instructions on how we would like our application to run as a container. We do so in a special file called a Dockerfile.

Create a docker image

In the root directory of our project we create the Dockerfile. A Dockerfile looks like this:

FROM python:3.8.2-alpine

RUN mkdir -p /opt/app

WORKDIR /opt/app

RUN apk --update add --no-cache bash curl-dev python3-dev \
    libressl-dev gcc libgcc curl musl-dev \
    make libpq postgresql-dev mariadb-dev

RUN pip install -U pip setuptools

COPY requirements.txt .

RUN pip install -r requirements.txt

ADD . .

EXPOSE 8000

Here is the explanation of the file line by line:

FROM python:3.8.2-alpine

In this example we use a lightweight Alpine Linux base image that already has Python 3.8 installed. (Note that the apk package manager used below is specific to Alpine, so a Debian-based image such as slim-buster would not work here.)

All official Python images are available on Docker Hub.

We can also define environment variables in the Dockerfile using the ENV instruction.
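Our Dockerfile above does not use ENV, but as a sketch, here are two variables commonly set in Python images:

```dockerfile
# PYTHONDONTWRITEBYTECODE: don't write .pyc files inside the container.
# PYTHONUNBUFFERED: flush stdout/stderr immediately, so `docker logs`
# shows output in real time.
ENV PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1
```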

RUN mkdir -p /opt/app

RUN executes a command inside the image at build time; here we create the folder that will hold our application.

WORKDIR /opt/app

The WORKDIR instruction sets the working directory for all the instructions that follow it, and for the running container.

RUN apk --update add --no-cache bash curl-dev python3-dev \
    libressl-dev gcc libgcc curl musl-dev \
    make libpq postgresql-dev mariadb-dev

Here we run apk add, the package manager command of the Alpine Linux distribution, to install the OS-level dependencies our Python packages need.

RUN pip install -U pip setuptools

We install the latest version of pip and setuptools.

COPY requirements.txt .

We copy the requirements file to the working directory. Copying it before the rest of the code lets Docker cache the dependency-installation layer, which is then rebuilt only when requirements.txt changes.

RUN pip install -r requirements.txt

We install our Django application requirements.

ADD . .

We copy our Django application into the working directory. (For a plain copy like this, COPY . . works just as well and is generally preferred over ADD.)

EXPOSE 8000

The EXPOSE instruction informs Docker that the container listens on network port 8000 at runtime.

What is Docker Compose?

As we all know, modern applications need more than one service to work properly: for example a database, a cache, a message broker, an asynchronous task queue, etc.

In order to run all of those services together we need Docker Compose, a tool for defining and running multi-container Docker applications. With Compose, we use a YAML file to configure our application's services; then, with a single command, we create and start all the services from our configuration.

Create docker-compose file

In the root directory of our project we create a docker-compose.yml file, which looks like this:

version: "3.3"

services:
  app:
    build:
      context: .
    restart: on-failure
    ports:
      - "8000:8000"
    volumes:
      - .:/opt/app
    command: >
      bash -c "python manage.py runserver 0.0.0.0:8000"
    env_file:
      - docker-compose.env
    depends_on:
      - pgsql

  pgsql:
    image: postgres:latest
    environment:
      - POSTGRES_DB=pgsqldb
      - POSTGRES_USER=root
      - POSTGRES_PASSWORD=root

In this example we create two services: an app service and a pgsql service. In order for the app to know which database it should connect to, we have to add this configuration to the settings.py of our Django project (note that the HOST value must match the name of the database service defined in docker-compose.yml):

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'pgsqldb',
        'USER': 'root',
        'PASSWORD': 'root',
        'HOST': 'pgsql',
        'PORT': '5432',
    }
}
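Hard-coding credentials in settings.py is fine for local development, but since docker-compose already injects environment variables via env_file, a natural refinement is to read the database settings from the environment. A sketch (the fallback values are the ones used above; the variable names are our own choice, not something Django or Compose prescribes):

```python
import os

# Read each connection setting from the environment, falling back to
# the local-development defaults used in this article.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('POSTGRES_DB', 'pgsqldb'),
        'USER': os.environ.get('POSTGRES_USER', 'root'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD', 'root'),
        'HOST': os.environ.get('POSTGRES_HOST', 'pgsql'),
        'PORT': os.environ.get('POSTGRES_PORT', '5432'),
    }
}
```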
context: .

Builds the service image from the existing Dockerfile, using the project root as the build context.

restart: on-failure

Configures if and how to restart containers when they exit.

- "8000:8000"

Used for mapping ports (HOST:CONTAINER).

- .:/opt/app

Mounts host paths or named volumes into a service. Since the project directory is mounted, a file written on the host is written in the container as well, and vice versa, so code changes are picked up without rebuilding the image.

command: >
  bash -c "python manage.py runserver 0.0.0.0:8000"

Runs this command when the container starts.

- docker-compose.env

Here we specify the path to a file containing all the environment variables we want to be available in the container.
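A docker-compose.env file is a plain list of KEY=value lines. For example (these variable names are illustrative, not required by Compose):

```shell
DEBUG=1
POSTGRES_DB=pgsqldb
POSTGRES_USER=root
POSTGRES_PASSWORD=root
```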

- pgsql

Expresses a dependency between services: our app container will start only after the pgsql container has started. Note that depends_on waits for the container to start, not for the database inside it to be ready to accept connections.
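Because depends_on does not wait for Postgres to actually accept connections, the app may still fail on its first attempt. A minimal sketch of a wait-for-port helper (the function name and defaults are our own choice) that an entrypoint could call before running migrations:

```python
import socket
import time

def wait_for_port(host, port, timeout=30.0):
    """Poll until a TCP connection to host:port succeeds or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True  # the service is accepting connections
        except OSError:
            time.sleep(0.5)  # not up yet, retry shortly
    return False
```

In the dockerized setup above, the call would be `wait_for_port('pgsql', 5432)`.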

image: postgres:latest

Specifies the image to start the container from. It can either be a repository/tag or a partial image ID.

After we create our files, we can now run our dockerized application with the following commands:

docker-compose build

This builds the images for our services.

docker-compose up

This creates and starts the service containers.

Before we run the application, we will need to apply our migrations and collect our static files. There are many ways of doing this, but here is the easiest one (in my opinion):

In the root directory we create an entrypoint script (we will call it entrypoint.sh) with:

#!/bin/sh
set -e

python manage.py migrate --no-input
python manage.py collectstatic --no-input

# Hand control over to the command passed by docker-compose (runserver).
exec "$@"

Then we run this script from the Dockerfile by adding an ENTRYPOINT after the EXPOSE instruction:

EXPOSE 8000

ENTRYPOINT ["sh", "/opt/app/entrypoint.sh"]


In this article, we provided a simple, quick tutorial on how to dockerize a Django application. We first configured our container image in a Dockerfile, then we specified the services needed for our Django webapp to run within a docker-compose.yml, and finally we used docker-compose commands to build and start our service containers.
