Using Docker Compose with Django


This guide describes how to use Docker and Docker Compose to develop a Django app locally while synchronizing data with an active Platform.sh project.


These instructions are presented as a demonstration of using Docker with Platform.sh for local development. DDEV with the Platform.sh integration is, however, the recommended local development tool.

See the Django DDEV documentation for more information.


You will need:

  • A local copy of the project repository on your computer.

    You can get one by running platform get PROJECT_ID, or by cloning an integrated source repository and setting its remote with platform project:set-remote PROJECT_ID.

  • The Platform.sh CLI.

This example makes a few assumptions, which you may need to adjust for your own circumstances.

It assumes that you’ve already deployed a Django project on Platform.sh that has production data in a PostgreSQL database.

It’s assumed that the database has the following service definition in .platform/services.yaml:

    db:
        type: postgresql:12
        disk: 1024

The app is assumed to have the following relationship definition in .platform.app.yaml:

    relationships:
        database: "db:postgresql"
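On Platform.sh, the credentials for this relationship are exposed to the app at runtime through the PLATFORM_RELATIONSHIPS environment variable, which holds base64-encoded JSON. A minimal sketch of decoding it (the function name and the keys of the returned dictionary are illustrative, not part of any API):

```python
# Sketch: reading the "database" relationship inside a Platform.sh container.
import base64
import json


def relationship_settings(raw, name="database"):
    """Decode the base64-encoded JSON from PLATFORM_RELATIONSHIPS and
    return connection details for the named relationship."""
    relationships = json.loads(base64.b64decode(raw))
    endpoint = relationships[name][0]  # first endpoint for this relationship
    return {
        "HOST": endpoint["host"],
        "PORT": endpoint["port"],
        "NAME": endpoint["path"],
        "USER": endpoint["username"],
        "PASSWORD": endpoint["password"],
    }


# Inside a deployed container, you would call:
# relationship_settings(os.environ["PLATFORM_RELATIONSHIPS"])
```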

It’s assumed you want to run Django’s built-in lightweight development server with runserver. To match a production web server (such as Gunicorn or Daphne), modify the commands below accordingly.
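For example, to mimic production with Gunicorn, the web service command in the docker-compose.yaml below could be replaced with something like the following (myproject is a placeholder for your actual project module, and gunicorn is assumed to be listed in requirements.txt):

```yaml
command: >
  sh -c "python manage.py collectstatic --no-input &&
         python manage.py migrate &&
         gunicorn myproject.wsgi:application --bind 0.0.0.0:8000"
```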

Finally, this example mostly assumes that Platform.sh is the primary remote for the project. When using source integrations, the steps are identical in most cases; where they differ, the differences are noted.


Starting with a Dockerfile for Python 3.10:

    FROM python:3.10
    WORKDIR /code
    COPY . /code/
    RUN python -m pip install --upgrade pip
    RUN pip install -r requirements.txt

and a corresponding docker-compose.yaml file:

    services:
      db:
        image: postgres
        volumes:
          - ./data/db:/var/lib/postgresql/data
        environment:
          - POSTGRES_DB=postgres
          - POSTGRES_USER=postgres
          - POSTGRES_PASSWORD=postgres
      web:
        build: .
        command: >
          sh -c "python manage.py collectstatic --no-input &&
                 python manage.py migrate &&
                 python manage.py runserver 0.0.0.0:8000"
        volumes:
          - .:/code
        ports:
          - "8000:8000"
        environment:
          - POSTGRES_NAME=postgres
          - POSTGRES_USER=postgres
          - POSTGRES_PASSWORD=postgres
        depends_on:
          - db
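For the web service to pick up these environment variables, the Django settings need to read them. A minimal sketch of the database section of settings.py under these assumptions (default database alias, PostgreSQL backend, compose service named db):

```python
# settings.py (sketch): read connection details from the environment
# variables defined in docker-compose.yaml, with local defaults.
import os

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("POSTGRES_NAME", "postgres"),
        "USER": os.environ.get("POSTGRES_USER", "postgres"),
        "PASSWORD": os.environ.get("POSTGRES_PASSWORD", "postgres"),
        "HOST": "db",  # the db service name in docker-compose.yaml
        "PORT": 5432,
    }
}
```

Because HOST points at the compose service name, Django running in the web container reaches PostgreSQL over the network Compose creates between the two services.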

1. Create a new environment off of production.

platform branch new-feature main

If you’re working from a source integration, open a merge/pull request beforehand so that a corresponding environment is created. Otherwise, work from an existing staging environment, and sync to the merge/pull request environment once it’s been activated.

2. Retrieve the inherited production data.

platform db:dump -e new-feature

This creates a database dump file named with the format PROJECT_ID--PLATFORM_BRANCH--SERVICE_NAME--dump.sql. For merge/pull request environments, substitute the name of that environment (for example, pr-42) for new-feature.

3. Ignore this file in future commits.

echo "*--dump.sql" >> .gitignore

4. Build the image.

docker-compose build

5. Start the containers.

docker-compose up -d

Verify that the containers are running:

$ docker-compose images
Container    Repository    Tag       Image Id       Size  
django_db_1    postgres     latest   9f3ec01f884d   378.6 MB
django_web_1   django_web   latest   0065da03c239   1.063 GB

6. Import the dump into the database.

docker exec -i $(docker-compose ps -q db) psql -U postgres < DATABASE--DUMP--NAME--dump.sql


The $(docker-compose ps -q db) substitution retrieves the ID of the db service container, and the -i flag pipes the dump file to that container’s standard input.

7. Shut down the containers.

When finished with your work, shut down the containers.

docker-compose down


You now have a local development environment that’s in sync with the new-feature environment on Platform.sh. All of the steps above can be included in a script and committed to the repository, ensuring that each new developer contributing to the project can quickly set up an identical environment to work from.
