Figuring out how to set up Django with Docker, PostgreSQL, and React can be a complex task, especially if you, like me, don’t really code much nowadays. The boilerplate needed to wire these technologies together is the first thing I forget with each re-immersion. This article serves as a reminder of how to kickstart a Python web project using Django, PostgreSQL, Docker, and React, so you don’t have to figure it all out each time you just want to build a simple website. While tools like cookiecutter-django offer a solid alternative, there is something invaluable about the hands-on approach that truly cements how these pieces fit together.
We’ll take an iterative approach, introducing new components with each section. First, we’ll set up a basic Django project using a virtual environment. We will then containerize the project with Docker, integrate PostgreSQL as a robust database solution, and finally, incorporate React to build a dynamic frontend. This step-by-step guide ensures a comprehensive understanding of how to set up Django with Docker, PostgreSQL, and React, creating a modern web application.
Prerequisites
Before diving into setting up your Django project, make sure you have the following prerequisites in place:
- Python: Ensure you have Python installed on your system. Download it from the official Python website.
- PyCharm (or another IDE): We’ll be using PyCharm in this tutorial, but you can use any other integrated development environment (IDE) of your choice.
- Pipenv: This tool helps manage project dependencies and virtual environments. Install Pipenv by running:
pip install pipenv
Setting Up the Project
With the prerequisites in place, let’s start setting up our Django project.
Set up Django
When starting a new Django project, the first step is to set up a solid foundation using a virtual environment. This isolates your project’s dependencies, ensuring that packages required for one project don’t interfere with others. We’ll use Pipenv, a tool that combines package management and virtual environment handling, to streamline this process.
Create a New PyCharm Project
- Name Your Project: Start by naming your project. This is the initial setup where you define your project’s identity.
- Select the Python Virtual Environment Tool: Choose Pipenv. This helps in managing dependencies and creating isolated environments.
- Choose the Base Interpreter: Select the Python version you want to use for your project.
About Pipenv
Your project will include a base Pipfile, which contains all the packages needed to run the project. Pipenv is a powerful tool that combines the functionality of virtual environments with dependency management, making it easier to maintain consistent development environments.
Install Django
With the virtual environment ready, the next step is to install Django. Open the terminal within PyCharm and run the following command:
pipenv install django
This command installs Django within the virtual environment, ensuring that your project has access to the necessary libraries.
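For reference, the Pipfile in your project root should now look roughly like this (the exact Python version and package pins will reflect your own setup):

[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
django = "*"

[dev-packages]

[requires]
python_version = "3.11"

Pipenv also writes a Pipfile.lock alongside it, pinning exact versions so the environment can be reproduced later.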
Create a Django Project
Now, let’s set up the initial structure of our Django project. Run these commands in the terminal:
django-admin startproject config .
./manage.py migrate
./manage.py runserver
By default, Django uses SQLite as the database backend. After running migrations, you’ll see a db.sqlite3 file in your project folder, indicating that the database is set up and ready to use. You can now access your locally running website at http://127.0.0.1:8000/.
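At this point the project layout should look roughly like this (PyCharm may add its own .idea folder as well):

.
├── Pipfile
├── Pipfile.lock
├── config/
│   ├── __init__.py
│   ├── asgi.py
│   ├── settings.py
│   ├── urls.py
│   └── wsgi.py
├── db.sqlite3
└── manage.py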
Next, we’re going to build on this project by using Docker, defining a container within which our project will deploy and run.
Set up Django with Docker
While Pipenv is great for managing dependencies and virtual environments, Docker takes things a step further by providing complete isolation for your application, ensuring it runs consistently across different environments. Docker allows you to package your application with all its dependencies, including the operating system, into a single container. This is particularly useful for development, testing, and deployment, as it eliminates the “it works on my machine” problem.
Install Docker
First, we need to install Docker:
- Download Docker: Get Docker from its official website.
- Follow the installation instructions: Install Docker based on your operating system.
- Ensure Docker is running: Docker should be running as a background process on your computer.
Create a Dockerfile
A Dockerfile is a script that contains a series of instructions on how to build a Docker image for your application. Create a file named Dockerfile at the root of your project with the following content:
# Use the official Python image from the Docker Hub
FROM python:3.11
# Prevents Python from writing .pyc files to disk
ENV PYTHONDONTWRITEBYTECODE=1
# Ensures that the output of the Python application is sent straight to the terminal
ENV PYTHONUNBUFFERED=1
# Set the working directory inside the Docker container
WORKDIR /app
# Install pipenv for managing dependencies and virtual environments
RUN pip install pipenv
# Ensure pipenv uses the specified Python version (Python 3.11) for the virtual environment
RUN pipenv --python /usr/local/bin/python3.11
# Copy the entire project directory to the working directory in the Docker container
COPY . /app
# Install the project dependencies from the Pipfile into the container's system Python
RUN pipenv install --system --deploy
- FROM: Specifies the base image (Python 3.11) to use.
- ENV PYTHONDONTWRITEBYTECODE: Prevents Python from writing .pyc files, keeping the container clean.
- ENV PYTHONUNBUFFERED: Ensures all output is sent directly to the terminal.
- WORKDIR: Sets the working directory inside the container to /app.
- COPY: Copies the current directory’s content to the /app directory inside the container.
- RUN: Installs Pipenv and the dependencies listed in the Pipfile.
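One optional refinement: since COPY . /app copies everything in the project directory into the image, it’s worth adding a .dockerignore file next to the Dockerfile so local artifacts (and, once we add the frontend tooling later, node_modules) stay out of the image. A minimal sketch:

.git
.idea
__pycache__
*.pyc
db.sqlite3
node_modules
staticfiles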
Build the Docker Image
To create the Docker image, run the following command from the root of your project:
docker build .
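This builds an untagged image, which is enough to confirm the Dockerfile works. If you want to refer to the image by name later, you can tag it (the name here is just an example):

docker build -t django-docker-demo .

Once we introduce Docker Compose below, it takes care of building and naming the image for you, so this standalone build is mostly a sanity check.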
Create docker-compose.yml
Docker Compose allows you to define and run multi-container Docker applications. Create a file named docker-compose.yml at the root of your project with the following content:
version: '3.8'
services:
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/app
    ports:
      - "8000:8000"
- version: Specifies the version of the Docker Compose file format.
- services: Defines the services that make up your application.
- web: Defines the Django application service:
  - build: Builds the Docker image using the Dockerfile in the current directory.
  - command: Runs the Django development server inside the container.
  - volumes: Mounts the current directory to the /app directory inside the container, enabling live code reloading.
  - ports: Maps port 8000 on the container to port 8000 on the host machine.
Start the Django Project in Docker
Finally, start the Docker container by running:
docker-compose up
What we’ve done here is define a Python-based Docker image for our project. We copy the project into that image, install the Python dependencies, and then start the development server inside the container. This setup ensures that your development environment is consistent and isolated, making it easier to manage dependencies and avoid conflicts.
By using Docker, we’ve created a consistent and reproducible environment for our Django project, ensuring it runs the same way on any system with Docker installed.
Managing Docker Containers
Here are some essential Docker commands to manage your containers and images:
Stopping the Docker Container
To stop the Docker container, you can use:
docker-compose down
This command stops and removes the containers defined in your docker-compose.yml file.
Rebuilding the Docker Image
You should rebuild the Docker image if you make changes to the Dockerfile or the dependencies listed in your Pipfile. To rebuild the image, run:
docker-compose up --build
This command rebuilds the Docker image and starts the containers.
Deleting Docker Containers, Images, and Volumes
To clean up Docker resources, you might need to remove containers, images, or volumes. Here are some useful commands:
Remove a specific container:
docker rm <container_id>
Remove a specific image:
docker rmi <image_id>
Remove a specific volume:
docker volume rm <volume_name>
Prune all stopped containers, dangling images, and unused networks:
docker system prune
Set up Django with Docker and PostgreSQL
To make your Django application production-ready, we need to add support for PostgreSQL, a robust and widely-used database system. Instead of installing PostgreSQL locally, we’ll run it within a Docker container, maintaining our isolated and consistent development environment.
Install PostgreSQL Drivers in Docker
To enable Django to communicate with PostgreSQL, we need to install the PostgreSQL drivers. With the Docker container up and running, execute the following:
docker-compose exec web pipenv install psycopg2-binary
This installs psycopg2-binary, a PostgreSQL database adapter for Python, into the running container. Because the project directory is bind-mounted into the container, the updated Pipfile and Pipfile.lock also appear on your host machine, so the new dependency will be baked in the next time you rebuild the image.
Add PostgreSQL Support to the Docker Services
Update docker-compose.yml to include a db service and, so that your PostgreSQL data persists even if the container is stopped or removed, a named volume for the database files:
version: '3.8'
services:
  db:
    image: postgres
    environment:
      - "POSTGRES_HOST_AUTH_METHOD=trust"
    volumes:
      - postgres_data:/var/lib/postgresql/data
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/app
    ports:
      - "8000:8000"
    depends_on:
      - db
volumes:
  postgres_data:
- db: Defines the PostgreSQL service using the official PostgreSQL image.
- environment: Sets POSTGRES_HOST_AUTH_METHOD to trust for simplicity; this is fine for local development, but don’t rely on trust authentication in production.
- depends_on: Starts the db service before the web service. Note that this only controls start order, not database readiness; we’ll handle readiness later with an entrypoint script.
- volumes: Mounts a volume named postgres_data to persist data in /var/lib/postgresql/data inside the container.
Update Django Settings for PostgreSQL
Next, configure Django to use PostgreSQL. In config/settings.py, update the DATABASES setting:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'db',
        'PORT': '5432',
    }
}
This configuration tells Django to connect to the PostgreSQL database running in the db container; HOST is set to db because that is the service name Docker Compose makes resolvable on its internal network.
Rebuild and Start Docker Containers
After making these changes, we need to rebuild the Docker image and restart the containers:
docker-compose down
docker-compose up --build
Run Django Migrations and Create Superuser
Apply the database migrations and create a superuser to access the Django admin interface:
docker-compose exec web python manage.py migrate
docker-compose exec web python manage.py createsuperuser
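If you want to verify the database directly, you can open a psql shell inside the db container (no password is needed here because of the trust authentication setting):

docker-compose exec db psql -U postgres

From there, \l lists the databases, \dt lists tables, and \q exits.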
By adding PostgreSQL support, we’ve enhanced our Django project, making it more suitable for production environments. Running PostgreSQL in a Docker container ensures that our database setup remains consistent and isolated, just like our application environment.
Now you should be able to log in to the admin portal of your Dockerized site at http://localhost:8000/admin/ using the superuser credentials you just created.
Set up Django with Docker, PostgreSQL and React
In this section, we will configure our Django application to work with React. This involves setting up JavaScript and assets, compiling them into a bundle, and serving them from our development server.
Install Node and NPM
Before setting up JavaScript and React, you’ll need to have Node.js and npm installed on your system. npm is the package manager for Node.js and is required for installing JavaScript dependencies.
First, download Node.js: Visit the official Node.js website and download the LTS (Long-Term Support) version for your operating system. This version includes npm.
Then, install Node.js, following the instructions for your operating system. On Windows, run the installer and follow the prompts. On macOS, download the .pkg installer and follow the prompts, or alternatively install with Homebrew (brew install node).
Set Up JavaScript
First, create a new subfolder, assets/, under the root of your project to hold your JavaScript source files.
Then, run the following commands in the root folder to set up the necessary Node packages and modules:
npm init -y
npm install webpack webpack-cli --save-dev
Create a Sample JavaScript File
Create assets/index.js with the following content:
function component() {
  const element = document.createElement('div');
  element.innerHTML = 'Hello Beautiful';
  return element;
}

document.body.appendChild(component());
This basic JavaScript file will serve as our entry point for Webpack.
Create Webpack Configuration File
Create webpack.config.js with the following content:
const path = require('path');

module.exports = {
  entry: './assets/index.js',
  output: {
    filename: 'index-bundle.js',
    path: path.resolve(__dirname, './static'),
  },
};
This configuration tells Webpack to take index.js as the entry point and output the bundled file as index-bundle.js in the static directory.
Update package.json
Add the following script to package.json:
"scripts": {
"dev": "webpack --mode development --watch"
}
This script runs Webpack in development mode and watches for changes to recompile automatically.
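While you’re in package.json, it can also be handy to add a one-off production build script; the name build is conventional but otherwise just an example:

"scripts": {
  "dev": "webpack --mode development --watch",
  "build": "webpack --mode production"
}

npm run build emits a minified bundle once and exits, which is useful right before building the Docker image later on.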
Run Webpack
To compile the JavaScript files, run:
npm run dev
Configure Django to Use Static Files
Update the Django settings file to include the configuration for static files:
import os

STATIC_URL = '/static/'
STATICFILES_DIRS = [
    os.path.join(BASE_DIR, 'static'),
]
STATIC_ROOT = os.path.join(BASE_DIR, 'staticfiles')
This tells Django where to find and serve static files.
Install Babel
To enable ES6 and JSX syntax, install Babel:
npm install --save-dev babel-loader @babel/core @babel/preset-env @babel/preset-react
Update Webpack Configuration for Babel
Update webpack.config.js to include Babel configuration:
const path = require('path');

module.exports = {
  entry: './assets/index.js',
  output: {
    filename: 'index-bundle.js',
    path: path.resolve(__dirname, './static'),
  },
  module: {
    rules: [
      {
        test: /\.(js|jsx)$/,
        exclude: /node_modules/,
        loader: 'babel-loader',
        options: { presets: ['@babel/preset-env', '@babel/preset-react'] },
      },
      {
        test: /\.css$/,
        use: [
          'style-loader',
          {
            loader: 'css-loader',
            options: {
              modules: true,
            },
          },
        ],
      },
    ],
  },
};
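One caveat: the CSS rule above relies on style-loader and css-loader, which the Babel command does not install. If you keep that rule (or want to import CSS from your JavaScript), install them as dev dependencies too; otherwise you can simply omit the CSS rule:

npm install --save-dev style-loader css-loader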
Install React
Finally, install React and ReactDOM:
npm install --save react react-dom
By following these steps, you have configured your Django project to work seamlessly with React, allowing you to build a modern front-end application while leveraging Django’s robust back-end capabilities. This setup ensures your JavaScript files are compiled and served efficiently, providing a smooth development experience.
Ok, But What’s Node??
As a more traditional ‘back-end’ developer, I find some of this newer JavaScript stack genuinely confusing. Here’s one way of thinking about it.
Node.js is a runtime environment that allows you to run JavaScript on the server side, essential for tools like npm, Webpack, and Babel. npm is the package manager for Node.js, used to install and manage dependencies. Webpack bundles your JavaScript files and assets, compiling them into a single file. Babel transforms modern JavaScript and JSX syntax into a format compatible with older browsers. React is a library for building user interfaces. Together, these tools enable you to create a modern front-end application with React, manage dependencies efficiently, and ensure compatibility, while your Django backend serves the bundled output.
- Node.js is like the Python interpreter itself. Just as Node.js runs JavaScript code, the Python interpreter runs Python code.
- npm (Node Package Manager) is similar to pip (Python Package Installer). Both are used to install and manage libraries and dependencies for their respective languages.
- Webpack is analogous to Django’s staticfiles app or tools like whitenoise. While Webpack bundles JavaScript and assets, Django’s staticfiles app collects and serves static files (CSS, JavaScript, images).
- Babel is comparable to transpilers or compilers in other contexts, but for Python, you might think of tools like 2to3, which convert Python 2 code to Python 3, ensuring compatibility.
- React is akin to Django templates or Jinja2 templates, but more interactive. While Django templates render HTML on the server side, React components render dynamic interfaces on the client side.
I found this article on Modern JavaScript for Django Developers critically helpful along the way.
Testing the React Integration
In this section, we’ll test that React is working by creating a new URL endpoint, setting up a basic HTML template, and configuring Django to serve the bundled JavaScript file.
Create a New URL Endpoint
First, add a new URL endpoint in your Django project’s urls.py file to serve a template where React will be mounted:
from django.urls import path
from django.views.generic import TemplateView

urlpatterns = [
    path('base', TemplateView.as_view(template_name='base.html')),
    # other paths...
]
This creates a new URL endpoint base that renders the base.html template.
Create the Base HTML Template
Next, create the base.html template in your templates directory. If you don’t already have a templates folder, create one in the root of your project. Inside the templates folder, create a file named base.html with the following content:
{% load static %}
<!doctype html>
<html>
  <head>
    <title>Why is this so complicated?</title>
  </head>
  <body>
    <div id="root"></div>
    <script src="{% static 'index-bundle.js' %}"></script>
  </body>
</html>
This HTML template loads the index-bundle.js file that Webpack generates and provides a div with the id root where React will mount.
Configure Django to Look for Templates
Ensure that Django knows where to look for your templates. In your project’s settings.py file, add the templates directory to the TEMPLATES setting:
TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': [BASE_DIR / 'templates'],  # Make sure this path is correct
        'APP_DIRS': True,
        'OPTIONS': {
            'context_processors': [
                'django.template.context_processors.debug',
                'django.template.context_processors.request',
                'django.contrib.auth.context_processors.auth',
                'django.contrib.messages.context_processors.messages',
            ],
        },
    },
]
This configuration tells Django to look for templates in the templates directory.
Update Dockerfile to Collect Static Files
Finally, update your Dockerfile to collect static files when building the Docker image. Add the following to the end of your Dockerfile:
# Other RUN commands...
# Collect static files
RUN python manage.py collectstatic --noinput
This command collects all static files into the location specified by STATIC_ROOT. Note that it can only pick up the Webpack bundle if the bundle has already been built, so run your Webpack build before building the Docker image.
Verify the Setup
Rebuild and Start Docker Containers: After making these changes, rebuild your Docker image and start the containers:
docker-compose down
docker-compose up --build
Access the New Endpoint: Open your web browser and navigate to http://localhost:8000/base. You should see a page with “Hello Beautiful” displayed, indicating that the React integration is working correctly.
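Note that assets/index.js is still plain JavaScript at this point, so this really verifies the Webpack pipeline rather than React itself. To exercise React, you could replace assets/index.js with a minimal component along these lines (a sketch assuming React 18 or newer, which a fresh npm install gives you); rerun npm run dev and reload the page:

import React from 'react';
import { createRoot } from 'react-dom/client';

// A minimal component rendering the same greeting as before
function App() {
  return <div>Hello Beautiful</div>;
}

// Mount the component into the root div provided by base.html
const root = createRoot(document.getElementById('root'));
root.render(<App />);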
Development Workflow
Here’s a sample workflow for using all this stuff to develop.
- Run npm run dev locally:
  - This starts Webpack in watch mode, so it monitors your JavaScript files for changes and automatically rebuilds them.
  - The generated files are placed in the /static directory, as specified in webpack.config.js.
- Mount the code into Docker:
  - Since your code is mounted into Docker (as specified in the docker-compose.yml file), any changes made locally are reflected inside the Docker container.
  - This includes the updated JavaScript files in the /static directory.
- Docker setup:
  - Ensure Docker is running and your containers are up by using docker-compose up.
There are likely many alternatives to this setup. One, for example, would be having Docker build the JavaScript bundle, incorporating the build steps into the Dockerfile and ensuring the Docker container runs Webpack. I readily admit I have been unable to make this work!
Auto-Run Migrations
Adding an entrypoint script to handle database readiness and run migrations before starting the Django server is a good practice. It ensures that your application doesn’t start before the database is ready and that any necessary migrations are applied automatically. Here’s how you can integrate this into your Docker setup:
Update Dockerfile
Add the following lines to the end of your Dockerfile:
# Install netcat, used by the entrypoint script to check when PostgreSQL is accepting connections
RUN apt-get update \
    && apt-get install -y netcat-openbsd
# Copy the entrypoint script
COPY entrypoint.sh /entrypoint.sh
# Make the entrypoint script executable
RUN chmod +x /entrypoint.sh
# Set the entrypoint script to run when the container starts
ENTRYPOINT ["/entrypoint.sh"]
Create the entrypoint.sh File
Create a file named entrypoint.sh in the root of your project with the following content:
#!/bin/sh
# Wait for the database to be ready
echo "Waiting for PostgreSQL to start..."
while ! nc -z db 5432; do
  sleep 0.1
done
echo "PostgreSQL started"
# Run Django migrations
echo "Running migrations"
python manage.py migrate --noinput
# Start the Django app
exec "$@"
By adding the entrypoint script, you ensure that the Django server waits for the PostgreSQL database to be ready and that migrations are applied before starting the application. This approach improves the robustness and reliability of your Dockerized Django project.
Debugging
There’s a good chance you’ll want to run your Python code directly from your IDE, for example when debugging, while still connecting to the PostgreSQL database running inside a Docker container. Here’s how you can set this up.
Expose PostgreSQL in Docker
First, ensure your docker-compose.yml file is configured to expose the PostgreSQL service. This setup allows your local environment to access the database running in the Docker container.
version: '3.8'
services:
  db:
    image: postgres
    environment:
      - "POSTGRES_HOST_AUTH_METHOD=trust"
    ports:
      - "5433:5432"  # Expose the PostgreSQL port
    volumes:
      - postgres_data:/var/lib/postgresql/data
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/app
    ports:
      - "8000:8000"
    depends_on:
      - db
volumes:
  postgres_data:
This configuration maps the container’s PostgreSQL port 5432 to port 5433 on localhost, making the database accessible to applications running directly on your machine while avoiding a clash with any PostgreSQL instance already installed locally.
Update Django Settings
Next, update your Django settings to connect to the PostgreSQL database exposed on localhost when running locally. We’ll use an environment variable to accomplish this. First, add a new run/debug configuration in PyCharm that executes manage.py runserver and sets an environment variable, DJANGO_ENV=debug.
Then, in your settings file, update the database settings to look for this new environment variable:
import os

DJANGO_ENV = os.environ.get('DJANGO_ENV', 'production')  # Defaults to 'production' if not set

if DJANGO_ENV == 'debug':
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': 'postgres',
            'USER': 'postgres',
            'PASSWORD': 'postgres',
            'HOST': 'localhost',
            'PORT': '5433',  # Use port 5433 for local debug environment
        }
    }
else:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': 'postgres',
            'USER': 'postgres',
            'PASSWORD': 'postgres',
            'HOST': 'db',  # Docker Compose service name as the host
            'PORT': '5432',  # Default PostgreSQL port
        }
    }
This tells Django to connect to the PostgreSQL database on localhost:5433 when running in debug mode.
Run the PostgreSQL Container
Start just the PostgreSQL container using Docker Compose. Run the following command:
docker-compose up db
This command starts the PostgreSQL container without starting the Django container.
Run and Debug
Some IDEs will let you attach to a running service remotely. In this particular example, though, we’re going to run a new instance locally and point it at the database in Docker. Assuming a clean local virtual environment, you may need to install the database driver locally (or simply run pipenv install again, since the Pipfile already lists it):
pip install psycopg2-binary
Now you can run your Django project from the IDE. It will connect to the PostgreSQL database running inside the Docker container. You can also use the IDE’s debugging tools to set breakpoints and debug your application as usual.
Summary
In this guide, we’ve walked through how to set up Django with Docker, PostgreSQL, and React. Starting from the basics of project setup with Pipenv and Django, we moved on to containerizing our application with Docker for consistent development and production environments. We then upgraded our database to PostgreSQL, and integrated React, utilizing Webpack and Babel to manage our JavaScript assets efficiently.
It’s a lot to sort out when all you really wanted was a website. I hope someone finds this useful!