How to use Celery and Docker to handle periodic tasks in Django

As you build and scale your Django applications, you’ll inevitably need to run certain tasks automatically and regularly in the background.

Some examples:

Generate regular reports

Clear the cache

Send bulk email notifications

Perform nightly maintenance

Task scheduling is one of the few features required for building and extending web applications that is not part of Django core. Fortunately, Celery provides a powerful solution that is fairly easy to implement, called Celery Beat.

In the following article, we will show you how to set up Django, Celery, and Redis with Docker to periodically run custom Django Admin commands via Celery Beat.

Dependencies:

Django v3.0.5

Docker v19.03.8

Python v3.8.2

Celery v4.4.1

Redis v5.0.8

Django + Celery series:

Asynchronous tasks with Django and Celery

Handling Periodic Tasks in Django with Celery and Docker (This article!)

Objectives

By the end of this tutorial, you should be able to:

Containerize Django, Celery, and Redis with Docker

Integrate Celery into a Django application and create tasks

Write custom Django Admin commands

Schedule custom Django Admin commands to run periodically via Celery Beat

Project Setup

Clone the base project from the django-celery-beat repository and check out the base branch:

$ git clone https://github.com/testdrivenio/django-celery-beat --branch base --single-branch
$ cd django-celery-beat

Since we have a total of four processes to manage (Django, Redis, a Celery worker, and the Celery Beat scheduler), we'll use Docker to simplify the workflow by wiring them together so that they can all be run from a single terminal window with a single command.

From the project root, build the images and spin up the Docker containers, then apply the migrations:

$ docker-compose up -d --build
$ docker-compose exec web python manage.py migrate

Once the build is complete, navigate to http://localhost:1337 to ensure the app runs as expected. You should see the following text:

Orders
No orders found!

Project Structure:

├── .gitignore
├── docker-compose.yml
└── project
    ├── Dockerfile
    ├── core
    │   ├── __init__.py
    │   ├── asgi.py
    │   ├── settings.py
    │   ├── urls.py
    │   └── wsgi.py
    ├── entrypoint.sh
    ├── manage.py
    ├── orders
    │   ├── __init__.py
    │   ├── admin.py
    │   ├── apps.py
    │   ├── migrations
    │   │   ├── 0001_initial.py
    │   │   └── __init__.py
    │   ├── models.py
    │   ├── tests.py
    │   ├── urls.py
    │   └── views.py
    ├── requirements.txt
    └── templates
        └── orders
            └── order_list.html

Celery and Redis

Now, we need to add containers for Celery, Celery Beat, and Redis.

First, add the dependencies to the requirements.txt file:

Django==3.0.5
celery==4.4.1
redis==3.4.1

Next, add the following new services to the docker-compose.yml file:

redis:
  image: redis:alpine
celery:
  build: ./project
  command: celery -A core worker -l info
  volumes:
    - ./project/:/usr/src/app/
  environment:
    - DEBUG=1
    - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
    - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
  depends_on:
    - redis
celery-beat:
  build: ./project
  command: celery -A core beat -l info
  volumes:
    - ./project/:/usr/src/app/
  environment:
    - DEBUG=1
    - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
    - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
  depends_on:
    - redis

We also need to update the depends_on section of the web service:

web:
  build: ./project
  command: python manage.py runserver 0.0.0.0:8000
  volumes:
    - ./project/:/usr/src/app/
  ports:
    - 1337:8000
  environment:
    - DEBUG=1
    - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
    - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
  depends_on:
    - redis # NEW

The complete docker-compose file is as follows:

version: '3.7'

services:
  web:
    build: ./project
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./project/:/usr/src/app/
    ports:
      - 1337:8000
    environment:
      - DEBUG=1
      - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
      - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
    depends_on:
      - redis
  redis:
    image: redis:alpine
  celery:
    build: ./project
    command: celery -A core worker -l info
    volumes:
      - ./project/:/usr/src/app/
    environment:
      - DEBUG=1
      - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
      - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
    depends_on:
      - redis
  celery-beat:
    build: ./project
    command: celery -A core beat -l info
    volumes:
      - ./project/:/usr/src/app/
    environment:
      - DEBUG=1
      - SECRET_KEY=dbaa1_i7%*3r9-=z-+_mz4r-!qeed@(-a_r(g@k8jo8y3r27%m
      - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
    depends_on:
      - redis

Before building the new containers, we need to configure Celery in our Django application.

Celery Configuration

Setup

In the "core" directory, create a celery.py file and add the following code:

import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "core.settings")

app = Celery("core")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()

What's happening here?

First, we set a default value for the DJANGO_SETTINGS_MODULE environment variable so that Celery knows how to find the Django project.

Next, we created a new Celery instance with the name core and assigned that value to a variable called app.

We then loaded the Celery configuration values from the settings object in django.conf. We used namespace="CELERY" to prevent clashes with other Django settings; in other words, all configuration settings for Celery must be prefixed with CELERY_. For example, the broker URL is read from CELERY_BROKER_URL rather than BROKER_URL.

Finally, app.autodiscover_tasks() tells Celery to look for tasks in the applications defined in settings.INSTALLED_APPS.

Add the following code to core/__init__.py to make sure the Celery app is loaded when Django starts, so that tasks decorated with @shared_task use it:

from .celery import app as celery_app
 
__all__ = ("celery_app",)

Finally, update the core/settings.py file with the following Celery settings so that it can connect to Redis:

CELERY_BROKER_URL = "redis://redis:6379"
CELERY_RESULT_BACKEND = "redis://redis:6379"
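
As an aside, since the compose file already injects configuration through environment variables, these URLs could be read from the environment as well. A minimal sketch, assuming you add matching variables to docker-compose.yml (the variable names below are our assumption, not part of the project):

import os

# fall back to the compose service name if the variables aren't set
CELERY_BROKER_URL = os.environ.get("CELERY_BROKER_URL", "redis://redis:6379")
CELERY_RESULT_BACKEND = os.environ.get("CELERY_RESULT_BACKEND", "redis://redis:6379")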

Build the new containers:

$ docker-compose up -d --build

View the logs:

$ docker-compose logs 'web'
$ docker-compose logs 'celery'
$ docker-compose logs 'celery-beat'
$ docker-compose logs 'redis'

If all goes well, we now have four containers, each providing a different service.

Now we are ready to create a sample task to see if it works properly.

Create a task

Create a new file core/tasks.py and add the following code for an example task that just prints to the console:

from celery import shared_task


@shared_task
def sample_task():
    print("The sample task just ran.")

Scheduling tasks

At the end of the settings.py file, add the following code to schedule the sample_task to run every minute using Celery Beat:

CELERY_BEAT_SCHEDULE = {
    "sample_task": {
        "task": "core.tasks.sample_task",
        "schedule": crontab(minute="*/1"),
    },
}

Here we defined a periodic task using the CELERY_BEAT_SCHEDULE setting. We named the task sample_task and declared two settings:

task declares which task to run.

schedule sets the interval on which the task should run. This can be an integer, a timedelta, or a crontab. We used a crontab pattern to tell it to run every minute. You can find more information about Celery's schedules here.
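
For illustration, the other schedule forms might look like this (the entry names below are hypothetical, not part of the project):

from datetime import timedelta

from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    # run every 30 seconds, using a plain number of seconds
    "every_thirty_seconds": {
        "task": "core.tasks.sample_task",
        "schedule": 30.0,
    },
    # run every 5 minutes, using a timedelta
    "every_five_minutes": {
        "task": "core.tasks.sample_task",
        "schedule": timedelta(minutes=5),
    },
    # run weekdays at 7:30 am, using a more specific crontab
    "weekday_mornings": {
        "task": "core.tasks.sample_task",
        "schedule": crontab(hour=7, minute=30, day_of_week="mon-fri"),
    },
}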

Make sure to add the imports to settings.py:

from celery.schedules import crontab

import core.tasks

The core.tasks import ensures the task module is registered with Celery: core is the project's settings package, not an app in INSTALLED_APPS, so autodiscover_tasks() won't pick it up on its own.

Restart the container to apply the changes:

$ docker-compose up -d --build

View the logs:

$ docker-compose logs -f 'celery'
celery_1 | -------------- [queues]
celery_1 | .> celery exchange=celery(direct) key=celery
celery_1 |
celery_1 |
celery_1 | [tasks]
celery_1 | . core.tasks.sample_task

We can see that Celery got the sample task core.tasks.sample_task.
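
You can also trigger the task on demand from the Django shell; delay() is Celery's standard way to enqueue a task (this quick check is our addition, not part of the original walkthrough):

$ docker-compose exec web python manage.py shell
>>> from core.tasks import sample_task
>>> sample_task.delay()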

Every minute, you should see a line ending with "The sample task just ran." in the log:

celery_1 | [2020-04-15 22:49:00,003: INFO/MainProcess]
Received task: core.tasks.sample_task[8ee5a84f-c54b-4e41-945b-645765e7b20a]
celery_1 | [2020-04-15 22:49:00,007: WARNING/ForkPoolWorker-1] The sample task just ran.

Customizing Django Admin Commands

Django provides many built-in django-admin commands, such as:

migrate

startproject

startapp

dumpdata

makemigrations

In addition to built-in commands, Django also provides us with the option to create our own custom commands:

Custom management commands are particularly useful for running stand-alone scripts or scripts that are executed periodically from a UNIX crontab or Windows Scheduled Tasks control panel.

Therefore, we will first configure a new command and then automatically run it using Celery Beat.

First, create a new file called orders/management/commands/my_custom_command.py (per the Django docs, both the management and commands directories need an __init__.py file so Python treats them as packages). Then, add the minimal code needed to run it:

from django.core.management.base import BaseCommand, CommandError


class Command(BaseCommand):
    help = "A description of the command"

    def handle(self, *args, **options):
        pass

BaseCommand has a few methods that can be overridden, but the only required method is handle. handle is the entry point of the custom command. In other words, when we run a command, this method will be called.
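
As an aside, add_arguments is another commonly overridden BaseCommand hook. A hypothetical sketch (not part of this project) of a command that accepts an option:

from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Greet someone from the command line"

    def add_arguments(self, parser):
        # parser is a standard argparse parser
        parser.add_argument("--name", default="world")

    def handle(self, *args, **options):
        # parsed arguments arrive in the options dict
        self.stdout.write(f"Hello, {options['name']}!")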

For testing purposes, we'd normally just add a quick print statement. However, the Django documentation recommends using stdout.write instead:

When you are using management commands and wish to provide console output, you should write to self.stdout and self.stderr, instead of printing to stdout and stderr directly. By using these proxies, it becomes much easier to test your custom command. Note also that you don't need to end messages with a newline character; it will be added automatically, unless you specify the ending parameter.

So, add a self.stdout.write command:

from django.core.management.base import BaseCommand, CommandError


class Command(BaseCommand):
    help = "A description of the command"

    def handle(self, *args, **options):
        self.stdout.write("My sample command just ran.")  # NEW

Test it out:

$ docker-compose exec web python manage.py my_custom_command
My sample command just ran.

With that, let’s tie it all together!

Scheduling custom commands with Celery Beat

Now that we've spun up the containers, verified that we can schedule a task to run periodically, and written a sample custom Django Admin command, it's time to configure Celery Beat to run the custom command periodically.

Setup

The project includes a very basic app called orders, with two models, Product and Order. Let's create a custom command that sends an email report of the orders confirmed during the current day.
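
The models themselves ship with the cloned base project and aren't reproduced in this article. A minimal sketch consistent with the report command below, with field names that are assumptions on our part:

# orders/models.py -- hypothetical sketch; the real models come with
# the base project
import uuid

from django.db import models


class Product(models.Model):
    name = models.CharField(max_length=128)


class Order(models.Model):
    id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
    product = models.ForeignKey(Product, on_delete=models.CASCADE)
    confirmed_date = models.DateTimeField(blank=True, null=True)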

First, we'll add some products to the database via the fixture included in the project:

$ docker-compose exec web python manage.py loaddata products.json

Create a superuser:

$ docker-compose exec web python manage.py createsuperuser

When prompted, fill in a username, email, and password. Then navigate to http://127.0.0.1:1337/admin in your web browser, log in with the superuser you just created, and create a few orders. Make sure at least one has a confirmed date of today.

Let's create a new custom command for our email report.

Create a file called orders/management/commands/email_report.py:

from datetime import timedelta, time, datetime

from django.core.mail import mail_admins
from django.core.management import BaseCommand
from django.utils import timezone
from django.utils.timezone import make_aware

from orders.models import Order

today = timezone.now()
tomorrow = today + timedelta(1)
today_start = make_aware(datetime.combine(today, time()))
today_end = make_aware(datetime.combine(tomorrow, time()))


class Command(BaseCommand):
    help = "Send Today's Orders Report to Admins"

    def handle(self, *args, **options):
        orders = Order.objects.filter(confirmed_date__range=(today_start, today_end))

        if orders:
            message = ""

            for order in orders:
                message += f"{order} \n"

            subject = (
                f"Order Report for {today_start.strftime('%Y-%m-%d')} "
                f"to {today_end.strftime('%Y-%m-%d')}"
            )

            mail_admins(subject=subject, message=message, html_message=None)

            self.stdout.write("E-mail Report was sent.")
        else:
            self.stdout.write("No orders confirmed today.")

In the code, we queried the database for orders with a confirmed_date falling within today's range, combined them into a single message for the email body, and then sent the email to the site admins using Django's built-in mail_admins helper.

In the settings file, add a dummy admin email and set EMAIL_BACKEND to the console backend so that emails are written to stdout:

EMAIL_BACKEND = "django.core.mail.backends.console.EmailBackend"
DEFAULT_FROM_EMAIL = "[email protected]"
ADMINS = [("testuser", "[email protected]"), ]

Run the command:

$ docker-compose exec web python manage.py email_report
Content-Type: text/plain; charset="utf-8"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
Subject: [Django] Order Report for 2020-04-15 to 2020-04-16
From: root@localhost
To: [email protected]
Date: Wed, 15 Apr 2020 23:10:45 -0000
Message-ID: <158699224565.85.8278261495663971825@5ce6313185d3>

Order: 337ef21c-5f53-4761-9f81-07945de385ae - product: Rice

-------------------------------------------------------------------------------
E-mail Report was sent.

Celery Beat

Now, we need to create a periodic task to run this command every day.

Add a new task to core/tasks.py:

from celery import shared_task
from django.core.management import call_command  # NEW


@shared_task
def sample_task():
    print("The sample task just ran.")


# NEW
@shared_task
def send_email_report():
    call_command("email_report")

First, we added a call_command import, which is used to call django-admin commands programmatically. Then, in the new task, we passed the name of our custom command to call_command as an argument.
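
If a command takes arguments or options, call_command can forward them as well. For example (email_report itself takes no arguments; the loaddata call just mirrors the fixture step from earlier):

from django.core.management import call_command

# positional arguments pass through as *args,
# command options as keyword arguments
call_command("email_report")
call_command("loaddata", "products.json", verbosity=0)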

To schedule this task, open the core/settings.py file and update the CELERY_BEAT_SCHEDULE setting to include the new task:

CELERY_BEAT_SCHEDULE = {
    "sample_task": {
        "task": "core.tasks.sample_task",
        "schedule": crontab(minute="*/1"),
    },
    "send_email_report": {
        "task": "core.tasks.send_email_report",
        "schedule": crontab(hour="*/1"),
    },
}

Here, we added a new entry to CELERY_BEAT_SCHEDULE called send_email_report. As with the previous task, we declared which task to run (core.tasks.send_email_report) and set the recurrence with a crontab pattern. Note that crontab(hour="*/1") keeps the default minute="*", so this entry actually fires every minute of every hour; to run once at the top of each hour, you would use crontab(minute=0, hour="*/1").

Restart the container to ensure the new settings are active:

$ docker-compose up -d --build

View the logs:

$ docker-compose logs -f 'celery'
celery_1 | -------------- [queues]
celery_1 | .> celery exchange=celery(direct) key=celery
celery_1 |
celery_1 |
celery_1 | [tasks]
celery_1 | . core.tasks.sample_task
celery_1 | . core.tasks.send_email_report

One minute later the email was sent:

celery_1 | [2020-04-15 23:20:00,309: WARNING/ForkPoolWorker-1] Content-Type: text/plain; charset="utf-8"
celery_1 | MIME-Version: 1.0
celery_1 | Content-Transfer-Encoding: 7bit
celery_1 | Subject: [Django] Order Report for 2020-04-15 to 2020-04-16
celery_1 | From: root@localhost
celery_1 | To: [email protected]
celery_1 | Date: Wed, 15 Apr 2020 23:20:00 -0000
celery_1 | Message-ID: <158699280030.12.8934112422500683251@42481c198b77>
celery_1 |
celery_1 | Order: 337ef21c-5f53-4761-9f81-07945de385ae - product: Rice
celery_1 | [2020-04-15 23:20:00,310: WARNING/ForkPoolWorker-1] -------------------------------------------------------------------------------
celery_1 | [2020-04-15 23:20:00,312: WARNING/ForkPoolWorker-1] E-mail Report was sent.

Conclusion

In this article, we walked through setting up Docker containers for Django, Celery, Celery Beat, and Redis. We then showed how to write a custom Django Admin command and schedule it to run automatically with Celery Beat.

Original article: https://testdriven.io/blog/django-celery-periodic-tasks/


