Payler

The Payload Spooler

Send your payload now, treat it later.

What is this?

Payler is an asyncio-based Python application that delays message processing. Its goal is to reduce the workload on your existing message broker (only RabbitMQ is currently supported, but other message brokers can easily be implemented) by putting payloads into a storage backend, which is then polled to re-inject the payloads into their corresponding destination.

Installation

Through pypi:

$ pip install payler

Through poetry:

$ git clone https://github.com/tbobm/payler
$ cd payler
$ poetry install

How to use this

Using the command line:

  1. Specify the input and output URLs for your drivers (see configuration)
  2. (optional) Customize the configuration to suit your needs (currently the example configuration is the only valid one)
  3. Run payler: payler --config-file configuration.yaml (see the example below)
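
For example, assuming the CLI honours the same BROKER_URL and MONGODB_URL environment variables as the Docker image described below (the exact way driver URLs are provided may differ), a run could look like this:

$ export BROKER_URL="amqp://payler:secret@my-broker/"
$ export MONGODB_URL="mongodb://payler:secret@my-mongo/payler"
$ payler --config-file configuration.yaml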

Using the docker image:

  1. Pull the Docker image: docker pull ghcr.io/tbobm/payler:latest
  2. (optional) Customize the configuration to suit your needs (currently the example configuration is the only valid one) and mount the configuration file into the container at /configuration.yaml
  3. Run the Docker image and provide the environment variables: docker run -d --name payler -e BROKER_URL="amqp://payler:secret@my-broker/" -e MONGODB_URL="mongodb://payler:secret@my-mongo/payler" ghcr.io/tbobm/payler
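
To try the whole stack locally, a docker-compose sketch along these lines can be a starting point. The service names, credentials, and the RabbitMQ/MongoDB images are assumptions chosen to match the docker run example above, not an official compose file shipped with the project:

---
services:
  rabbitmq:
    image: rabbitmq:3-management
    environment:
      RABBITMQ_DEFAULT_USER: payler
      RABBITMQ_DEFAULT_PASS: secret
  mongodb:
    image: mongo:6
  payler:
    image: ghcr.io/tbobm/payler:latest
    environment:
      BROKER_URL: "amqp://payler:secret@rabbitmq/"
      MONGODB_URL: "mongodb://mongodb/payler"
    volumes:
      - ./configuration.yaml:/configuration.yaml
    depends_on:
      - rabbitmq
      - mongodb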

Configuration

In order to configure the different workflows, payler uses a configuration file (see configuration.yml).

Example config file:

---
workflows:
  - name: "Fetch payloads from RabbitMQ and store them in MongoDB"
    location: "payler"
    callable: "client.process_queue"
  - name: "Re-injects payloads to RabbitMQ"
    callable: "client.watch_storage"

The workflows[].name attribute is currently unused, but will offer a more human-friendly way of getting informed about a workflow's state. The workflows[].location corresponds to the package where the workflows[].callable can be found. It defaults to payler, but it also offers a dumb and simple plugin mechanism: point it at your own package and create a function matching the following signature:

async def my_workflow(loop: asyncio.AbstractEventLoop) -> None:
    """My user-defined workflow."""
    # configure your driver(s)
    input_driver.serve()
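
Putting it together, such a function can live in its own package and be referenced from the configuration file. The my_plugins package below is purely hypothetical; only the (loop) -> None signature comes from the example above:

# my_plugins/workflows.py -- hypothetical plugin module
import asyncio


async def my_workflow(loop: asyncio.AbstractEventLoop) -> None:
    """My user-defined workflow."""
    # configure your driver(s) here, then keep serving until cancelled
    while True:
        # e.g. poll a storage backend or consume a queue
        await asyncio.sleep(1.0)

Assuming location and callable are resolved as a package name and a dotted path (as in the example configuration), the matching entry could look like:

---
workflows:
  - name: "Run my custom workflow"
    location: "my_plugins"
    callable: "workflows.my_workflow"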

Features

  • Forward messages between multiple datasources
  • Based on asyncio (benchmarks are on the roadmap)
  • Extend using your own implementation of the BaseDriver class

Drivers

Driver           Process                           Serve
BrokerManager    Send a Payload to a Queue         Consume a queue's messages
SpoolerManager   Store a Payload in a Collection   Fetch documents with specific reference data
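
A custom driver implements the two operations above. The class below is only a sketch: in real code you would inherit from payler's BaseDriver and match its actual signatures, which are not documented here, so treat every name as an assumption.

# Toy driver exposing the process/serve operations from the table above.
# In real code, subclass payler's BaseDriver (import omitted here because
# its exact path and interface are assumptions).
from typing import Any


class InMemorySpooler:
    """Illustrative driver keeping delayed payloads in a plain list."""

    def __init__(self) -> None:
        self._spool: list[Any] = []

    async def process(self, payload: Any) -> None:
        # store the payload in the backend (cf. the "Process" column)
        self._spool.append(payload)

    async def serve(self) -> None:
        # poll the backend and re-inject due payloads (cf. the "Serve" column)
        while self._spool:
            payload = self._spool.pop(0)
            print("re-injecting", payload)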

Testing

This project has unit tests written with pytest.

You can run the tests using:

poetry run pytest
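
If you write your own workflows, a test along these lines is a reasonable starting point; it assumes pytest-asyncio (or an equivalent asyncio plugin) is installed and reuses the hypothetical my_workflow from the configuration section above:

import asyncio

import pytest

from my_plugins.workflows import my_workflow  # hypothetical module


@pytest.mark.asyncio
async def test_my_workflow_can_be_cancelled():
    loop = asyncio.get_running_loop()
    task = asyncio.create_task(my_workflow(loop))
    await asyncio.sleep(0)  # let the workflow start
    task.cancel()
    with pytest.raises(asyncio.CancelledError):
        await task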

Contributing

Feel free to open new issues for feature requests and bug reports on the issue page, and even create PRs if you feel like it.

This project is linted with pylint, with some minor adjustments (see pyproject.toml).

Note

This side-project was born from the following:

  • I wanted to experiment with Python's asyncio
  • A friend of mine had issues with delaying lots of messages using RabbitMQ's delayed exchange plugin
  • I was looking for a concrete use-case to work with GitHub Actions.