
Map /app directory as volume for Celery worker? #106

Open · IanSaucy opened this issue Aug 17, 2020 · 3 comments
@IanSaucy commented Aug 17, 2020

Hey,

I've been working on a smallish API using this project and ran into an interesting quirk with the Celery worker. As of right now, the docker-compose file does not seem to map the source code directory (/app) into the container, which means the container either needs to be rebuilt or the code copied in again whenever you update any of the Celery tasks.

I'm wondering if it would be better to simply map the code as a volume, the same way it's done for the backend, so that updates are picked up automatically? This is the same thing that is done in the official FastAPI version as well, see here.
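
For illustration, a minimal sketch of what that mapping could look like, assuming the worker service and ./backend layout used elsewhere in this project:

  worker:
    volumes:
      - ./backend:/app  # same bind mount the backend service uses, so task changes show up without a rebuild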

On that note, it might also be good to add an auto-reload for the worker on file change, using something such as watchdog. If that were done, though, a production flag should probably be used to disable this behavior in production scenarios.

If either of these seems like a good fit, I'm happy to throw together a PR.

@Buuntu (Owner) commented Aug 17, 2020

Thanks for noticing this!

I like the first option you suggest; it seems the simplest and is probably sufficient for most cases. I would accept a PR to fix that. Watchdog may add unnecessary dependencies, and I'm not sure we should add it for everyone when only a few may benefit. Happy to add the watchdog suggestion to the README for people who need more active Celery reloading, though, perhaps in an FAQ-type section or similar?

@IanSaucy (Author)

Yeah, that's a good point about the extra dependency. I'll work on putting together a small PR for review that includes some of that info in the FAQ/documentation.

@gmagno commented Dec 21, 2020

In case someone else hits this issue, here's what I'm doing at the moment (for reference purposes only):

requirements.txt:

# it is not very orthodox to loosen dependency constraints,
# so take this with a pinch of salt; don't do this in production
alembic
Authlib
fastapi
celery==4.4.7  # downgrading celery to avoid issues with flower
redis
httpx
ipython
itsdangerous
Jinja2
psycopg2
pytest
requests
SQLAlchemy
starlette
uvicorn
passlib
bcrypt
sqlalchemy-utils
python-multipart
pyjwt

# watchdog and dependencies
pyyaml
argh
watchdog
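
For what it's worth, pyyaml and argh appear here because older watchdog releases need them for the watchmedo CLI, which is what wraps the Celery command in the compose file below. A quick sanity check after installing:

pip install -r requirements.txt
watchmedo --help  # the CLI installed alongside watchdog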

docker-compose.yml:

...

  worker:
    build:
      context: backend
      dockerfile: Dockerfile
    volumes:
      - ./backend:/app/:cached
    # command: celery --app app.tasks worker --loglevel=DEBUG -Q main-queue -c 1
    command: watchmedo auto-restart --directory=./ --pattern=*.py --recursive -- celery --app app.tasks worker --loglevel=DEBUG -Q main-queue -c 1

...
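
To check that the auto-restart actually fires, one option is to tail the worker logs and touch a mounted task module from the host (the service name and paths are assumed from the snippet above):

docker-compose up -d worker
docker-compose logs -f worker
# in another shell: any change to a mounted *.py file should make watchmedo restart the worker
touch backend/app/tasks.py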

In the Dockerfile, the COPY instruction for the source may also be commented out:

FROM python:3.8

RUN mkdir /app
WORKDIR /app

# system packages (postgresql-client, e.g. for psql against the database container)
RUN apt update && \
    apt install -y postgresql-client

# install Python dependencies first so this layer is cached across code changes
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt

# the source is bind-mounted by docker-compose, so copying it into the image is optional
# COPY . .
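
Note that with COPY . . commented out the image itself contains no application code, so the bind mount above becomes mandatory; changes to requirements.txt still require a rebuild:

docker-compose build worker
docker-compose up -d worker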

Please let me know if I missed something.
