Text-Classification with FastAPI / Streamlit / Docker Compose

This project showcases how to use FastAPI and Streamlit in tandem. The trained text classification model is served through a FastAPI REST service packaged in its own Docker container, while the front-end UI is built with Streamlit and runs in a second container.

Both containers are spun up together with Docker Compose.
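
As a rough sketch, a two-service Compose file for this kind of setup might look like the following. The service names, build contexts, and the API port (8000) are assumptions made for illustration only; the docker-compose.yml in this repo is the source of truth. The Streamlit port 8501 matches the URL used later in this README.

```yaml
# Illustrative sketch of a two-service Compose file for this architecture.
# Service names, build contexts, and the API port are assumptions, not the
# repo's actual configuration.
version: "3"

services:
  fastapi:
    build: ./backend        # Dockerfile for the FastAPI model service (assumed path)
    ports:
      - "8000:8000"         # REST API exposed for the UI to call (assumed port)

  streamlit:
    build: ./frontend       # Dockerfile for the Streamlit UI (assumed path)
    ports:
      - "8501:8501"         # Streamlit's default port, used in the URL below
    depends_on:
      - fastapi             # start the API service before the UI
```

Because both services share the default Compose network, the Streamlit container can reach the API by its service name (for example http://fastapi:8000) rather than localhost.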


How to use

Clone this repo and run the Docker Compose commands below.

To start the application:

docker-compose up -d --build

then navigate to http://localhost:8501/ to open the Streamlit UI.

To stop the application:

docker-compose down

Trivia:

Using the volumes key in the Compose file, you can mount local folders from your machine into the Docker containers. This lets you keep developing the app locally while it runs in Docker, with your changes visible inside the containers; a sketch of such a mount follows.
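
A minimal sketch of a bind mount for the Streamlit service is shown below. The service name and the host/container paths are assumptions for illustration; check the repo's docker-compose.yml for the actual values.

```yaml
# Illustrative only: mount the local source folder into the running container
# so code edits on the host are visible inside it. Paths and the service name
# are assumptions, not the repo's actual configuration.
services:
  streamlit:
    volumes:
      - ./frontend:/app   # host folder mapped to the app folder inside the container
```

With a mount like this in place, edited files are available inside the container immediately, so you can iterate without rebuilding the image for every change.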
