Data Science Model encapsulation in Python Microservice

This Flask REST microservice loads a trained data science model into memory, which can then be queried for recommendations through an HTTP GET endpoint.

The service also exposes an HTTP GET endpoint through which a newly trained model can be loaded into memory.
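The README does not list the exact routes, so as an illustration only, a minimal Flask sketch of the two endpoints described above might look like this (the route names, the `item_id` parameter, and the `load_model()` helper are assumptions, not the service's actual API):

```python
# Hypothetical sketch of the service shape described above.
# Route names, query parameters, and load_model() are illustrative;
# the real service's API may differ.
from flask import Flask, jsonify, request

app = Flask(__name__)
model = None  # populated by load_model() at startup or on reload


def load_model(path="model.pkl"):
    """Placeholder loader; the real service would deserialize a trained model."""
    return {"path": path}  # stand-in for a fitted estimator


@app.route("/recommendations", methods=["GET"])
def recommendations():
    item_id = request.args.get("item_id", "")
    # A real implementation would call the in-memory model here,
    # e.g. model.predict(...), and return its output.
    return jsonify({"item_id": item_id, "recommendations": []})


@app.route("/model/reload", methods=["GET"])
def reload_model():
    global model
    model = load_model()
    return jsonify({"status": "reloaded"})


if __name__ == "__main__":
    app.run(port=5000)
```

Keeping the model in module-level state is what lets the reload endpoint swap in a new model without restarting the worker process.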

Built with Python 3.7.5.

Table of contents

  • Installation
  • Build and run
  • Logs
  • Release History
  • Issue List
  • Contribute

Installation

Setup virtual env

OS X & Linux

# Install Virtualenv
pip3 install virtualenv

# Verify Virtualenv installation
virtualenv --version

# Create directory for virtual
mkdir -p ~/interpreter/python/3.7/

# Create virtualenv
cd ~/interpreter/python/3.7/
virtualenv -p /usr/bin/python3.7 venv3.7 #Ensure python 3.7 is located /usr/bin/python3.7, if not, then provide the path where python3.7 is installed.

# Activate virtualenv
source ~/interpreter/python/3.7/venv3.7/bin/activate

# Install the requirements in the activated virtualenv
pip install -r requirements.txt

# To pin the installed package versions, freeze the environment:
pip freeze > requirements.txt

# Deactivate virtualenv
deactivate
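Before installing requirements, it can help to confirm that the virtualenv is actually active. One stdlib way to check (an illustrative helper, not part of this repo) is:

```python
import sys


def in_virtualenv() -> bool:
    """Return True when the interpreter runs inside a venv/virtualenv.

    Inside a virtual environment, sys.prefix points at the env directory
    while sys.base_prefix still points at the system installation; old
    virtualenv versions set sys.real_prefix instead.
    """
    return sys.prefix != getattr(sys, "base_prefix", sys.prefix) or hasattr(
        sys, "real_prefix"
    )


if __name__ == "__main__":
    print("virtualenv active:", in_virtualenv())
```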

Build and run

Using Docker

The service is Dockerized. The steps below build the Docker image and run it.

# Below command builds the Docker image.
docker build -t data-model-service:v1 .

# Below command runs a container from the image, mapping container port 5000 to host port 5000.
# SERVICE_ENV selects the configuration inside the container;
# the value "dev" loads the Development configuration.
docker run -p 5000:5000 -e PORT=5000 -e SERVICE_ENV=dev data-model-service:v1
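The `SERVICE_ENV` variable presumably selects one of several configuration objects at startup. A common Flask pattern for this is a lookup table keyed by environment name; the class names and settings below are assumptions, not the repo's actual config module:

```python
import os


# Hypothetical configuration classes; the repo's real config module
# may use different names and settings.
class DevConfig:
    DEBUG = True


class QaConfig:
    DEBUG = False


class ProdConfig:
    DEBUG = False


CONFIG_BY_ENV = {"dev": DevConfig, "qa": QaConfig, "prod": ProdConfig}


def select_config():
    """Pick a config class from SERVICE_ENV, defaulting to dev.

    Unknown values also fall back to DevConfig so a typo never
    accidentally runs with production settings.
    """
    env = os.environ.get("SERVICE_ENV", "dev")
    return CONFIG_BY_ENV.get(env, DevConfig)
```

The app would then call something like `app.config.from_object(select_config())` during startup.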

Using shell script

Run the commands below to start the microservice in the background via the shell script.

export SERVICE_ENV="dev" # Development configuration; change to "qa" or "prod" based on environment.
chmod +x scripts/start.sh
nohup ./scripts/start.sh &

Logs

Local system

# Navigate to path /var/log/ml-price-recommendation-api
cd /var/log/ml-price-recommendation-api

# Gunicorn access logs path
tail -f /var/log/ml-price-recommendation-api/access.log

# Application log path
tail -f /var/log/ml-price-recommendation-api/application.log

Docker

# Find the docker CONTAINER_ID based on the Image tag: data-model-service:v1
docker ps | grep "data-model-service:v1" | cut -d" " -f1

# Access the docker shell
docker exec -it <CONTAINER_ID> /bin/sh

# Navigate to path /var/log/ml-price-recommendation-api
cd /var/log/ml-price-recommendation-api

# Gunicorn access logs path
tail -f /var/log/ml-price-recommendation-api/access.log

# Application log path
tail -f /var/log/ml-price-recommendation-api/application.log

Release History

  • 1.0.0
    • Released v1 of Data Model service.

Issue List

See the repository's open issues on GitHub for the current list.

Contribute

If you want to contribute, please follow the steps below.

  1. Fork it (https://github.com/saurabh-slacklife/ml-data-model-microservice/fork)
  2. Create your feature branch (git checkout -b feature/add-feature-xyz)
  3. Commit your changes (git commit -am 'Add some feature')
  4. Push to the branch (git push origin feature/add-feature-xyz)
  5. Create a new Pull Request
