
How to deal with the multi-armed bandit problem through different approaches


rklymentiev/mab_problem


Solving the Multi-Armed Bandit Problem

The repository consists of:

  • Notebook (mab_problem/notebook/multi_armed_bandit.ipynb) explaining how to deal with the multi-armed bandit problem through four different approaches:
  1. Random Selection
  2. Epsilon Greedy
  3. Thompson Sampling
  4. Upper Confidence Bound (UCB1)

Open the notebook with Jupyter NBViewer in order to see the plots.
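The four strategies above differ in how they balance exploration (trying arms to learn their payoffs) and exploitation (pulling the best-known arm). As an illustration, here is a minimal epsilon-greedy sketch on Bernoulli arms; the arm probabilities, epsilon value, and function name are illustrative assumptions, not taken from the notebook:

```python
import random

def epsilon_greedy(probs, n_trials=1000, eps=0.1, seed=0):
    """Simulate epsilon-greedy on Bernoulli arms with true success rates `probs`."""
    rng = random.Random(seed)
    counts = [0] * len(probs)    # pulls per arm
    values = [0.0] * len(probs)  # running mean reward per arm
    total_reward = 0
    for _ in range(n_trials):
        if rng.random() < eps:
            arm = rng.randrange(len(probs))  # explore: pick a random arm
        else:
            arm = max(range(len(probs)), key=values.__getitem__)  # exploit
        reward = 1 if rng.random() < probs[arm] else 0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean
        total_reward += reward
    return counts, values, total_reward

counts, values, total_reward = epsilon_greedy([0.3, 0.5], n_trials=1000)
```

With a small `eps`, most pulls go to whichever arm currently looks best, while the occasional random pull keeps estimates of the other arms from going stale.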

  • Flask app (mab_problem/flask_app) for an interactive experience with 2 variants and 1000 trials.

App preview:

To run the app on your machine, clone or download the repo and run the following commands:

$ cd mab_problem/flask_app
$ export FLASK_APP=app.py
$ flask run
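A setup like the app's (2 variants, 1000 trials) can also be simulated offline with Beta-Bernoulli Thompson sampling, one of the notebook's four approaches. The sketch below is an assumption-laden illustration (conversion rates and names are made up, not the app's actual values):

```python
import random

def thompson_sampling(probs, n_trials=1000, seed=42):
    """Thompson sampling with Beta(1, 1) priors on Bernoulli arms."""
    rng = random.Random(seed)
    alpha = [1] * len(probs)  # successes + 1 per arm
    beta = [1] * len(probs)   # failures + 1 per arm
    for _ in range(n_trials):
        # Sample a plausible success rate for each arm from its posterior,
        # then pull the arm whose sample is highest.
        samples = [rng.betavariate(alpha[i], beta[i]) for i in range(len(probs))]
        arm = samples.index(max(samples))
        reward = 1 if rng.random() < probs[arm] else 0
        if reward:
            alpha[arm] += 1
        else:
            beta[arm] += 1
    return alpha, beta

# Two hypothetical variants with conversion rates 0.40 and 0.55
alpha, beta = thompson_sampling([0.40, 0.55], n_trials=1000)
```

Sampling from the posterior naturally shifts pulls toward the better variant as evidence accumulates, without a hand-tuned exploration rate.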
