Vowpal Wabbit is a machine learning system which pushes the frontier of machine learning with techniques such as online, hashing, allreduce, reductions, learning2search, active, and interactive learning.
TF-Agents: A reliable, scalable and easy to use TensorFlow library for Contextual Bandits and Reinforcement Learning.
Python implementations of contextual bandit algorithms
Open Bandit Pipeline: a Python library for bandit algorithms and off-policy evaluation
[IJAIT 2021] MABWiser: Contextual Multi-Armed Bandits Library
Online Deep Learning: Learning Deep Neural Networks on the Fly / Non-linear Contextual Bandit Algorithm (ONN_THS)
👤 Multi-Armed Bandit Algorithms Library (MAB) 👮
Contextual Bandits in R - simulation and evaluation of Multi-Armed Bandit Policies
A lightweight contextual bandit & reinforcement learning library designed to be used in production Python services.
🐈⬛ Contextual bandits library for continuous action trees with smoothing in JAX
Blocks World -- Simulator, Code, and Models (Misra et al. EMNLP 2017)
Code accompanying the paper "Learning Permutations with Sinkhorn Policy Gradient"
The LinUCB (Linear Upper Confidence Bound) contextual bandit algorithm, as proposed by Li, Chu, Langford, and Schapire
Implements basic and contextual MAB algorithms for recommendation systems
Code for our ACML and INTERSPEECH papers: "Speaker Diarization as a Fully Online Bandit Learning Problem in MiniVox".
Privacy-Preserving Bandits (MLSys'20)
Contextual Multi-Armed Bandit Platform for Scoring, Ranking & Decisions
Study of the paper 'Neural Thompson Sampling' published in October 2020
Implementation of provably Rawlsian fair ML algorithms for contextual bandits.
Lightweight contextual bandit library for TypeScript/JavaScript
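Several of the entries above implement LinUCB, the linear upper-confidence-bound algorithm: each arm keeps a ridge-regression estimate of its reward and adds an exploration bonus proportional to the model's uncertainty for the current context. A minimal NumPy sketch of the disjoint-models variant (class and variable names are illustrative, not taken from any library listed here):

```python
import numpy as np

class LinUCB:
    """Disjoint LinUCB sketch: one ridge-regression model per arm,
    plus an upper-confidence exploration bonus."""

    def __init__(self, n_arms, n_features, alpha=1.0):
        self.alpha = alpha  # exploration strength
        # Per arm: A = I + sum(x x^T), b = sum(reward * x)
        self.A = [np.eye(n_features) for _ in range(n_arms)]
        self.b = [np.zeros(n_features) for _ in range(n_arms)]

    def select(self, x):
        """Pick the arm maximizing theta.x + alpha*sqrt(x^T A^-1 x)."""
        scores = []
        for A, b in zip(self.A, self.b):
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b            # ridge-regression estimate
            bonus = self.alpha * np.sqrt(x @ A_inv @ x)
            scores.append(theta @ x + bonus)
        return int(np.argmax(scores))

    def update(self, arm, x, reward):
        """Fold the observed (context, reward) pair into the chosen arm."""
        self.A[arm] += np.outer(x, x)
        self.b[arm] += reward * x

# Toy usage: arm 1 pays off only when the first context feature is high.
rng = np.random.default_rng(0)
bandit = LinUCB(n_arms=2, n_features=2, alpha=0.5)
for _ in range(500):
    x = rng.random(2)
    arm = bandit.select(x)
    reward = float(x[0] > 0.5) if arm == 1 else 0.0
    bandit.update(arm, x, reward)
```

After training on the toy stream, the policy should prefer arm 1 in contexts with a large first feature. Production libraries typically replace the per-step matrix inverse with incrementally maintained inverses or Cholesky factors.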