Stochastic Dual Dynamic Programming in Julia
Updated Jun 4, 2024 - Julia
High Performance Map Matching with Markov Decision Processes (MDPs) and Hidden Markov Models (HMMs).
Python package for the simulation and estimation of a prototypical infinite-horizon dynamic discrete choice model based on Rust (1987)
A tool for parallel automated controller synthesis for large-scale stochastic systems.
An RL-based algorithm that helps cab drivers maximize their profits by improving their decision-making in the field. Taking long-term profit as the goal, a reinforcement learning method is proposed to optimize taxi driving strategies. This optimization problem is fo…
An overview of Markov chains and their implementations in finance
Solutions for course: "Applied Game Theory" taken at University of Novi Sad - Faculty of Technical Sciences
An implementation of BRTDP, including DS-MPI for the upper bound
The module covers the theory behind reinforcement learning and introduces Markov chains and Markov Decision Processes
MDPs solved using Value Iteration and Linear Programming
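The value iteration mentioned above can be sketched in a few lines. The two-state MDP below (its states, transitions, and rewards) is an illustrative invention for this sketch, not taken from any listed repository:

```python
# A minimal value-iteration sketch on a hypothetical two-state MDP.
# P[s][a] is a list of (probability, next_state); R[s][a] is the reward.

def value_iteration(P, R, gamma=0.9, tol=1e-8):
    n = len(P)
    V = [0.0] * n
    while True:
        # Bellman optimality backup for every state
        V_new = [
            max(R[s][a] + gamma * sum(p * V[s2] for p, s2 in P[s][a])
                for a in range(len(P[s])))
            for s in range(n)
        ]
        if max(abs(x - y) for x, y in zip(V, V_new)) < tol:
            return V_new
        V = V_new

# Two states, two actions: action 0 ("stay") keeps the state,
# action 1 ("move") flips it and pays a reward of 1.
P = [[[(1.0, 0)], [(1.0, 1)]],
     [[(1.0, 1)], [(1.0, 0)]]]
R = [[0.0, 1.0],
     [1.0, 0.0]]
V = value_iteration(P, R)
```

With discount 0.9 both states converge to a value of 10, since the optimal policy earns a reward of 1 forever (1 / (1 - 0.9)).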
Learn to get started using DISCOTRESS with these tutorials! Then apply to your own Markov chains in ecology 🦜🌴 economics 💸📈 biophysics 🧬🦠 and more!
Implementations of methods from the book "Reinforcement Learning: An Introduction" by Sutton and Barto, using Python.
A Yahtzee-solving Python package and command-line tool
An RL (reinforcement learning) agent that learns to play Numerical Tic-Tac-Toe via Q-learning.
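The tabular Q-learning update that agent relies on can be sketched on a toy problem. The 4-state corridor environment below is a hypothetical stand-in chosen to keep the sketch short, not the Numerical Tic-Tac-Toe game itself:

```python
import random

# Tabular Q-learning on a hypothetical 4-state corridor:
# the agent starts at state 0 and earns reward 1 for reaching state 3.

def q_learning(n_states=4, n_actions=2, episodes=500,
               alpha=0.5, gamma=0.9, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    Q = [[0.0] * n_actions for _ in range(n_states)]
    goal = n_states - 1
    for _ in range(episodes):
        s = 0
        while s != goal:
            # epsilon-greedy action selection with random tie-breaking
            if rng.random() < epsilon:
                a = rng.randrange(n_actions)
            else:
                best = max(Q[s])
                a = rng.choice([i for i in range(n_actions) if Q[s][i] == best])
            # action 1 moves right, action 0 moves left (clamped to the corridor)
            s2 = min(s + 1, goal) if a == 1 else max(s - 1, 0)
            r = 1.0 if s2 == goal else 0.0
            # the Q-learning temporal-difference update
            target = r + (0.0 if s2 == goal else gamma * max(Q[s2]))
            Q[s][a] += alpha * (target - Q[s][a])
            s = s2
    return Q

Q = q_learning()
```

After training, the greedy policy prefers moving right in every non-terminal state, and the learned values approach gamma-discounted distances to the goal.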
A rudimentary NLP system built on a purely statistical model
Final project for the course "Probabilistic Machine Learning" @ Data Science & Scientific Computing, University of Trieste, academic year 2020/2021, written as a Jupyter notebook (ipynb).
R package for Discrete-Time Markov Decision Processes
Python 3 library for visualizing high-dimensional data.