Uncertainty quantification for ML - collection of scripts, tutorials and templates
Updated Jul 16, 2023 - Jupyter Notebook
A repo for toy examples to test uncertainty estimation of neural networks
Probabilistic framework for solving Visual Dialog
Official repository for the paper "Masksembles for Uncertainty Estimation" (CVPR2021).
Simple and efficient way of performing deep ensembling to improve robustness as well as estimate uncertainty
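The core deep-ensemble idea is to train several models independently (differing by random initialisation or resampling) and read the spread of their predictions as uncertainty. A minimal sketch of that idea, using small NumPy polynomial regressors as stand-ins for neural networks (all names and sizes here are illustrative, not from any of the listed repos):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data on the range [-4, 4].
x = rng.uniform(-4, 4, size=200)
y = np.sin(x) + 0.1 * rng.normal(size=x.shape)

# Ensemble stand-in: M regressors, each fit on a different
# bootstrap resample (real deep ensembles would instead vary
# the random initialisation of identical networks).
M = 10
models = []
for _ in range(M):
    idx = rng.integers(0, len(x), size=len(x))
    models.append(np.polyfit(x[idx], y[idx], deg=5))

# Prediction: mean over members; member disagreement (std)
# serves as the uncertainty estimate.
x_test = np.linspace(-6, 6, 50)
preds = np.stack([np.polyval(c, x_test) for c in models])
mean, std = preds.mean(axis=0), preds.std(axis=0)
```

On this toy problem the disagreement `std` grows outside the training range, which is exactly the behaviour ensemble-based uncertainty estimates are valued for.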
Wasserstein dropout (W-dropout) is a novel technique to quantify uncertainty in regression networks. It is fully non-parametric and yields accurate uncertainty estimates - even under data shifts.
Behaviour Cloning of Cartpole Swing-up Policy with Model-Predictive Uncertainty Regularization (UW CSE571 Guided Project)
A repository about Robust Deep Neural Networks with Uncertainty, Local Competition and Error-Correcting-Output-Codes in TensorFlow.
Official Code: Trust Your Robots! Predictive Uncertainty Estimation of Neural Networks with Sparse Gaussian Processes
Code for "Deal: Deep Evidential Active Learning for Image Classification" (ICMLA 2020)
This repository contains code and resources for my thesis project on uncertainty estimation in computed tomography (CT) scan modeling. Explore Bayesian and deterministic neural network architectures for CT analysis and compare their effectiveness in quantifying uncertainty.
Guided Perturbations: Self-Corrective Behavior in Convolutional Neural Networks
An implementation of natural parameter networks and their extension to GRUs in PyTorch
A neural-network based image classifier that quantifies its uncertainty using Bayesian methods, as described in Kendall and Gal (2017)
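In the Kendall and Gal (2017) recipe referenced above, the network predicts both a mean and a log-variance per input and is trained with the heteroscedastic Gaussian negative log-likelihood. A minimal NumPy sketch of that loss term (function and variable names are illustrative):

```python
import numpy as np

def heteroscedastic_nll(y, mu, log_var):
    """Per-sample Gaussian NLL (up to a constant), as in
    Kendall & Gal (2017). Predicting log-variance keeps the
    variance positive and the loss numerically stable."""
    return 0.5 * np.exp(-log_var) * (y - mu) ** 2 + 0.5 * log_var

# Same residual, different predicted variance: an overconfident
# prediction (very negative log_var) is penalised far more for
# the same error than one that admits high uncertainty.
y = np.array([1.0])
mu = np.array([0.0])
confident = heteroscedastic_nll(y, mu, np.array([-4.0]))
uncertain = heteroscedastic_nll(y, mu, np.array([0.0]))
```

The `0.5 * log_var` term stops the network from driving the predicted variance to infinity, so it can only "buy" a smaller residual penalty by honestly reporting noise.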
NeurIPS paper 'Censored Quantile Regression Neural Networks for Distribution-Free Survival Analysis'
Attempt to reproduce the toy experiment of http://bit.ly/2C9Z8St with an ensemble of nets and with dropout.
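The dropout half of such toy experiments is usually MC dropout: keep dropout active at test time and treat repeated stochastic forward passes as samples from the predictive distribution, with their spread as the uncertainty. A small sketch with a fixed, untrained two-layer NumPy net (weights and sizes are arbitrary placeholders):

```python
import numpy as np

rng = np.random.default_rng(1)

# A randomly initialised two-layer net; in a real experiment
# these weights would come from training with dropout.
W1 = rng.normal(size=(1, 64))
W2 = rng.normal(size=(64, 1))

def forward(x, p_drop=0.5):
    """One stochastic pass: the dropout mask stays on at test time."""
    h = np.tanh(x @ W1)
    mask = rng.random(h.shape) > p_drop
    h = h * mask / (1.0 - p_drop)   # inverted-dropout scaling
    return h @ W2

# Monte Carlo estimate: mean prediction and predictive spread.
x = np.array([[0.3]])
samples = np.concatenate([forward(x) for _ in range(100)])
mean, std = samples.mean(), samples.std()
```

Because each pass draws a fresh mask, `std` is strictly positive; with a trained network it would shrink near the training data and grow away from it.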
RBF SVM based wrong prediction estimator in deep learning models employed for CPS data
UAP-BEV: Uncertainty Aware Planning in Bird's Eye View Generated from Monocular Images (CASE 2023)
The second-moment loss (SML) is a novel training objective for dropout-based regression networks that yields improved uncertainty estimates.