Introductory Kaggle competition
Bayesian hyperparameter optimization for neural networks
Easily declare large spaces of (keras) neural networks and run (hyperopt) optimization experiments on them (a sketch of this pattern follows the list below).
Gnarl - An easy-to-use deep learning framework for Python
Tuning XGBoost hyperparameters with simulated annealing (see the simulated-annealing sketch after this list)
Adventures using keras on Google's Cloud ML Engine
Optimize hyperparameters/configuration parameters with Java
Some experiences in the machine learning field
Population Based Training (in PyTorch with sqlite3). Status: Unsupported
Simple parameter-space creation library for Python
Multiple reinforcement learning techniques on 3x3 Tic-Tac-Toe
Simple logging wrapper for model hyperparameters from gensim.d2v, sklearn and keras.
Spark Parameter Optimization and Tuning
How optimizer and learning rate choice affects training performance
Argload: easy reloading of command-line arguments
Tuning hyperparameters fast with Hyperband (see the Hyperband sketch after this list)
Hyperparameter search algorithm
A design space exploration tool for deep neural architectures
Final project for CS 412 Machine Learning at University of Illinois at Chicago
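Several entries above pair hyperopt with keras for Bayesian hyperparameter optimization. A minimal sketch of that pattern, assuming a small dense classifier on synthetic data; the search space, model shape, and evaluation budget are illustrative, not taken from any repository listed here:

```python
import numpy as np
from hyperopt import fmin, tpe, hp, Trials
from tensorflow import keras

# Synthetic binary-classification data, standing in for a real dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20)).astype("float32")
y = (X[:, 0] + X[:, 1] > 0).astype("float32")

# The hp.* expressions define the distributions that TPE samples from.
space = {
    "units": hp.choice("units", [32, 64, 128]),
    "lr": hp.loguniform("lr", np.log(1e-4), np.log(1e-1)),
}

def objective(params):
    # Train one candidate network; fmin minimizes the returned value.
    model = keras.Sequential([
        keras.layers.Dense(params["units"], activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=params["lr"]),
                  loss="binary_crossentropy")
    hist = model.fit(X, y, epochs=5, batch_size=32,
                     validation_split=0.2, verbose=0)
    return hist.history["val_loss"][-1]

best = fmin(objective, space, algo=tpe.suggest, max_evals=20, trials=Trials())
print(best)  # caveat: hp.choice values are reported as indices into the list
```

Returning the last-epoch validation loss keeps the sketch short; a real experiment would cross-validate or early-stop instead.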
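The simulated-annealing entry points at a simple but effective pattern: random-walk proposals over a small grid, accepting worse configurations with probability exp(Δ/T) while the temperature T cools. A self-contained sketch; the grid, cooling schedule, and 3-fold scoring are assumptions for illustration, not the linked repository's code:

```python
import math
import random

import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Illustrative discrete grid; real parameter ranges may differ.
GRID = {
    "max_depth": [2, 3, 4, 5, 6],
    "learning_rate": [0.01, 0.05, 0.1, 0.3],
    "n_estimators": [50, 100, 200],
}

def score(params):
    return cross_val_score(xgb.XGBClassifier(**params), X, y, cv=3).mean()

def neighbor(params):
    # Perturb one randomly chosen hyperparameter to an adjacent grid value.
    out = dict(params)
    key = random.choice(list(GRID))
    values = GRID[key]
    i = values.index(out[key]) + random.choice([-1, 1])
    out[key] = values[max(0, min(len(values) - 1, i))]
    return out

random.seed(0)
current = {k: random.choice(v) for k, v in GRID.items()}
current_score = score(current)
best, best_score = current, current_score
T = 0.05  # initial temperature; tune together with the cooling rate
for step in range(50):
    cand = neighbor(current)
    cand_score = score(cand)
    # Accept improvements always; accept worse moves with prob exp(delta/T).
    delta = cand_score - current_score
    if delta > 0 or random.random() < math.exp(delta / T):
        current, current_score = cand, cand_score
        if current_score > best_score:
            best, best_score = current, current_score
    T *= 0.95  # geometric cooling schedule
print(best, best_score)
```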
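Hyperband treats tuning as adaptive resource allocation: sample many configurations cheaply, keep the top 1/η at each successive-halving rung, and run several brackets that trade breadth for per-configuration budget. A compact sketch of the algorithm; `get_config` and `run_config` are placeholder hooks you would supply, where `run_config(cfg, r)` trains `cfg` with budget `r` (e.g. epochs) and returns a score to maximize:

```python
import math
import random

def hyperband(get_config, run_config, max_resource=81, eta=3):
    s_max = int(math.log(max_resource, eta) + 1e-9)  # floor, fp-guarded
    B = (s_max + 1) * max_resource                   # budget per bracket
    best_cfg, best_score = None, -math.inf
    for s in range(s_max, -1, -1):                   # aggressive -> conservative
        n = int(math.ceil(B / max_resource * eta ** s / (s + 1)))
        r = max_resource * eta ** (-s)               # initial budget per config
        configs = [get_config() for _ in range(n)]
        for i in range(s + 1):                       # successive-halving rungs
            r_i = int(r * eta ** i)
            scores = [run_config(c, r_i) for c in configs]
            ranked = sorted(zip(scores, configs), key=lambda t: t[0], reverse=True)
            if ranked[0][0] > best_score:
                best_score, best_cfg = ranked[0]
            configs = [c for _, c in ranked[:max(1, len(configs) // eta)]]
    return best_cfg, best_score

# Toy check: score improves with budget and peaks near lr = 0.1.
random.seed(0)
get_config = lambda: {"lr": 10 ** random.uniform(-4, 0)}
run_config = lambda c, r: -(math.log10(c["lr"]) + 1) ** 2 - 1.0 / r
print(hyperband(get_config, run_config))
```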