Testing several hyperparameter optimization techniques.
This repository is an assignment I completed for CMPUT466. The goal is to compare different machine learning algorithms on a real task.
I implemented I-AutoRec (an autoencoder framework for collaborative filtering) in Keras and tuned the model's hyperparameters using a validation set.
textRec utilizes Latent Dirichlet Allocation and Jensen-Shannon divergence on the discrete probability distributions over LDA topics per document, in order to recommend unique and novel documents to specific users.
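textRec's actual code isn't shown here, but the core idea — scoring document similarity as the Jensen-Shannon divergence between per-document LDA topic distributions — can be sketched in a few lines. The topic vectors below are made up for illustration:

```python
import math

def jensen_shannon_divergence(p, q):
    """JSD between two discrete distributions (base-2 logs, so JSD is in [0, 1])."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]

    def kl(a, b):
        # Kullback-Leibler divergence, skipping zero-probability terms
        return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical topic distributions for two documents under a 4-topic LDA model
doc_a = [0.7, 0.1, 0.1, 0.1]
doc_b = [0.1, 0.1, 0.1, 0.7]
distance = jensen_shannon_divergence(doc_a, doc_b)  # higher = more novel relative to doc_a
```

A recommender built on this would rank candidate documents by their JSD from (a profile of) the user's past reading, favoring high-divergence items when novelty is the goal.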
Sweep through ranges of command-line hyperparameters to create test cases for multiple corners
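A corner sweep like this is essentially a Cartesian product over the value ranges. A minimal sketch with `itertools.product` (the flag names and values are illustrative, not taken from the repo):

```python
import itertools

# Hypothetical hyperparameter ranges; each combination becomes one test case.
grid = {
    "--lr": [0.01, 0.1],
    "--batch-size": [32, 64],
    "--dropout": [0.2, 0.5],
}

def sweep(grid):
    """Yield one command-line argument list per corner of the grid."""
    keys = list(grid)
    for values in itertools.product(*(grid[k] for k in keys)):
        yield [f"{k}={v}" for k, v in zip(keys, values)]

testcases = list(sweep(grid))  # 2 * 2 * 2 = 8 corner cases
```

Each yielded list can be appended to a base command and launched as a separate run.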
Simple logging wrapper for model hyperparameters from gensim.d2v, sklearn and keras.
This project is part of the Udacity Azure ML Nanodegree. In this project, we build and optimize an Azure ML pipeline using the Python SDK and a provided Scikit-learn model. This model is then compared to an Azure AutoML run.
Bagging and hyperparameter tuning on a spam vs. not-spam dataset
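Bagging for a spam classifier boils down to two steps: train each base model on a bootstrap sample of the data, then combine their predictions by majority vote. A minimal, library-free sketch (the labels and helper names are illustrative):

```python
import random
from collections import Counter

def bootstrap_sample(data, rng):
    """Draw a sample the same size as `data`, with replacement."""
    return [rng.choice(data) for _ in data]

def bagged_predict(votes):
    """Majority vote over the base models' predictions ('spam' / 'ham')."""
    return Counter(votes).most_common(1)[0][0]

rng = random.Random(0)
dataset = list(range(10))                  # stand-in for (features, label) rows
sample = bootstrap_sample(dataset, rng)    # training set for one base model
prediction = bagged_predict(["spam", "ham", "spam"])
```

In practice the repo likely uses scikit-learn's `BaggingClassifier`, which wraps exactly this resample-then-vote loop.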
Learn and practice several regularization techniques (including dropout regularization and hyperparameter tuning) to improve model accuracy
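The dropout technique mentioned above randomly zeroes activations during training and rescales the survivors so the expected activation is unchanged ("inverted dropout"). A small sketch of that mechanic, independent of any framework:

```python
import random

def dropout(activations, rate, rng, training=True):
    """Inverted dropout: zero each unit with probability `rate` and
    scale survivors by 1/(1-rate) so the expected value is preserved."""
    if not training or rate == 0.0:
        return list(activations)  # dropout is disabled at inference time
    keep = 1.0 - rate
    return [a / keep if rng.random() < keep else 0.0 for a in activations]

rng = random.Random(42)
out = dropout([1.0, 1.0, 1.0, 1.0], rate=0.5, rng=rng)
# each element is either 0.0 (dropped) or 2.0 (kept and rescaled by 1/0.5)
```

The dropout rate itself is a hyperparameter, typically tuned on a validation set alongside the other settings these projects explore.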
Deep Learning concepts and techniques: Regularization, Epochs, Batches, Hyperparameters, Cross-validation, Optimizers
Example of Neu.ro integration with NNI for hyperparameter tuning
A Hyperparameter Tuning algorithm.
Hyperparameter search algorithm
build and optimize an Azure ML pipeline using the Python SDK and a provided Scikit-learn model. This model is then compared to a model from Azure AutoML
Deploying Flight Price Prediction via Microsoft Azure
Housing Price Prediction - Advanced regression techniques
ML classification project.