Information Retrieval in High Dimensional Data (class deliverables)
Updated Aug 2, 2018. Jupyter Notebook.
Multi-Shot Approximation of Discounted Cost MDPs
Comparison of principal components analysis with diffusion maps on toy data sets and a molecular simulation trajectory
This repository contains the projects and homework from CSC 591 Graph Data Mining
To handle non-linearly separable data, we use the SVM kernel trick, which maps the data into a higher-dimensional space.
PyTorch implementation of "Kernel Neural Optimal Transport" (ICLR 2023)
Packer identification tool using SVM
University of Washington: CSE 446 (WIN '17) Machine Learning
My first steps to becoming an AI engineer :)
Exploring machine learning techniques and algorithms, including clustering algorithms, the perceptron, and more.
simple classifier with perceptron (university assignment)
Understanding of Support Vector Machines (SVMs) and their implementations. Covers linear and non-linear SVMs, and explains how to use the "kernel trick" to handle non-linearly separable data.
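The kernel trick mentioned in the entries above rests on one identity: a kernel computes the inner product in a higher-dimensional feature space without ever constructing that space. A minimal sketch (the feature map `phi` and the toy vectors are illustrative choices, not taken from any of the listed repositories) for the degree-2 polynomial kernel:

```python
import numpy as np

def phi(x):
    """Explicit degree-2 feature map for 2-D input:
    phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2)."""
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

def poly_kernel(x, y):
    """Homogeneous polynomial kernel of degree 2: k(x, y) = (x . y)^2."""
    return np.dot(x, y) ** 2

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])

# The kernel equals the dot product of the explicitly mapped vectors,
# so an SVM can work in the 3-D feature space at 2-D cost.
assert np.isclose(poly_kernel(x, y), np.dot(phi(x), phi(y)))  # both are 16.0
```

The same identity is what lets SVM solvers replace every dot product in the dual problem with a kernel evaluation.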
A multiclass Support Vector Machine with kernel-trick support, implemented by solving the SVM quadratic programming problem.
A perceptron optimized with Numba that uses dual coordinates and kernel functions instead of the classical dot product
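The dual-coordinate idea in the entry above can be sketched in plain NumPy (the dataset and hyperparameters here are illustrative assumptions, not code from that repository): instead of a weight vector, the perceptron keeps a mistake count per training example, and classifies via kernel evaluations against the training set.

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    """Gaussian (RBF) kernel between two vectors."""
    return np.exp(-gamma * np.sum((a - b) ** 2))

def train_kernel_perceptron(X, y, epochs=50):
    """Dual-form perceptron: alpha[i] counts how often example i was
    misclassified; the decision function is
    f(x) = sum_i alpha[i] * y[i] * k(X[i], x)."""
    n = len(X)
    alpha = np.zeros(n)
    K = np.array([[rbf(X[i], X[j]) for j in range(n)] for i in range(n)])
    for _ in range(epochs):
        mistakes = 0
        for i in range(n):
            if y[i] * np.dot(alpha * y, K[:, i]) <= 0:
                alpha[i] += 1
                mistakes += 1
        if mistakes == 0:  # converged: every training point is correct
            break
    return alpha

def predict(X, y, alpha, x):
    return np.sign(sum(alpha[i] * y[i] * rbf(X[i], x) for i in range(len(X))))

# Two concentric rings: not separable by any line in 2-D,
# but separable by the perceptron in RBF feature space.
X = np.array([[0.5, 0], [-0.5, 0], [0, 0.5], [0, -0.5],
              [2, 0], [-2, 0], [0, 2], [0, -2]], dtype=float)
y = np.array([1, 1, 1, 1, -1, -1, -1, -1])

alpha = train_kernel_perceptron(X, y)
assert all(predict(X, y, alpha, x) == t for x, t in zip(X, y))
```

Precomputing the kernel matrix `K` once is also what makes such a loop a good fit for Numba's JIT compilation.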
Implemented a multi-class SVM that supports multi-core parallel computing
In this project I developed a program that generates nonlinearly separable data. The data is mapped into a higher dimension using the kernel trick, where it becomes linearly separable.
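The same generate-then-lift idea can be sketched directly (the ring radii, noise level, and threshold below are illustrative assumptions, not the project's actual parameters): two concentric rings cannot be split by any line in 2-D, but adding a squared-radius feature makes a single threshold separate them.

```python
import numpy as np

rng = np.random.default_rng(0)

def ring(radius, n, noise=0.05):
    """Sample n points scattered around a circle of the given radius."""
    theta = rng.uniform(0, 2 * np.pi, n)
    r = radius + rng.normal(0, noise, n)
    return np.column_stack([r * np.cos(theta), r * np.sin(theta)])

inner = ring(1.0, 50)   # class +1
outer = ring(3.0, 50)   # class -1

# Lift to 3-D with the explicit feature z = x1^2 + x2^2.
# In the lifted space, the plane z = 4 separates the two classes.
z_inner = (inner ** 2).sum(axis=1)
z_outer = (outer ** 2).sum(axis=1)
assert z_inner.max() < 4 < z_outer.min()
```

An explicit map like this is feasible only in low dimension; the kernel trick achieves the same effect implicitly when the feature space is large or infinite.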
This is a homework about supervised learning and kernel trick for machine learning course @ FUM.
Gaussian Processes for Machine Learning
Support Vector Machine-related projects