This repository lets you get started with GUI-based training of a state-of-the-art deep learning model with little to no configuration needed! No-code training with TensorFlow has never been so easy.
A fast, easy-to-use, production-ready inference server for computer vision supporting deployment of many popular model architectures and fine-tuned models.
The simplest way to serve AI/ML models in production
The Qualcomm® AI Hub Models are a collection of state-of-the-art machine learning models optimized for performance (latency, memory, etc.) and ready to deploy on Qualcomm® devices.
Train and run predictions with pre-trained deep learning models through a GUI (web app). No more juggling parameters, no more manual data preprocessing.
A beautiful Flask web API for YOLOv7 (and custom) models
CLI & Python API to easily summarize text-based files with transformers
Unofficial Go (Golang) bindings for the Hugging Face Inference API
🤗 Hugging Face Inference Client written in Go
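Several of the clients listed here wrap the same underlying Hugging Face Inference HTTP API: a POST request to a model endpoint with a bearer token and a JSON payload. A minimal sketch of building such a request in Python (the model name and token value are placeholders, not taken from any of these repositories) might look like:

```python
import json
import urllib.request

# Placeholder model and token; substitute your own values.
API_URL = "https://api-inference.huggingface.co/models/facebook/bart-large-cnn"
TOKEN = "hf_xxx"


def build_request(text: str) -> urllib.request.Request:
    """Build a POST request for the Hugging Face Inference API."""
    payload = json.dumps({"inputs": text}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Actually sending the request requires network access and a valid token:
# with urllib.request.urlopen(build_request("Long article text ...")) as resp:
#     result = json.load(resp)
```

The language-specific wrappers above mainly add typed request/response models and error handling on top of this plain HTTP call.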
This repository allows you to get started with training a State-of-the-art Deep Learning model with little to no configuration needed! You provide your labeled dataset and you can start the training right away. You can even test your model with our built-in Inference REST API. Training classification models with GluonCV has never been so easy.
Text components powering LLMs & SLMs for the geniusrise framework
TypeScript wrapper for the Hugging Face Inference API.
Describes how to enable the OpenVINO Execution Provider for ONNX Runtime
Chat prompt template evaluation and inference monitoring
REST APIs for Stable Diffusion, with inference support on AzureML
YOLO inference API built with FastAPI
A Node.js backend that exposes a TypeScript implementation of the deCheem inference engine.