🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Updated May 16, 2024 · Python
Build AI-powered applications with React, Svelte, Vue, and Solid
Awesome papers about unifying LLMs and KGs
Unify Efficient Fine-Tuning of 100+ LLMs
🔍 LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector databases, file converters) into pipelines or agents that can interact with your data. With its advanced retrieval methods, it is best suited for building retrieval-augmented generation (RAG), question answering, semantic search, or conversational chatbot applications.
NucliaDB, the AI search database for RAG.
OSWorld: Benchmarking Multimodal Agents for Open-Ended Tasks in Real Computer Environments
TransfoRNA: Navigating the Uncertainties of Small RNA Annotation with an Adaptive Machine Learning Strategy
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: strong performance, fast inference, low VRAM usage, fast training, "infinite" context length, and free sentence embeddings.
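The entry above alludes to RWKV's key idea: an attention-like weighted average that can be computed as a recurrence with constant-size state, giving fast, low-memory inference. Below is a minimal, hypothetical sketch of a scalar WKV-style recurrence with an exponential decay `w` and a current-token bonus `u`; both names and the scalar simplification are illustrative assumptions, not the actual RWKV implementation (which operates per channel with learned parameters).

```python
import math

def wkv_recurrence(ks, vs, w=0.5, u=0.3):
    """Simplified scalar WKV: an exponential-decay weighted average of
    past values, computed recurrently with O(1) state per step (the
    "RNN mode" that makes inference cheap). `w` (decay) and `u`
    (current-token bonus) are illustrative scalars."""
    outs = []
    num, den = 0.0, 0.0          # running numerator / denominator state
    decay = math.exp(-w)         # per-step exponential decay of the state
    for k, v in zip(ks, vs):
        e_uk = math.exp(u + k)   # extra weight for the current token
        outs.append((num + e_uk * v) / (den + e_uk))
        # fold the current token into the decayed running sums
        num = decay * num + math.exp(k) * v
        den = decay * den + math.exp(k)
    return outs
```

Each output is a convex combination of the values seen so far, so it stays within their range; the same quantity can also be computed in parallel over the whole sequence during training, which is the "trained like a GPT" property the description mentions.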
A PyTorch-based Speech Toolkit
This repository contains a web application designed to run relatively small Large Language Models (LLMs) locally.
A reading list on the safety, security, and privacy of large models.
LangSmith Client SDK Implementations
A transformer implementation from scratch for next-character prediction.
Harness LLMs with Multi-Agent Programming
Paper notes, added occasionally.
Enable everyone to develop, optimize, and deploy AI models natively on their own devices.
A chatbot with a GUI, built on the Gemini API.
The RunPod worker template for serving our large language model endpoints. Powered by vLLM.