Control what LLMs can, and can't, say
Updated May 1, 2024 - TypeScript
Create an AI-powered IRC chat bot using llamafile, in minutes.
This repository demonstrates LLM execution on CPUs using packages like llamafile, emphasizing its low latency, high throughput, and cost-effectiveness for inference and serving.
A simple GitHub Actions script that builds a llamafile and uploads it to Hugging Face.