
Welcome to the llama.cpp wiki!

Install

Arch Linux

yay -S llama-cpp
yay -S llama-cpp-cuda
yay -S llama-cpp-opencl
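
The binary names installed by these packages may differ from the upstream example names; to see exactly what a package put on your system, list its files:

pacman -Ql llama-cpp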

Nix

nix run github:ggerganov/llama.cpp
nix run 'github:ggerganov/llama.cpp#opencl'
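
Anything after -- is passed through to the llama.cpp binary built by the flake's default package, so a model can be run directly. A minimal sketch (the model path is a placeholder for a file you have downloaded):

nix run github:ggerganov/llama.cpp -- -m /path/to/model.gguf -p "Hello" -n 64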

NixOS

{ config, pkgs, ... }:
{
  nixpkgs.config.packageOverrides = pkgs: {
    llama-cpp = (
      builtins.getFlake "github:ggerganov/llama.cpp"
    ).packages.${builtins.currentSystem}.default;
  };
  environment.systemPackages = with pkgs; [ llama-cpp ];
}
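
Note that builtins.getFlake requires the flakes experimental feature to be enabled. After adding the override, apply the configuration:

sudo nixos-rebuild switch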

Android Termux

Wait for https://github.com/termux/termux-packages/pull/17457 to be merged. Once the package is available:

apt install llama-cpp

Windows MSYS2

pacman -S llama-cpp

Debian (Ubuntu)

git clone --depth=1 https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -D...
cmake --build build
cd build
cpack -G DEB
sudo dpkg -i *.deb
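
The -D... placeholder stands for whatever CMake options you want to configure the build with. A sketch of a CUDA-enabled release build (the LLAMA_CUBLAS option name reflects the source tree at the time of writing and may change; check the project README for the current options):

cmake -B build -DCMAKE_BUILD_TYPE=Release -DLLAMA_CUBLAS=ON
cmake --build build -j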

Red Hat

git clone --depth=1 https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -D...
cmake --build build
cd build
cpack -G RPM
sudo rpm -i *.rpm
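
cpack names the generated package after the project and version; the package contents can be listed before installing:

rpm -qlp *.rpm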

Users Guide

Useful information for users that doesn't fit into the README.

Technical Details

Information useful for maintainers and developers that does not fit into code comments.
