
llamash

RESTful API Bridge for Ollama.

Build and Run

$ go build .
$ ./llamash

Setup

Before starting the bridge server, you need a running Ollama server, whose address is http://127.0.0.1:11434 by default.

$ podman run --network host ollama serve
$ ./llamash -p 11444 -i 'http://127.0.0.1:11434'

Enjoy it!

$ curl 'http://127.0.0.1:11444/generate?model=codellama&prompt=sayhi'

GET Form:

  • generate Generate content.
    • model The LLaMA model to use.
    • prompt The content to send to the model.

Responds in plain text.
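
For programmatic use, here is a minimal Go client sketch for the generate endpoint. It assumes the bridge is listening on 127.0.0.1:11444 as in the setup example above; adjust the address, model, and prompt for your own setup.

// generate_client.go: a minimal sketch of calling llamash's /generate
// endpoint from Go, assuming the bridge runs on 127.0.0.1:11444.
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"net/url"
)

func main() {
	// Build the query string with the model and prompt parameters.
	q := url.Values{}
	q.Set("model", "codellama")
	q.Set("prompt", "say hi")

	// Send the GET request to the bridge.
	resp, err := http.Get("http://127.0.0.1:11444/generate?" + q.Encode())
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// The bridge responds in plain text, so just read the body.
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(body))
}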
