# aslakvs/INF2220-A2

## Installation

1. First, download the model: navigate to `/models` and run:

   ```shell
   wget https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML/resolve/main/llama-2-7b-chat.ggmlv3.q4_0.bin
   ```

2. Build the Docker image using the provided Dockerfile:

   ```shell
   docker build -t gordon_ramsai .
   ```

3. Run a Docker container from the image:

   ```shell
   docker run --gpus all --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 --rm -it -v $HOME/data:/data -p 50031:50031/tcp gordon_ramsai
   ```

Step 3 must be run from a VS Code terminal to use the proxy.
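The downloaded file is a 4-bit quantized GGML build of Llama-2-7B-Chat. For quick experimentation outside the container, it can be loaded with `llama-cpp-python` (note that GGML files require a release older than 0.1.79; later releases only accept GGUF). A minimal sketch, assuming the model was downloaded to `/models` as in step 1; the system/user prompts are illustrative placeholders:

```python
import os

def format_chat_prompt(system: str, user: str) -> str:
    """Wrap a message in the Llama-2 chat template the model was fine-tuned on."""
    return f"[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

# Assumed path, matching the wget step run from /models.
MODEL_PATH = "/models/llama-2-7b-chat.ggmlv3.q4_0.bin"

prompt = format_chat_prompt(
    "You are a helpful cooking assistant.",
    "How do I make a pan sauce?",
)

if os.path.exists(MODEL_PATH):
    # pip install "llama-cpp-python<0.1.79" for GGML support
    from llama_cpp import Llama
    llm = Llama(model_path=MODEL_PATH, n_ctx=2048)
    result = llm(prompt, max_tokens=256)
    print(result["choices"][0]["text"])
else:
    # Model not downloaded yet; just show the formatted prompt.
    print(prompt)
```

Inside the container, the service listens on port 50031 as mapped by the `docker run` command above.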
