
On Mac, Alpaca is stuck. Did I uninstall correctly? #469

Open
valeriarueda opened this issue Oct 23, 2023 · 0 comments
(Screenshot 2023-10-23 at 18.26.21.png — attachment did not finish uploading)

On my Mac, Alpaca is stuck and does not reply. Llama does respond, but Alpaca does not (see screenshot: the first request used Llama, the second used Alpaca).

How can I cleanly uninstall everything and try installing again? I deleted the "alpaca" folder inside the "dalai" folder and re-installed Alpaca (terminal command: npx dalai alpaca install 7B). I am not sure this was the correct approach.

Specs: Mac M2, 16 GB RAM, 1 TB storage (>800 GB free)
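For reference, deleting only the "alpaca" subfolder leaves the rest of the dalai state in place. A full clean removal usually means deleting the whole dalai tree plus the cached npx copy of the package — a sketch, assuming the default install locations (verify both paths before deleting):

```shell
# Remove the whole dalai directory: models, alpaca/llama checkouts, tmp dir.
rm -rf "$HOME/dalai"
# Remove the cached npx copies of the dalai package (seen in the stack
# traces below under ~/.npm/_npx/...), so the next npx run fetches fresh.
rm -rf "$HOME/.npm/_npx"
```

After this, `npx dalai alpaca install 7B` should start from a genuinely clean slate.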

See below what I had in the terminal after my second install:

mkdir /Users/MYUSER/dalai
{ method: 'install', callparams: [ '7B' ] }
2 [Error: ENOENT: no such file or directory, rename '/Users/MYUSER/dalai/alpaca/models' -> '/Users/MYUSER/dalai/tmp/models'] {
  errno: -2,
  code: 'ENOENT',
  syscall: 'rename',
  path: '/Users/MYUSER/dalai/alpaca/models',
  dest: '/Users/MYUSER/dalai/tmp/models'
}
3 [Error: ENOENT: no such file or directory, lstat '/Users/MYUSER/dalai/alpaca'] {
  errno: -2,
  code: 'ENOENT',
  syscall: 'lstat',
  path: '/Users/MYUSER/dalai/alpaca'
}
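The two ENOENT errors above are consistent with the "alpaca" folder having been deleted by hand: the installer first tries to move alpaca/models aside, and renaming a path that no longer exists fails with ENOENT (errno -2). A minimal reproduction in a scratch directory:

```shell
# Recreate the situation: destination parent exists, source path does not.
workdir=$(mktemp -d) && cd "$workdir"
mkdir -p tmp
mv alpaca/models tmp/models 2>err.txt   # source was never created
echo "exit status: $?"                  # non-zero: the rename failed
grep "No such file or directory" err.txt
```

So these two errors are harmless cleanup noise on a fresh install, not the cause of Alpaca hanging.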
mkdir /Users/MYUSER/dalai/alpaca
try fetching /Users/MYUSER/dalai/alpaca https://github.com/ItsPi3141/alpaca.cpp
[E] Pull TypeError: Cannot read properties of null (reading 'split')
    at new GitConfig (/Users/MYUSER/.npm/_npx/3c737cbb02d79cc9/node_modules/isomorphic-git/index.cjs:1610:30)
    at GitConfig.from (/Users/MYUSER/.npm/_npx/3c737cbb02d79cc9/node_modules/isomorphic-git/index.cjs:1633:12)
    at GitConfigManager.get (/Users/MYUSER/.npm/_npx/3c737cbb02d79cc9/node_modules/isomorphic-git/index.cjs:1756:22)
    at async _getConfig (/Users/MYUSER/.npm/_npx/3c737cbb02d79cc9/node_modules/isomorphic-git/index.cjs:5467:18)
    at async normalizeAuthorObject (/Users/MYUSER/.npm/_npx/3c737cbb02d79cc9/node_modules/isomorphic-git/index.cjs:5477:19)
    at async Object.pull (/Users/MYUSER/.npm/_npx/3c737cbb02d79cc9/node_modules/isomorphic-git/index.cjs:11761:20)
    at async Dalai.add (/Users/MYUSER/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:394:7)
    at async Dalai.install (/Users/MYUSER/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:346:5) {
  caller: 'git.pull'
}
try cloning /Users/MYUSER/dalai/alpaca https://github.com/ItsPi3141/alpaca.cpp
next alpaca [AsyncFunction: make]
exec: make in /Users/MYUSER/dalai/alpaca
make
exit

The default interactive shell is now zsh.
To update your account to use zsh, please run `chsh -s /bin/zsh`.
For more details, please visit https://support.apple.com/kb/HT208050.
bash-3.2$ make
I llama.cpp build info: 
I UNAME_S:  Darwin
I UNAME_P:  arm
I UNAME_M:  arm64
I CFLAGS:   -I.              -O3 -DNDEBUG -std=c11   -fPIC -pthread -DGGML_USE_ACCELERATE
I CXXFLAGS: -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -pthread
I LDFLAGS:   -framework Accelerate
I CC:       Apple clang version 14.0.3 (clang-1403.0.22.14.1)
I CXX:      Apple clang version 14.0.3 (clang-1403.0.22.14.1)

cc  -I.              -O3 -DNDEBUG -std=c11   -fPIC -pthread -DGGML_USE_ACCELERATE   -c ggml.c -o ggml.o
c++ -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -pthread -c utils.cpp -o utils.o
c++ -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -pthread main.cpp ggml.o utils.o -o main  -framework Accelerate
./main -h
usage: ./main [options]

options:
  -h, --help            show this help message and exit
  -i, --interactive     run in interactive mode
  --interactive-start   run in interactive mode and poll user input at startup
  -r PROMPT, --reverse-prompt PROMPT
                        in interactive mode, poll user input upon seeing PROMPT
  --color               colorise output to distinguish prompt and user input from generations
  -s SEED, --seed SEED  RNG seed (default: -1)
  -t N, --threads N     number of threads to use during computation (default: 4)
  -p PROMPT, --prompt PROMPT
                        prompt to start generation with (default: random)
  -f FNAME, --file FNAME
                        prompt file to start generation.
  -n N, --n_predict N   number of tokens to predict (default: 128)
  --top_k N             top-k sampling (default: 40)
  --top_p N             top-p sampling (default: 0.9)
  --repeat_last_n N     last n tokens to consider for penalize (default: 64)
  --repeat_penalty N    penalize repeat sequence of tokens (default: 1.3)
  -c N, --ctx_size N    size of the prompt context (default: 2048)
  --temp N              temperature (default: 0.1)
  -b N, --batch_size N  batch size for prompt processing (default: 8)
  -m FNAME, --model FNAME
                        model path (default: ggml-alpaca-7b-q4.bin)

c++ -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -pthread quantize.cpp ggml.o utils.o -o quantize  -framework Accelerate
bash-3.2$ exit
exit
alpaca.add [ '7B' ]
dir /Users/MYUSER/dalai/alpaca/models/7B
downloading torrent
ggml-model-q4_0.bin 100%[==================================================================================>] done      
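With the model re-downloaded, one way to check whether Alpaca responds at all outside the web UI is to run the built binary against the model directly — a sketch, assuming the default paths printed in the log above, using only flags from the ./main help text:

```shell
# Paths as created by the install log above (adjust if yours differ).
MODEL="$HOME/dalai/alpaca/models/7B/ggml-model-q4_0.bin"
MAIN="$HOME/dalai/alpaca/main"
if [ -f "$MODEL" ] && [ -x "$MAIN" ]; then
  # -m model path, -t threads, -n tokens to predict, --temp temperature
  "$MAIN" -m "$MODEL" -t 4 -n 128 --temp 0.1 -p "What is an alpaca?"
else
  echo "not installed: $MODEL"
fi
```

If this prints tokens but the web UI still hangs, the problem is in the dalai server layer rather than the model itself.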
