2024-04-14T16:54:13+0800 [ERROR] [runner:llm-mistral-runner:1] An exception occurred while instantiating runner 'llm-mistral-runner', see details below:
2024-04-14T16:54:13+0800 [ERROR] [runner:llm-mistral-runner:1] Traceback (most recent call last):
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/bentoml/_internal/runner/runner.py", line 307, in init_local
self._set_handle(LocalRunnerRef)
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/bentoml/_internal/runner/runner.py", line 150, in _set_handle
runner_handle = handle_class(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/bentoml/_internal/runner/runner_handle/local.py", line 27, in __init__
self._runnable = runner.runnable_class(**runner.runnable_init_params) # type: ignore
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/openllm/_runners.py", line 165, in __init__
self.llm, self.config, self.model, self.tokenizer = llm, llm.config, llm.model, llm.tokenizer
^^^^^^^^^
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/openllm/_llm.py", line 459, in model
model = openllm.serialisation.load_model(self, *self._model_decls, **self._model_attrs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/openllm/serialisation/__init__.py", line 63, in caller
return getattr(importlib.import_module(f'.{serde}', 'openllm.serialisation'), fn)(llm, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/openllm/serialisation/transformers/__init__.py", line 97, in load_model
auto_class = infer_autoclass_from_llm(llm, config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/openllm/serialisation/transformers/_helpers.py", line 33, in infer_autoclass_from_llm
raise ValueError(
ValueError: Invalid configuration for mistralai/Mistral-7B-Instruct-v0.1. ``trust_remote_code=True`` requires `transformers.PretrainedConfig` to contain a `auto_map` mapping
2024-04-14T16:54:13+0800 [ERROR] [runner:llm-mistral-runner:1] Traceback (most recent call last):
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/starlette/routing.py", line 732, in lifespan
async with self.lifespan_context(app) as maybe_state:
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/contextlib.py", line 210, in __aenter__
return await anext(self.gen)
^^^^^^^^^^^^^^^^^^^^^
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/bentoml/_internal/server/base_app.py", line 75, in lifespan
on_startup()
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/bentoml/_internal/runner/runner.py", line 317, in init_local
raise e
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/bentoml/_internal/runner/runner.py", line 307, in init_local
self._set_handle(LocalRunnerRef)
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/bentoml/_internal/runner/runner.py", line 150, in _set_handle
runner_handle = handle_class(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/bentoml/_internal/runner/runner_handle/local.py", line 27, in __init__
self._runnable = runner.runnable_class(**runner.runnable_init_params) # type: ignore
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/openllm/_runners.py", line 165, in __init__
self.llm, self.config, self.model, self.tokenizer = llm, llm.config, llm.model, llm.tokenizer
^^^^^^^^^
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/openllm/_llm.py", line 459, in model
model = openllm.serialisation.load_model(self, *self._model_decls, **self._model_attrs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/openllm/serialisation/__init__.py", line 63, in caller
return getattr(importlib.import_module(f'.{serde}', 'openllm.serialisation'), fn)(llm, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/openllm/serialisation/transformers/__init__.py", line 97, in load_model
auto_class = infer_autoclass_from_llm(llm, config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/billyle/.pyenv/versions/3.12.2/lib/python3.12/site-packages/openllm/serialisation/transformers/_helpers.py", line 33, in infer_autoclass_from_llm
raise ValueError(
ValueError: Invalid configuration for mistralai/Mistral-7B-Instruct-v0.1. ``trust_remote_code=True`` requires `transformers.PretrainedConfig` to contain a `auto_map` mapping
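For context, the ValueError comes from OpenLLM's autoclass inference: with `trust_remote_code=True` it expects the Hub config to declare an `auto_map` pointing at custom modeling code, which `mistralai/Mistral-7B-Instruct-v0.1` (a standard transformers architecture) does not have. A minimal sketch of that check, written here for illustration only and not taken from OpenLLM's actual source:

```python
# Hypothetical reconstruction of the failing check in infer_autoclass_from_llm:
# trust_remote_code=True is only meaningful when the model's config.json maps
# architectures to custom code via an `auto_map` entry.
def check_trust_remote_code(config: dict, trust_remote_code: bool) -> None:
    if trust_remote_code and "auto_map" not in config:
        raise ValueError(
            "``trust_remote_code=True`` requires the config to contain an `auto_map` mapping"
        )

# Mistral-7B-Instruct-v0.1's config declares a built-in architecture and no
# `auto_map`, so the check raises:
mistral_like = {"model_type": "mistral", "architectures": ["MistralForCausalLM"]}
try:
    check_trust_remote_code(mistral_like, trust_remote_code=True)
except ValueError as exc:
    print(f"raised: {exc}")
```

If that reading is right, starting the model without `TRUST_REMOTE_CODE=True` would likely sidestep the check, since no custom code needs to be trusted for this model.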
Describe the bug
Fresh install of OpenLLM on Python 3.12.2. Running the command:
TRUST_REMOTE_CODE=True openllm start mistralai/Mistral-7B-Instruct-v0.1
fails at startup with the ValueError shown in the logs above.
To reproduce
brew install pyenv
git clone https://github.com/pyenv/pyenv-virtualenv.git $(pyenv root)/plugins/pyenv-virtualenv
echo 'eval "$(pyenv virtualenv-init -)"' >> ~/.zshrc
pyenv install 3.12.2
pyenv global 3.12.2
pip install openllm
TRUST_REMOTE_CODE=True openllm start mistralai/Mistral-7B-Instruct-v0.1
Logs
See the full tracebacks at the top of this report.
Environment
Environment variable
System information
bentoml: 1.1.11
python: 3.12.2
platform: macOS-14.4.1-arm64-arm-64bit
uid_gid: 501:20
pip_packages:
System information (Optional)
No response