
Some errors: ValueError: Tokenizer class InternLMXComposerTokenizer does not exist or is not currently imported. #308

Open
Shuweis opened this issue May 9, 2024 · 1 comment


Shuweis commented May 9, 2024

I ran the quick inference demo from the GitHub repo and followed the install pipeline without any other steps, but I got this error:
```
Traceback (most recent call last):
  File "/home/ssw/DemoFusion/base_models/InternLM-XComposer/projects/ShareGPT4V/demo_share4v.py", line 24, in <module>
    eval_model(args)
  File "/home/ssw/DemoFusion/base_models/InternLM-XComposer/projects/ShareGPT4V/share4v/eval/run_share4v.py", line 31, in eval_model
    tokenizer, model, image_processor, context_len = load_pretrained_model(
  File "/home/ssw/DemoFusion/base_models/InternLM-XComposer/projects/ShareGPT4V/share4v/model/builder.py", line 114, in load_pretrained_model
    tokenizer = AutoTokenizer.from_pretrained(
  File "/home/ssw/anaconda3/envs/share4v/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 699, in from_pretrained
    raise ValueError(
ValueError: Tokenizer class InternLMXComposerTokenizer does not exist or is not currently imported.
```

Looking forward to your reply!
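This class of ValueError usually means transformers cannot find the custom tokenizer class named in the checkpoint's tokenizer_config.json inside the transformers package itself. A minimal sketch of the common workaround, assuming the checkpoint bundles its own tokenizer code (the model path below is a placeholder, not from this repo):

```python
def load_tokenizer(model_path: str):
    """Load a checkpoint-local tokenizer class such as InternLMXComposerTokenizer.

    Passing trust_remote_code=True tells transformers to import the tokenizer
    implementation shipped alongside the checkpoint, instead of requiring the
    class to exist inside the installed transformers package.
    """
    # Lazy import so this sketch can be read without transformers installed.
    from transformers import AutoTokenizer
    return AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)


# Usage (placeholder path):
# tokenizer = load_tokenizer("path/to/ShareCaptioner")
```

If the loading code cannot be changed, another common cause is a transformers version mismatch between the two environments, which matches the suspicion in the comment below.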


Shuweis commented May 9, 2024

I suspect the inference environment for sharecaptioner may differ from the one for sharegpt4v-7B? The latter model runs fine for me, but the former does not.
