cli_demo.py 切换Baichuan-13B-Base 问答异常 #176

Open
cgq0816 opened this issue Sep 6, 2023 · 2 comments

cgq0816 commented Sep 6, 2023

Model loading code:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig

config = GenerationConfig.from_pretrained(
    "/data/model/Baichuan-13B-Base"
)
model = AutoModelForCausalLM.from_pretrained(
    "/data/model/Baichuan-13B-Base",
    torch_dtype=torch.float16,
    device_map="auto",
    trust_remote_code=True
)
model.generation_config = config
tokenizer = AutoTokenizer.from_pretrained(
    "/data/model/Baichuan-13B-Base",
    use_fast=False,
    trust_remote_code=True
)
```
I also applied the settings suggested in other issues:
[screenshot: configuration]
But it still raises the following error:
[screenshot: error message]
How can I use Baichuan-13B-Base for question answering?
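One likely cause, worth checking against the error screenshot: `cli_demo.py` is written for the Chat model and calls `model.chat()`, which the Base checkpoint does not provide. Base models complete raw text, so a common workaround is to build a few-shot prompt and call `generate()` directly. The sketch below (helper name and prompt format are illustrative, not from any official Baichuan API) shows the prompt-assembly part, which runs without the model weights:

```python
def build_qa_prompt(question, examples):
    """Assemble a few-shot QA prompt for a base (non-chat) model.

    Base models continue text rather than follow chat turns, so we
    prime them with worked question/answer pairs before the real
    question, ending with an open answer slot for the model to fill.
    """
    parts = [f"问：{q}\n答：{a}" for q, a in examples]
    parts.append(f"问：{question}\n答：")
    return "\n\n".join(parts)

# Sketch of generation with the model loaded above (parameters illustrative):
# prompt = build_qa_prompt("什么是机器学习？", [("1+1等于几？", "2")])
# inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# out = model.generate(**inputs, max_new_tokens=128, do_sample=False)
# print(tokenizer.decode(out[0][inputs["input_ids"].shape[1]:],
#                        skip_special_tokens=True))
```

Slicing the output at `input_ids.shape[1]` drops the echoed prompt so only the newly generated answer is decoded.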

@ltmAliCloud

I got the same error. Is there a fix?

@yangbiaoqiange

Same question: how do you do question answering with the base model?
