How do I write a Prompt Template correctly? #2454
Replies: 1 comment · 4 replies
-
🤖 Hi, @ChiNoel-osu! I'm Dosu, a bot that can help you solve problems, answer questions, and become a contributor. Although I'm a bot, I'm here to help while we wait for a human maintainer. I look forward to a friendly and productive exchange. When writing a Prompt Template you need to follow Jinja2 syntax, wrapping variables in double curly braces, e.g. `{{ variable }}`.
Variables supported in knowledge-base and search-engine chat are:
Variables supported in Agent chat are:
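To illustrate the double-brace convention, here is a minimal stdlib-only sketch. The project itself renders templates with Jinja2, so this `render` function is only an illustrative stand-in, not project code:

```python
import re

def render(template: str, **variables) -> str:
    """Replace every {{ name }} placeholder with the matching keyword
    argument; unknown names are left untouched (a simplification of
    what Jinja2 actually does)."""
    def substitute(match: re.Match) -> str:
        name = match.group(1)
        return str(variables.get(name, match.group(0)))
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", substitute, template)

print(render("Question: {{ question }}", question="What is AWQ?"))
# Question: What is AWQ?
```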
Here is an example Prompt Template for the zephyr-7B-beta-AWQ model:

```python
PROMPT_TEMPLATES = {
    "zephyr_7B_beta_AWQ": {
        "default":
            'You are a knowledgeable AI with access to a vast amount of information. '
            'Please answer the following question to the best of your ability:\n\n'
            'Question: {{ question }}\n'
            'Context: {{ context }}\n',
        "with_history":
            'You are a knowledgeable AI with access to a vast amount of information. '
            'You are currently in a conversation with a human. Here is the conversation history:\n\n'
            '{{ history }}\n'
            'Human: {{ input }}\n'
            'AI:',
        "py":
            'You are a smart code assistant, please write me some simple py code. \n'
            '{{ input }}',
    },
}
```

In this example, the "default" template is used when there is no conversation history, the "with_history" template when there is, and the "py" template when the user asks for Python code. (Note that every placeholder uses the double-brace Jinja2 form; single-brace tokens such as `{history}` would not be substituted.) Replace "zephyr_7B_beta_AWQ" with the name of the model you actually use. References: If you still run into problems, feel free to keep asking.
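Picking one of the sub-templates can be sketched as a plain dict lookup. `choose_template` below is a hypothetical helper for illustration only; the real project wires this selection up differently:

```python
# Mirrors the PROMPT_TEMPLATES example above (abbreviated bodies).
PROMPT_TEMPLATES = {
    "zephyr_7B_beta_AWQ": {
        "default": "Question: {{ question }}\nContext: {{ context }}\n",
        "with_history": "{{ history }}\nHuman: {{ input }}\nAI:",
    },
}

def choose_template(model, history):
    """Hypothetical helper: pick 'with_history' when a non-empty
    history is present, otherwise fall back to 'default'."""
    templates = PROMPT_TEMPLATES[model]
    key = "with_history" if history else "default"
    return templates[key]

print(choose_template("zephyr_7B_beta_AWQ", history=None))
```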
-
The model I'm using is zephyr-7B-beta-AWQ.
Its Prompt Template looks like this:
I added the following prompt in
configs\prompt_config.py
but the LLM's answers seem off, and sometimes an extra "{ output }" appears at the front. I suspect the prompt isn't set up correctly.
I also tried knowledge-base Q&A:
the retrieved text is correct, but the answers are much worse than with other projects I've used. Can anyone tell me where I went wrong? 😪
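One possible explanation for the stray "{ output }" (an assumption, not a confirmed diagnosis): Jinja2-style rendering only substitutes double-brace placeholders, so a single-brace token in the template survives verbatim and can leak into the model's context and reply. A quick demonstration using a stdlib stand-in for the renderer:

```python
import re

def render(template, **variables):
    """Stand-in for Jinja2-style rendering: only {{ name }} tokens
    are substituted; anything else passes through unchanged."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables.get(m.group(1), m.group(0))),
        template,
    )

broken = "{ output }\n{{ input }}"  # single braces: NOT a Jinja2 variable
print(render(broken, input="write py"))
# { output }
# write py
```

If your template in `prompt_config.py` contains single-brace placeholders, switching them to the double-brace form may resolve the stray text.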