
🚀 [Feature]: Add support for Prompt Templates #223

Open
ghost opened this issue Apr 25, 2023 · 3 comments

Comments

@ghost

ghost commented Apr 25, 2023

I'm pretty sure the Open-Assistant models added in 17308a8 require different prompt labels than the standard LLaMA models.
I couldn't find anything on their LLaMA Hugging Face repo, but in https://huggingface.co/OpenAssistant/stablelm-7b-sft-v7-epoch-3 I found this:

Prompting

Two special tokens are used to mark the beginning of user and assistant turns: <|prompter|> and <|assistant|>. Each turn ends with a <|endoftext|> token.

And I strongly believe this is the case for the LLaMA version as well, because when I played around with it, it sometimes output those strings.
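For context, here is a minimal sketch of what assembling a prompt with those tokens could look like. The special token strings come from the model card quoted above; the function name and the `history` format are made up for illustration and are not project code:

```python
def build_oasst_prompt(history: list[dict]) -> str:
    """Build an OpenAssistant-style prompt from a chat history.

    `history` is a list of {"role": "user" | "assistant", "content": str}.
    Each turn is wrapped in its role token and terminated with <|endoftext|>,
    as described in the OpenAssistant model card.
    """
    role_token = {"user": "<|prompter|>", "assistant": "<|assistant|>"}
    parts = []
    for turn in history:
        parts.append(f"{role_token[turn['role']]}{turn['content']}<|endoftext|>")
    # End with the assistant token so the model generates the next reply.
    parts.append("<|assistant|>")
    return "".join(parts)


if __name__ == "__main__":
    prompt = build_oasst_prompt(
        [{"role": "user", "content": "What is the capital of France?"}]
    )
    print(prompt)
    # <|prompter|>What is the capital of France?<|endoftext|><|assistant|>
```

A configurable prompt template would presumably let each model define its own role tokens and turn terminator instead of hard-coding the LLaMA-style labels.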

@gaby
Member

gaby commented Aug 14, 2023

@pabl-o-ce Is this the stuff you mentioned on Discord?

@pabl-o-ce
Contributor

yes :)

@k0gen
Contributor

k0gen commented Sep 7, 2023

If you're referring to renaming chats with a custom name, that would be a must-have feature.

@gaby gaby changed the title Allow user to change Prompt labels Add support for Prompt Templates Nov 14, 2023
@gaby gaby changed the title Add support for Prompt Templates 🚀 [Feature]: Add support for Prompt Templates Jan 2, 2024