
Inquiry on Inserting P-Tuning Soft Prompts at Any Position in Input #1669

Closed · 2 of 4 tasks

Betty1202 opened this issue Apr 23, 2024 · 2 comments

Comments

Betty1202 commented Apr 23, 2024

System Info

None

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder
  • My own task or dataset (give details below)

Reproduction

None

Expected behavior

How can I insert the P-tuning soft prompt into the input at an arbitrary position (not just at the beginning)? Your documentation at https://huggingface.co/docs/peft/main/en/package_reference/p_tuning states that "The prompt tokens can be added anywhere in the input sequence." However, I could not find any parameter in PromptEncoderConfig or PromptEncoder that controls this. Could you please advise on how to set this up?
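For reference, here is a minimal sketch of how I currently set up P-tuning (the base model and hyperparameter values are only illustrative). As far as I can tell, nothing in this config specifies where the virtual tokens go; they always end up prepended to the input:

```python
from transformers import AutoModelForCausalLM
from peft import PromptEncoderConfig, get_peft_model

# Illustrative base model; any causal LM shows the same behavior.
base_model = AutoModelForCausalLM.from_pretrained("gpt2")

# Standard P-tuning config. As far as I can tell, no field here
# controls *where* the virtual tokens are inserted; in practice
# they are prepended to the input embeddings.
peft_config = PromptEncoderConfig(
    task_type="CAUSAL_LM",
    num_virtual_tokens=20,    # illustrative value
    encoder_hidden_size=128,  # illustrative value
)

model = get_peft_model(base_model, peft_config)
model.print_trainable_parameters()
```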

BenjaminBossan (Member) commented

Ping @pacman100.

github-actions bot commented

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

github-actions bot closed this as completed Jun 1, 2024