Custom commands need to be written twice. #1272

Closed
3 tasks done
eladamittai opened this issue May 12, 2024 · 5 comments
Labels: bug (Something isn't working)

Comments

@eladamittai

eladamittai commented May 12, 2024

Before submitting your bug report

Relevant environment info

- OS: Windows 10
- Continue: 0.9.126
- IDE: VS Code 1.86.2
- Model: deepseek 33b

Description

I've written a few slash commands, like /logs and /ut, and none of them work on the first try: the model just describes the code instead of following the command. In the same chat, the first time I use a command I only get a code explanation, and when I send it again it works. The same prompts and model work in the JetBrains extension.

To reproduce

Here is the config:

{
	"models": [
		{
			"title": "Deepseek",
			"provider": "openai",
			"model": "deepseek-33b-instruct:,
			"apiBase": "http://my-api-base/v1",
			"systemMessage": "You are an AI programming code-completion assistant, utilizing the deepseek coder model. You answer programming-related questions.",
			"apiKey": "my-apikey",
			"contextLength": 16000
		}
	],
	"completionOptions": {
		"temperature": 0.3
	},
	"customCommands": [
		{
			"name": "ut",
			"prompt": "{{{ input }}}\n\nAdd logs for the selected code. Give the new code just as chat output, don't edit any file.",
			"description": "Add logs to highlighted code"
		}
	]
}

Log output

No response

eladamittai added the bug label May 12, 2024
@sestinj
Contributor

sestinj commented May 13, 2024

@eladamittai I notice that there is a colon rather than a quote in your model name ("model": "deepseek-33b-instruct:,). Was this just a typo when copying over to GitHub issues?
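
If that colon is actually in your config, the corrected line would presumably read something like this (assuming "deepseek-33b-instruct" is the intended model name, inferred from your environment info):

"model": "deepseek-33b-instruct",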

Assuming that's not causing the config to not load, the first thing I would check is the prompt logs: https://docs.continue.dev/troubleshooting#llm-prompt-logs. This will tell us whether the instructions are being entirely left out, or if the problem is something else.

I would also recommend removing your system prompt. This is a detail particular to deepseek models: they automatically have a system prompt very similar to yours, and using a different one might be causing the model to act strangely.
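
For example, the same model entry from your config with the systemMessage field removed would look roughly like this (other values copied as-is; the model name again assumes "deepseek-33b-instruct" was intended):

{
	"models": [
		{
			"title": "Deepseek",
			"provider": "openai",
			"model": "deepseek-33b-instruct",
			"apiBase": "http://my-api-base/v1",
			"apiKey": "my-apikey",
			"contextLength": 16000
		}
	]
}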

sestinj self-assigned this May 13, 2024
@eladamittai
Author

eladamittai commented May 14, 2024

@sestinj hey, thanks for the quick reply! 🙏
I removed the system prompt and checked the logs, and the entire custom command prompt is being left out of the context. Instead of the prompt, it just puts /log before the context.
I also tried writing the prompt text directly in the chat instead of using the slash command, and that worked fine. Same with the /edit command. So it only happens with custom commands.

@sestinj
Copy link
Contributor

sestinj commented May 14, 2024

Got it! I think this is related to a fix I made just yesterday. I believe it should be available in the latest pre-release, 0.9.130.

@eladamittai
Author

@sestinj great! I'll check it out. Thanks!

@eladamittai
Author

It works!
