
system messages don't work #199

Closed
ueartificial opened this issue Apr 27, 2024 · 5 comments
@ueartificial

System messages don't work. I tried both gemini-1.5 and gemini-1.0, and I tried both convert_system_message_to_human=True and convert_system_message_to_human=False. None of these work; the model doesn't follow the system message.

@lkuligin
Collaborator

Can you provide a reproducible example, please?

@ueartificial
Author

ueartificial commented Apr 28, 2024

```python
import os

from dotenv import load_dotenv

load_dotenv()

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_google_genai import ChatGoogleGenerativeAI

llm = ChatGoogleGenerativeAI(
    model="gemini-1.5-pro-latest",
    google_api_key=os.getenv("gemini_api_key"),
)
system = (
    "Your job is to generate a video title for a youtube channel with given topic. "
    "Title should be under 150 characters. Only answer with title, nothing else."
)
prompt = ChatPromptTemplate.from_messages(
    [("system", system), ("human", "Topic: {topic}")]
)
chain = prompt | llm | StrOutputParser()

response = chain.invoke({"topic": "Music Instruments"})
print(response)
```

It prints a very long essay (around 3,000 characters) about music instruments instead of the title I asked for. I have langchain-google-genai 1.0.3 installed, by the way. Adding convert_system_message_to_human=True when creating the llm doesn't change anything.

@lkuligin
Collaborator

You're right, I'm sorry about this, let me prepare a fix.

@ueartificial
Author

Thank you! It works now with "gemini-1.5-pro-latest" after adding client = self.client as in the fix.

However, it still errors with "gemini-1.0-pro-latest", since system prompts are not supported by that model, and in that case setting convert_system_message_to_human=True doesn't prevent the error.
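For context, here is a minimal pure-Python sketch of what convert_system_message_to_human=True is expected to do (this is an illustration of the intended behavior, not the library's actual implementation): since gemini-1.0-pro rejects a dedicated system role, the system text should be folded into the first human message before the request is sent.

```python
# Hypothetical sketch of the fold-system-into-human conversion.
# Messages are modeled as simple (role, content) tuples for illustration.

def fold_system_into_human(messages):
    """Merge a leading system message into the first human message."""
    if not messages or messages[0][0] != "system":
        return list(messages)  # nothing to convert
    system_text = messages[0][1]
    rest = list(messages[1:])
    if rest and rest[0][0] == "human":
        # Prepend the system instructions to the first human turn.
        rest[0] = ("human", f"{system_text}\n{rest[0][1]}")
        return rest
    # No human message to merge into: send the system text as a human turn.
    return [("human", system_text)] + rest

messages = [
    ("system", "Only answer with a title."),
    ("human", "Topic: Music Instruments"),
]
print(fold_system_into_human(messages))
# [('human', 'Only answer with a title.\nTopic: Music Instruments')]
```

With a conversion like this, the model never sees a system role at all, which is why the error from gemini-1.0-pro suggests the flag was not being applied on that code path.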

@lkuligin
Collaborator

It should be fixed now, please feel free to re-open if it's not.
