Replies: 1 comment 1 reply
- did you manage to solve this?
- I would like to stream the answer using a custom callback handler. The way I do that with LangChain is by creating a `CustomHandler` that extends `StreamingStdOutCallbackHandler`, redefining `on_llm_new_token` so that it emits each token to socket.io, and setting the handler in the `ChatOpenAI` constructor like this: `llm = ChatOpenAI(streaming=True, callback_manager=CallbackManager([handler]), verbose=True)`, where `handler` is an instance of the `CustomHandler`.
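For concreteness, here is a minimal sketch of that setup (the socket.io object `sio` and the event name `"token"` are placeholders for my actual wiring):

```python
from langchain.callbacks.manager import CallbackManager  # older releases: langchain.callbacks.base
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.chat_models import ChatOpenAI

class CustomHandler(StreamingStdOutCallbackHandler):
    """Relay each newly generated token to the client over socket.io."""

    def __init__(self, sio):
        super().__init__()
        self.sio = sio  # placeholder: an already-configured socket.io server

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        # Called once per streamed token; emit it instead of printing to stdout.
        self.sio.emit("token", {"token": token})

handler = CustomHandler(sio)  # `sio` is assumed to exist
llm = ChatOpenAI(
    streaming=True,
    callback_manager=CallbackManager([handler]),
    verbose=True,
)
```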
I have tried doing the same with Langflow, by passing the parameter in the tweaks (using `load_flow_from_json()`), but when I debug the flow, the `callback_manager` parameter is empty. Is there a way yet to set that parameter through the tweaks, or is streaming with a custom handler not supported yet?
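Roughly, the attempt looks like this (the node ID `"ChatOpenAI-XXXX"` is a placeholder for the real ID in my exported flow JSON; `CustomHandler` and `sio` are as above):

```python
from langflow import load_flow_from_json

tweaks = {
    "ChatOpenAI-XXXX": {  # placeholder for the ChatOpenAI node's ID
        "callback_manager": CallbackManager([CustomHandler(sio)]),
    }
}

flow = load_flow_from_json("flow.json", tweaks=tweaks)
result = flow("Hello!")  # debugging here shows callback_manager arriving empty
```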
Here is my tweaks declaration (essentially as sketched above).

Using langflow-0.2.7.
Update:

I have formatted a generated JSON to read it, and I have noticed this:

So, I have tried adding my custom CallbackHandler in the tweaks, but when I debug the flow, `callbacks` is an empty list.

And when I `print(flow)`, this is the output:

I want to note that by adding `'streaming'` to the tweaks, I was able to change its value on the `ChatOpenAI` llm object, but I'm not able to do the same with either the `callbacks` or the `callback_manager` values.
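In other words (placeholder node ID again):

```python
# Takes effect: the built flow's ChatOpenAI object ends up with streaming=True.
tweaks = {"ChatOpenAI-XXXX": {"streaming": True}}

# Does not take effect: after the flow is built, callbacks is still an empty list.
tweaks = {"ChatOpenAI-XXXX": {"callbacks": [CustomHandler(sio)]}}
```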
I am not sure if I'm supposed to add it this way, or in the `model_kwargs` tweak option?