
How to make streaming calls to an already established Assistant? #85

Open
TomZhou2024 opened this issue Apr 2, 2024 · 7 comments
Assignees
Labels
help wanted Extra attention is needed

Comments

@TomZhou2024

How can I use the TaskingAI Python SDK to make streaming calls to an already established Assistant? Currently, I am using the following code:

# Start an async chat completion task with streaming

import time
import torch
import uvicorn
from pydantic import BaseModel, Field
from fastapi import FastAPI, HTTPException
from fastapi.middleware.cors import CORSMiddleware
from contextlib import asynccontextmanager
from typing import Any, Dict, List, Literal, Optional, Union
from transformers import AutoTokenizer, AutoModel, AutoModelForCausalLM

from taskingai.inference import a_chat_completion
from taskingai.inference import UserMessage, SystemMessage
from enum import Enum
import taskingai
from taskingai.assistant import Assistant
from taskingai.assistant.message import Message

class MessageRole(str, Enum):
    USER = "user"
    ASSISTANT = "assistant"

class MessageChunk(BaseModel):
    role: MessageRole = Field(...)
    index: int = Field(...)
    delta: str = Field(...)
    created_timestamp: int = Field(..., ge=0)

taskingai.init(api_key='tkOb2nWY19LJq2IpyXRnH82YRL4tv2sD', host='https://taskai.bnuzh.free.hr')

assistant: Assistant = taskingai.assistant.get_assistant(
    assistant_id="X5lMiuMf1AspX3GBKm1YIVN0"
)
chat = taskingai.assistant.create_chat(
    assistant_id=assistant.assistant_id,
)
assistant_message_response = taskingai.assistant.generate_message(
    assistant_id=assistant.assistant_id,
    chat_id=chat.chat_id,
    stream=True,
)

print(f"Assistant:", end=" ", flush=True)
for item in assistant_message_response:
    if isinstance(item, MessageChunk):
        print(item.delta, end="", flush=True)

@TomZhou2024 TomZhou2024 changed the title make streaming calls to an already established Assistant? How to make streaming calls to an already established Assistant? Apr 2, 2024
@jameszyao
Contributor

print(f"Assistant:", end=" ", flush=True)
for item in assistant_message_response:
    if isinstance(item, MessageChunk):
        print(item.delta, end="", flush=True)

The code appears mostly correct to me. You can substitute `print` with your own code to display or forward the message chunks.
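As a sketch of "substitute print with your own code", here is a small SDK-independent helper (names hypothetical, not part of the TaskingAI SDK) that forwards each chunk's text delta to a caller-supplied sink instead of printing it:

```python
from typing import Callable, Iterable


def forward_deltas(stream: Iterable, sink: Callable[[str], None]) -> None:
    """Forward each chunk's text delta to a caller-supplied sink."""
    for item in stream:
        delta = getattr(item, "delta", None)
        if delta is not None:
            sink(delta)


# Usage with a stub chunk type standing in for the SDK's stream items:
class StubChunk:
    def __init__(self, delta: str):
        self.delta = delta


parts: list = []
forward_deltas([StubChunk("Hello, "), StubChunk("world")], parts.append)
print("".join(parts))  # → Hello, world
```

The same `sink` callback could append to a buffer, push to a websocket, or yield over SSE, depending on how you want to surface the stream.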

@jameszyao jameszyao self-assigned this Apr 2, 2024
@jameszyao jameszyao added the help wanted Extra attention is needed label Apr 2, 2024
@Hapluckyy

Could you provide example code showing how to make streaming calls to an already established Assistant?
My code still can't output anything, and there is no error.
[screenshot]

@Hapluckyy

Here is my own code, written by myself:

import time
import torch
import uvicorn
from pydantic import BaseModel, Field
from fastapi import FastAPI, HTTPException
from fastapi.middleware.cors import CORSMiddleware
from contextlib import asynccontextmanager
from typing import Any, Dict, List, Literal, Optional, Union
from transformers import AutoTokenizer, AutoModel, AutoModelForCausalLM

from taskingai.inference import a_chat_completion
from taskingai.inference import UserMessage, SystemMessage
from enum import Enum
import taskingai
from taskingai.assistant import Assistant
from taskingai.assistant.message import Message

class MessageRole(str, Enum):
    USER = "user"
    ASSISTANT = "assistant"

class MessageChunk(BaseModel):
    role: MessageRole = Field(...)
    index: int = Field(...)
    delta: str = Field(...)
    created_timestamp: int = Field(..., ge=0)

taskingai.init(api_key='', host='')

assistant: Assistant = taskingai.assistant.get_assistant(
    assistant_id="X5lMTTrPQuhuKx4xLIziAAnG"
)

chat = taskingai.assistant.create_chat(
    assistant_id=assistant.assistant_id,
)
taskingai.assistant.create_message(
    assistant_id=assistant.assistant_id,
    chat_id=chat.chat_id,
    text="who are you?"
)
assistant_message_response = taskingai.assistant.generate_message(
    assistant_id=assistant.assistant_id,
    chat_id=chat.chat_id,
    stream=True,
)
print(assistant_message_response)
# print(f"Assistant:", end=" ", flush=True)
for item in assistant_message_response:
    if isinstance(item, MessageChunk):
        #print(111)
        print(item.delta, end="", flush=True)

@Hapluckyy

Is there something wrong with my MessageChunk class?

@Hapluckyy

If I remove the following conditional statement, the code block will be able to return the expected result for the given question.

if isinstance(item, MessageChunk):

However, the result does not seem to arrive in a streaming manner: I have to wait for a while and then receive all the items at once before they are printed.

@DynamesC
Collaborator

DynamesC commented Apr 15, 2024

@Hapluckyy @TomZhou2024
This is because your MessageChunk is defined locally, not by the TaskingAI client SDK, and it is impossible for our client SDK to return objects of a class defined by its user.

The `assistant_message_response` in your code is a Stream object, and the correct way to import MessageChunk is `from taskingai.assistant import MessageChunk`. You should use that class rather than writing your own.
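The symptom can be reproduced without the SDK at all: `isinstance` compares class objects, not class names, so a locally defined `MessageChunk` never matches chunk instances created from the SDK's own class. A minimal sketch with stand-in classes (no TaskingAI dependency, names hypothetical):

```python
# Stand-in for the SDK's chunk class, living in a different namespace
class _SdkMessageChunk:
    def __init__(self, delta: str):
        self.delta = delta


def fake_stream():
    """Simulate a streaming response that yields SDK-defined chunks."""
    for d in ["Hel", "lo"]:
        yield _SdkMessageChunk(d)


# A local look-alike class, as in the code above
class MessageChunk:
    pass


# The filter silently drops every chunk: same name, different class object
received = [item.delta for item in fake_stream() if isinstance(item, MessageChunk)]
print(received)  # → []
```

Importing the SDK's own class (`from taskingai.assistant import MessageChunk`) makes the `isinstance` check match the streamed items again.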

@Hapluckyy
Copy link

Much appreciated!
Problem solved!
