
Running the example demo with the latest version of MetaGPT fails with: openai.types.completion_usage.CompletionUsage() argument after ** must be a mapping, not NoneType #1252

Open
guyouyue opened this issue May 8, 2024 · 10 comments

Comments

@guyouyue

guyouyue commented May 8, 2024

```
C:\Users\glh19\MetaGPT\MetaGPT\Scripts\python.exe E:\code\python\MetaGPT\examples\di\custom_tool.py
2024-05-08 16:55:41.973 | INFO | metagpt.const:get_metagpt_package_root:29 - Package root set to E:\code\python\MetaGPT

[
    {
        "task_id": "1",
        "dependent_task_ids": [],
        "instruction": "Call the magic function with arguments 'A' and 2, then provide the result.",
        "task_type": "other"
    }
]
2024-05-08 16:56:09.817 | WARNING  | metagpt.utils.common:wrapper:649 - There is a exception in role's execution, in order to resume, we delete the newest role communication message in the role's memory.
Traceback (most recent call last):
  File "E:\code\python\MetaGPT\metagpt\utils\common.py", line 640, in wrapper
    return await func(self, *args, **kwargs)
  File "E:\code\python\MetaGPT\metagpt\roles\role.py", line 555, in run
    rsp = await self.react()
  File "E:\code\python\MetaGPT\metagpt\roles\role.py", line 526, in react
    rsp = await self._plan_and_act()
  File "E:\code\python\MetaGPT\metagpt\roles\di\data_interpreter.py", line 95, in _plan_and_act
    raise e
  File "E:\code\python\MetaGPT\metagpt\roles\di\data_interpreter.py", line 90, in _plan_and_act
    rsp = await super()._plan_and_act()
  File "E:\code\python\MetaGPT\metagpt\roles\role.py", line 486, in _plan_and_act
    await self.planner.update_plan(goal=goal)
  File "E:\code\python\MetaGPT\metagpt\strategy\planner.py", line 75, in update_plan
    rsp = await WritePlan().run(context, max_tasks=max_tasks)
  File "E:\code\python\MetaGPT\metagpt\actions\di\write_plan.py", line 49, in run
    rsp = await self._aask(prompt)
  File "E:\code\python\MetaGPT\metagpt\actions\action.py", line 93, in _aask
    return await self.llm.aask(prompt, system_msgs)
  File "E:\code\python\MetaGPT\metagpt\provider\base_llm.py", line 150, in aask
    rsp = await self.acompletion_text(message, stream=stream, timeout=self.get_timeout(timeout))
  File "C:\Users\glh19\MetaGPT\MetaGPT\lib\site-packages\tenacity\_asyncio.py", line 88, in async_wrapped
    return await fn(*args, **kwargs)
  File "C:\Users\glh19\MetaGPT\MetaGPT\lib\site-packages\tenacity\_asyncio.py", line 47, in __call__
    do = self.iter(retry_state=retry_state)
  File "C:\Users\glh19\MetaGPT\MetaGPT\lib\site-packages\tenacity\__init__.py", line 314, in iter
    return fut.result()
  File "D:\Develop\python\lib\concurrent\futures\_base.py", line 451, in result
    return self.__get_result()
  File "D:\Develop\python\lib\concurrent\futures\_base.py", line 403, in __get_result
    raise self._exception
  File "C:\Users\glh19\MetaGPT\MetaGPT\lib\site-packages\tenacity\_asyncio.py", line 50, in __call__
    result = await fn(*args, **kwargs)
  File "E:\code\python\MetaGPT\metagpt\provider\openai_api.py", line 155, in acompletion_text
    return await self._achat_completion_stream(messages, timeout=timeout)
  File "E:\code\python\MetaGPT\metagpt\provider\openai_api.py", line 105, in _achat_completion_stream
    usage = CompletionUsage(**chunk.usage)
TypeError: openai.types.completion_usage.CompletionUsage() argument after ** must be a mapping, not NoneType

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "E:\code\python\MetaGPT\examples\di\custom_tool.py", line 36, in <module>
    asyncio.run(main())
  File "D:\Develop\python\lib\asyncio\runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "D:\Develop\python\lib\asyncio\base_events.py", line 649, in run_until_complete
    return future.result()
  File "E:\code\python\MetaGPT\examples\di\custom_tool.py", line 30, in main
    await di.run("Just call the magic function with arg1 'A' and arg2 2. Tell me the result.")
  File "E:\code\python\MetaGPT\metagpt\utils\common.py", line 662, in wrapper
    raise Exception(format_trackback_info(limit=None))
Exception: Traceback (most recent call last):
  File "E:\code\python\MetaGPT\metagpt\utils\common.py", line 640, in wrapper
    return await func(self, *args, **kwargs)
  File "E:\code\python\MetaGPT\metagpt\roles\role.py", line 555, in run
    rsp = await self.react()
  File "E:\code\python\MetaGPT\metagpt\roles\role.py", line 526, in react
    rsp = await self._plan_and_act()
  File "E:\code\python\MetaGPT\metagpt\roles\di\data_interpreter.py", line 95, in _plan_and_act
    raise e
  File "E:\code\python\MetaGPT\metagpt\roles\di\data_interpreter.py", line 90, in _plan_and_act
    rsp = await super()._plan_and_act()
  File "E:\code\python\MetaGPT\metagpt\roles\role.py", line 486, in _plan_and_act
    await self.planner.update_plan(goal=goal)
  File "E:\code\python\MetaGPT\metagpt\strategy\planner.py", line 75, in update_plan
    rsp = await WritePlan().run(context, max_tasks=max_tasks)
  File "E:\code\python\MetaGPT\metagpt\actions\di\write_plan.py", line 49, in run
    rsp = await self._aask(prompt)
  File "E:\code\python\MetaGPT\metagpt\actions\action.py", line 93, in _aask
    return await self.llm.aask(prompt, system_msgs)
  File "E:\code\python\MetaGPT\metagpt\provider\base_llm.py", line 150, in aask
    rsp = await self.acompletion_text(message, stream=stream, timeout=self.get_timeout(timeout))
  File "C:\Users\glh19\MetaGPT\MetaGPT\lib\site-packages\tenacity\_asyncio.py", line 88, in async_wrapped
    return await fn(*args, **kwargs)
  File "C:\Users\glh19\MetaGPT\MetaGPT\lib\site-packages\tenacity\_asyncio.py", line 47, in __call__
    do = self.iter(retry_state=retry_state)
  File "C:\Users\glh19\MetaGPT\MetaGPT\lib\site-packages\tenacity\__init__.py", line 314, in iter
    return fut.result()
  File "D:\Develop\python\lib\concurrent\futures\_base.py", line 451, in result
    return self.__get_result()
  File "D:\Develop\python\lib\concurrent\futures\_base.py", line 403, in __get_result
    raise self._exception
  File "C:\Users\glh19\MetaGPT\MetaGPT\lib\site-packages\tenacity\_asyncio.py", line 50, in __call__
    result = await fn(*args, **kwargs)
  File "E:\code\python\MetaGPT\metagpt\provider\openai_api.py", line 155, in acompletion_text
    return await self._achat_completion_stream(messages, timeout=timeout)
  File "E:\code\python\MetaGPT\metagpt\provider\openai_api.py", line 105, in _achat_completion_stream
    usage = CompletionUsage(**chunk.usage)
TypeError: openai.types.completion_usage.CompletionUsage() argument after ** must be a mapping, not NoneType


Process finished with exit code 1
```
@guyouyue
Author

guyouyue commented May 8, 2024

My analysis so far: the error is caused by the `if hasattr(chunk, "usage")` and `elif hasattr(chunk.choices[0], "usage")` branches in the code below, which never check for `None`:

```python
async def _achat_completion_stream(self, messages: list[dict], timeout=USE_CONFIG_TIMEOUT) -> str:
    response: AsyncStream[ChatCompletionChunk] = await self.aclient.chat.completions.create(
        **self._cons_kwargs(messages, timeout=self.get_timeout(timeout)), stream=True
    )
    usage = None
    collected_messages = []
    async for chunk in response:
        chunk_message = chunk.choices[0].delta.content or "" if chunk.choices else ""  # extract the message
        finish_reason = (
            chunk.choices[0].finish_reason if chunk.choices and hasattr(chunk.choices[0], "finish_reason") else None
        )
        log_llm_stream(chunk_message)
        collected_messages.append(chunk_message)
        if finish_reason:
            if hasattr(chunk, "usage"):
                # Some services have usage as an attribute of the chunk, such as Fireworks
                usage = CompletionUsage(**chunk.usage)
            elif hasattr(chunk.choices[0], "usage"):
                # The usage of some services is an attribute of chunk.choices[0], such as Moonshot
                usage = CompletionUsage(**chunk.choices[0].usage)
            elif "openrouter.ai" in self.config.base_url:
                # due to it get token cost from api
                usage = await get_openrouter_tokens(chunk)
```
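For reference, a None-safe variant of that branch could look roughly like the sketch below. It mirrors the structure above and, like the original, assumes `chunk.usage` is a mapping; it is not the upstream fix.

```python
if finish_reason:
    # hasattr alone is not enough: the attribute can exist with a value of None,
    # which is exactly what makes CompletionUsage(**chunk.usage) fail.
    if getattr(chunk, "usage", None) is not None:
        # Some services have usage as an attribute of the chunk, such as Fireworks
        usage = CompletionUsage(**chunk.usage)
    elif chunk.choices and getattr(chunk.choices[0], "usage", None) is not None:
        # Some services attach usage to chunk.choices[0], such as Moonshot
        usage = CompletionUsage(**chunk.choices[0].usage)
    elif "openrouter.ai" in self.config.base_url:
        # OpenRouter reports token usage via a separate API call
        usage = await get_openrouter_tokens(chunk)
```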

@aa875982361

I have the same problem.

@huangpan2507

Yes, I have the same problem with metagpt version 0.8.1; it also happens after I install metagpt==0.7.1 with pip.

@nzk1912

nzk1912 commented May 9, 2024

Me too. See bug report #1250.

@CHENZHEN3078

Yeah, I have the same problem:

```
C:\Users\NEZ1SGH\AppData\Local\anaconda3\envs\myenv\python.exe C:\Users\NEZ1SGH\metagpt\MetaGPT\examples\hello_world.py
2024-05-10 16:33:10.087 | INFO | metagpt.const:get_metagpt_package_root:29 - Package root set to C:\Users\NEZ1SGH\metagpt\MetaGPT
2024-05-10 16:33:16.477 | INFO | __main__:ask_and_print:15 - Q: what's your name?
I am an AI assistant and you can call me Assistant. How can I help you today?Traceback (most recent call last):
  File "C:\Users\NEZ1SGH\metagpt\MetaGPT\examples\hello_world.py", line 45, in <module>
    asyncio.run(main())
  File "C:\Users\NEZ1SGH\AppData\Local\anaconda3\envs\myenv\lib\asyncio\runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "C:\Users\NEZ1SGH\AppData\Local\anaconda3\envs\myenv\lib\asyncio\base_events.py", line 647, in run_until_complete
    return future.result()
  File "C:\Users\NEZ1SGH\metagpt\MetaGPT\examples\hello_world.py", line 39, in main
    await ask_and_print("what's your name?", llm, "I'm a helpful AI assistant.")
  File "C:\Users\NEZ1SGH\metagpt\MetaGPT\examples\hello_world.py", line 16, in ask_and_print
    rsp = await llm.aask(question, system_msgs=[system_prompt])
  File "C:\Users\NEZ1SGH\metagpt\MetaGPT\metagpt\provider\base_llm.py", line 150, in aask
    rsp = await self.acompletion_text(message, stream=stream, timeout=self.get_timeout(timeout))
  File "C:\Users\NEZ1SGH\AppData\Local\anaconda3\envs\myenv\lib\site-packages\tenacity\_asyncio.py", line 88, in async_wrapped
    return await fn(*args, **kwargs)
  File "C:\Users\NEZ1SGH\AppData\Local\anaconda3\envs\myenv\lib\site-packages\tenacity\_asyncio.py", line 47, in __call__
    do = self.iter(retry_state=retry_state)
  File "C:\Users\NEZ1SGH\AppData\Local\anaconda3\envs\myenv\lib\site-packages\tenacity\__init__.py", line 314, in iter
    return fut.result()
  File "C:\Users\NEZ1SGH\AppData\Local\anaconda3\envs\myenv\lib\concurrent\futures\_base.py", line 439, in result
    return self.__get_result()
  File "C:\Users\NEZ1SGH\AppData\Local\anaconda3\envs\myenv\lib\concurrent\futures\_base.py", line 391, in __get_result
    raise self._exception
  File "C:\Users\NEZ1SGH\AppData\Local\anaconda3\envs\myenv\lib\site-packages\tenacity\_asyncio.py", line 50, in __call__
    result = await fn(*args, **kwargs)
  File "C:\Users\NEZ1SGH\metagpt\MetaGPT\metagpt\provider\openai_api.py", line 155, in acompletion_text
    return await self._achat_completion_stream(messages, timeout=timeout)
  File "C:\Users\NEZ1SGH\metagpt\MetaGPT\metagpt\provider\openai_api.py", line 105, in _achat_completion_stream
    usage = CompletionUsage(**chunk.usage)
TypeError: openai.types.completion_usage.CompletionUsage() argument after ** must be a mapping, not NoneType
```

@hs3180

hs3180 commented May 11, 2024

Adding a None check at this line avoids the exception: change `if hasattr(chunk, "usage"):` to `if hasattr(chunk, "usage") and chunk.usage is not None:`.
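To illustrate why `hasattr` alone is not enough here, a standalone sketch with a stand-in `Chunk` class (not MetaGPT code, only the `openai` package is assumed to be installed): an attribute can exist and still be `None`, in which case `**` unpacking fails.

```python
from openai.types.completion_usage import CompletionUsage


class Chunk:
    """Stand-in for a streaming chunk whose `usage` field is present but unset."""

    usage = None


chunk = Chunk()
print(hasattr(chunk, "usage"))  # True, even though the value is None

try:
    CompletionUsage(**chunk.usage)  # reproduces the reported TypeError
except TypeError as e:
    print(e)  # ... argument after ** must be a mapping, not NoneType

# The extra guard suggested above skips the unpacking entirely:
if hasattr(chunk, "usage") and chunk.usage is not None:
    CompletionUsage(**chunk.usage)
else:
    print("usage not available on this chunk, skipping")
```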
@baixiaolu

My temporary workaround: comment out the usage-related code around line 102 of MetaGPT/metagpt/provider/openai_api.py. There are three blocks in total:

  1. Lines 102-111: comment out the whole block starting with `if finish_reason:`.
  2. Lines 115-117: comment out the whole block starting with `if not usage:`.
  3. Line 119: comment out the single line `self._update_costs(usage)`.

You lose the usage calculation, but the code runs normally.

Note: if you rely heavily on usage accounting, think twice before using this workaround. An alternative that keeps cost tracking is sketched below.
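A minimal sketch of that alternative, assuming the names from the snippet quoted earlier (`usage`, `collected_messages`, `self._update_costs`) and that the collected chunks are joined into the final reply; this is not the upstream fix:

```python
# ...at the end of _achat_completion_stream, after the streaming loop
# (sketch only; names are taken from the snippet quoted earlier in this thread):
full_reply_content = "".join(collected_messages)

# Only report costs when the provider actually returned a usage object;
# this keeps cost tracking where it works and avoids the NoneType crash.
if usage is not None:
    self._update_costs(usage)

return full_reply_content
```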

@chg0901

chg0901 commented May 13, 2024

> Adding a None check at this line avoids the exception: change `if hasattr(chunk, "usage"):` to `if hasattr(chunk, "usage") and chunk.usage is not None:`.

One of my friends used this method and it worked for them, but not for me.

@chg0901

chg0901 commented May 13, 2024

> My temporary workaround: comment out the usage-related code around line 102 of MetaGPT/metagpt/provider/openai_api.py. There are three blocks in total:
>
>   1. Lines 102-111: comment out the whole block starting with `if finish_reason:`.
>   2. Lines 115-117: comment out the whole block starting with `if not usage:`.
>   3. Line 119: comment out the single line `self._update_costs(usage)`.
>
> You lose the usage calculation, but the code runs normally.
>
> Note: if you rely heavily on usage accounting, think twice before using this workaround.

This works for me. I use the DeepSeek API, which is OpenAI-compatible:

```yaml
llm:
  api_type: "openai"
  model: "deepseek-chat"
  base_url: "https://api.deepseek.com"
  api_key: ''
```
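If you want to check whether your OpenAI-compatible endpoint attaches a `usage` payload to streaming chunks at all (which is what the crashing branch tries to read), a quick standalone probe like the one below can help. The model name, base URL, and environment variable are placeholders; point them at whatever your MetaGPT config uses.

```python
import asyncio
import os

from openai import AsyncOpenAI  # openai >= 1.x async client


async def probe_stream_usage() -> None:
    # Placeholders: use the same provider and credentials as your MetaGPT config.
    client = AsyncOpenAI(
        api_key=os.environ["LLM_API_KEY"],
        base_url="https://api.deepseek.com",
    )
    stream = await client.chat.completions.create(
        model="deepseek-chat",
        messages=[{"role": "user", "content": "Say hi in one word."}],
        stream=True,
    )
    async for chunk in stream:
        # Print what the provider actually attaches to each chunk; many providers
        # set `usage` to None on every chunk except (possibly) the final one.
        print(type(chunk).__name__, "usage =", getattr(chunk, "usage", "missing"))


if __name__ == "__main__":
    asyncio.run(probe_stream_usage())
```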

@garylin2099
Collaborator

Hey guys, thanks for reporting the issue. For the latest main branch, it should be fixed by #1253. For the PyPI package metagpt==0.8.1, try using openai==1.6.1 as indicated in requirements.txt; the issue is probably due to an openai package upgrade. Please check whether this resolves it.
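A quick way to confirm which versions are actually installed in the environment MetaGPT runs in (a tiny check, not part of MetaGPT):

```python
from importlib.metadata import version

# If openai reports something newer than 1.6.1 while you are on metagpt==0.8.1,
# reinstalling with the version pinned in requirements.txt matches the
# combination suggested above.
print("openai", version("openai"))
print("metagpt", version("metagpt"))
```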
