
Selecting any of the trending memecoins throws Cannot destructure property 'role' error #325

Open
athrael-soju opened this issue May 3, 2024 · 20 comments

Comments

@athrael-soju

athrael-soju commented May 3, 2024

@jeremyphilemon although #324 resolves the majority of issues from using streamUI over render, this one still remains.

Simply get the trending memecoins to show and select one of them. Error attached.

[screenshot: error message]

@prashantbhudwal

prashantbhudwal commented May 4, 2024

@jeremyphilemon @jaredpalmer

The messages sent to OpenAI are not formatted properly:

Invalid parameter: messages with role 'tool' must be a response to a preceeding message with 'tool_calls'.

I was testing; the following works:

const toolCallMessage: CoreMessage = {
  role: "assistant",
  content: [
    {
      type: "tool-call",
      toolName: "showStockPurchase",
      toolCallId: "8Bb6oJ1vAIRuHSIAVYmAp",
      // args must be the concrete argument values, not a zod schema
      args: {
        symbol: "AAPL",
        price: 150,
        defaultAmount: 100,
      },
    },
  ],
};

const toolMessage: CoreMessage = {
  role: "tool",
  content: [
    {
      type: "tool-result",
      toolCallId: "8Bb6oJ1vAIRuHSIAVYmAp",
      toolName: "showStockPurchase",
      result: JSON.stringify({
        symbol: "AAPL",
        price: 150,
        defaultAmount: 100,
        status: "completed",
      }),
    },
  ],
};

const all = [...messages, toolCallMessage, toolMessage];
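The pairing rule behind the OpenAI error quoted above can be sketched with a small validator. This is illustrative only; the message shape and helper below are hypothetical, not the SDK's actual types:

```typescript
// Hypothetical, simplified message shape (the real CoreMessage type is richer).
type Msg =
  | { role: "user" | "system"; content: string }
  | { role: "assistant"; content: string; toolCallIds?: string[] }
  | { role: "tool"; toolCallId: string };

// Every 'tool' message must answer a tool call made by the most recent
// assistant message; any other ordering triggers the API error quoted above.
function toolMessagesAreValid(messages: Msg[]): boolean {
  let pending: string[] = [];
  for (const m of messages) {
    if (m.role === "assistant") {
      pending = m.toolCallIds ?? [];
    } else if (m.role === "tool") {
      if (!pending.includes(m.toolCallId)) return false;
    } else {
      pending = [];
    }
  }
  return true;
}
```

A history ending in [user, assistant-with-tool-call, tool] passes; a tool message with no preceding tool call fails.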

@athrael-soju
Author

@prashantbhudwal Could you raise a PR please?

@prashantbhudwal

@athrael-soju
This is not a solution, just a problem description. I'm still learning this, so I don't think I'm the best person to raise the PR.

@hemik000

hemik000 commented May 5, 2024

Same error. Any solution?

@Spectralgo

Spectralgo commented May 5, 2024

Same kind of error here, with this message: Cannot destructure property 'role' of '.for' as it is undefined at convertToOpenAIChatMessage

The app can't reply after showing me a card.

[screenshot: error message]

Time for me to study some docs 📃 and try some implementations !! GLHF

UPDATE:

I checked out the previous commit and the bug disappears:
[screenshot]

@hemik000

hemik000 commented May 5, 2024

Because the new SDK uses streamUI:

const result = await streamUI({

and the old one uses render:

const ui = render({

@athrael-soju
Author

> Because the new SDK uses streamUI and the old one uses render

The bug appears when using streamUI, not render. If you revert to the previous commit it works fine with render.

@DanielhCarranza

I have the same error after clicking on the first generated component. I updated everything, even Next.js, but it didn't work.

@fullstackwebdev

Yes, this is an issue with the new code. I guess it should be reverted or fixed? I tried a few things to resolve it, but wasn't able to.

@JoseAngelChepo

The latest update, "streamUI instead of render #324", no longer allows messages with role: "system" in the middle of the conversation (only at the start, via the new system prompt param).

The problem: the functions also use messages with role: "system" to register user actions or events in the conversation context.

The Cannot destructure property 'role' error is a consequence of the function below, which returns undefined for messages with role: "system".

This function is inside the dependency "ai": "^3.1.1":

// core/prompt/convert-to-language-model-prompt.ts
function convertToLanguageModelPrompt(prompt) {
  const languageModelMessages = [];
  if (prompt.system != null) {
    languageModelMessages.push({ role: "system", content: prompt.system });
  }
  switch (prompt.type) {
    case "prompt": {
      languageModelMessages.push({
        role: "user",
        content: [{ type: "text", text: prompt.prompt }]
      });
      break;
    }
    case "messages": {
      languageModelMessages.push(
        ...prompt.messages.map((message) => {
          switch (message.role) {
            case "user": {
              if (typeof message.content === "string") {
                return {
                  role: "user",
                  content: [{ type: "text", text: message.content }]
                };
              }
              return {
                role: "user",
                content: message.content.map(
                  (part) => {
                    var _a;
                    switch (part.type) {
                      case "text": {
                        return part;
                      }
                      case "image": {
                        if (part.image instanceof URL) {
                          return {
                            type: "image",
                            image: part.image,
                            mimeType: part.mimeType
                          };
                        }
                        const imageUint8 = convertDataContentToUint8Array(
                          part.image
                        );
                        return {
                          type: "image",
                          image: imageUint8,
                          mimeType: (_a = part.mimeType) != null ? _a : detectImageMimeType(imageUint8)
                        };
                      }
                    }
                  }
                )
              };
            }
            case "assistant": {
              if (typeof message.content === "string") {
                return {
                  role: "assistant",
                  content: [{ type: "text", text: message.content }]
                };
              }
              return { role: "assistant", content: message.content };
            }
            case "tool": {
              return message;
            }
          }
        })
      );
      break;
    }
    default: {
      const _exhaustiveCheck = prompt;
      throw new Error(`Unsupported prompt type: ${_exhaustiveCheck}`);
    }
  }
  return languageModelMessages;
}

I tested this in the compiled module node_modules/ai/rsc/dist/rsc-server.mjs by adding the following case to convertToLanguageModelPrompt:

case "system": {
    return message;
}

Result:
[Screenshot, 2024-05-06 2:01 PM]

This works, but the correction must be made in the repository of the dependency "ai": "^3.1.1".

It also depends on whether sending system messages within the conversation to save new context (user events) is considered good practice.
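The effect of the missing switch case can be reproduced with a self-contained sketch. The types and function below are simplified stand-ins, not the SDK's actual code:

```typescript
// Simplified stand-in for the role-mapping switch shown above. Without the
// "system" case, the switch falls through, the map produces undefined, and a
// later destructure of `role` crashes.
type InMsg = { role: "user" | "assistant" | "tool" | "system"; content: string };
type OutMsg = { role: string; content: unknown };

function mapMessage(message: InMsg): OutMsg {
  switch (message.role) {
    case "user":
      return { role: "user", content: [{ type: "text", text: message.content }] };
    case "assistant":
      return { role: "assistant", content: [{ type: "text", text: message.content }] };
    case "tool":
    case "system": // the previously missing case: pass the message through
      return message;
  }
}
```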

@kevb10

kevb10 commented May 8, 2024

The above didn't solve my problem. There is a phantom object that is undefined on my end, and that's what causes the error:
[screenshot]

Obviously a little sanity check addresses this, but I'm curious where and why an operation returns undefined in the first place, so it can be fixed there instead.
[screenshot]
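The sanity check being discussed presumably amounts to dropping undefined entries before the converter destructures role. A sketch under that assumption (the helper name is hypothetical):

```typescript
type ChatMsg = { role: string; content: unknown };

// Skip null/undefined entries so the converter never destructures `role`
// from undefined. This hides the symptom; the real fix is to stop
// producing phantom messages upstream.
function dropPhantomMessages(
  messages: Array<ChatMsg | undefined | null>,
): ChatMsg[] {
  return messages.filter(
    (m): m is ChatMsg => m != null && typeof m.role === "string",
  );
}
```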

@JoseAngelChepo

JoseAngelChepo commented May 8, 2024

@kevb10 you can track your messages in the streamUI function located in node_modules/ai/rsc/dist/rsc-server.mjs:

[Screenshot, 2024-05-08 11:23 AM]

Check the messages before and after:

[Screenshot, 2024-05-08 11:27 AM]

You can also test the function convertToLanguageModelPrompt(validatedPrompt) to check which message returns undefined.

P.S. Remember that you are editing the dependency.

@Spectralgo

Spectralgo commented May 8, 2024

Thanks @kevb10. It didn't solve my problem either, but it's nice to see progress on this issue

Update: I implemented your sanity check and it works for me.

@rbenhase

rbenhase commented May 8, 2024

@kevb10's sanity check also works for me, using patch-package to create a patch file instead of editing node_modules directly (this way, I won't lose my changes as soon as I run npm update or npm install). Still not an ideal solution, obviously, but good enough for the time being.

Just ran
npm install patch-package postinstall-postinstall --save-dev

Followed by:
npx patch-package @ai-sdk/openai

And the issue went away. Still interested in a more permanent solution, though.

hemik000 added a commit to hemik000/ai-chatbot that referenced this issue May 9, 2024
@prashantbhudwal

Can anyone please explain why https://chat.vercel.ai/ works perfectly fine, but the main branch on localhost does not?

Isn't the main branch of the repo deployed to that domain?

@hemik000

hemik000 commented May 9, 2024

Maybe because that deployment points to an older commit.

@ar-radcliff

ar-radcliff commented May 10, 2024

Just noting that I'm also running into this bug and have been troubleshooting it today. It affects any component that makes use of 'function' or 'system' role messages, and it happens after an unsupported message role has been inserted into the chat history (which creates an undefined chat message).

What I've found mirrors what other commenters have noted: adjusting the convertToLanguageModelPrompt() function in node_modules/ai/rsc/dist/rsc-server.mjs to handle more of the message types defined in actions.tsx (specifically 'function' and 'system') seems to eliminate the error. It's not a good solution, though, since it's a workaround inside the dependency, and I'm not clear what other effects that change might have.

Something in the RSC framework is choking on message types of 'function', 'tool', 'data', and in some cases 'system', causing them to become undefined messages in the chat history array, which creates downstream problems elsewhere in the app.

[screenshots]

I'm also not clear on whether those roles are intended to be functional in the RSC or with GPT-4 and just aren't working here yet, or if the app is trying to do something it shouldn't be and using the roles incorrectly. The message role functionality is still very new to me!

@hemik000

hemik000 commented May 11, 2024

Replacing all 'function' roles with 'assistant' works. And the ai SDK version [email protected] now has support for system messages: https://github.com/vercel/ai/blob/main/packages/core/core/prompt/message.ts#L12
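That workaround can be sketched as a small migration helper (hypothetical name, illustrative only):

```typescript
type ChatMsg = { role: string; content: string };

// Rewrite legacy 'function' roles to 'assistant' so the converter's
// switch statement recognizes every message in the history.
function normalizeRoles(messages: ChatMsg[]): ChatMsg[] {
  return messages.map((m) =>
    m.role === "function" ? { ...m, role: "assistant" } : m,
  );
}
```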

@athrael-soju
Author

Honestly, with all the bugs combined, you're better off using the vercel-ai-src example.

@jeremyphilemon
Contributor

Thanks everyone for helping debug the issue, and I appreciate your patience! As you all suspected, the messages property didn't follow the spec, so the migration from render to streamUI wasn't trivial and caused errors.

#337 follows the messages spec and should fix the error! I'll also have an upgrade guide up in the docs soon to provide more clarity about this change and prevent future confusion.
