
Consumption with OpenAI #109

Closed
ClicShopping opened this issue Apr 27, 2024 · 6 comments
Labels: documentation (Improvements or additions to documentation), enhancement (New feature or request)

Comments

@ClicShopping

ClicShopping commented Apr 27, 2024

Hello,
Does LLPhant expose the token consumption returned in the OpenAI response? I could not find it.

For example:
```php
  /**
   * Save the OpenAI token usage together with the question and answer.
   *
   * @throws \Exception
   */
  private static function saveOpenAiUsage($response, string $question): mixed
  {
    try {
      // Use object access consistently (the original mixed array and object access).
      $result = $response->choices[0]->message->content;

      $usage = [
        'promptTokens' => $response->usage->promptTokens,
        'completionTokens' => $response->usage->completionTokens,
        'totalTokens' => $response->usage->totalTokens,
      ];

      static::saveData($question, $result, $usage);

      return $result;
    } catch (\RuntimeException $e) {
      // Note: a `return false;` after the throw was unreachable and has been removed.
      throw new \Exception('An error occurred, please check the console error');
    }
  }
```
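A common reason to store token counts is to estimate cost. A minimal standalone sketch, assuming hypothetical per-1K-token rates (real prices vary by model and change over time, so treat the numbers below as placeholders):

```php
<?php
// Estimate the USD cost of one call from a usage array like the one
// built in saveOpenAiUsage() above. Rates are placeholders, not real prices.
function estimateCostUsd(array $usage, float $promptRate, float $completionRate): float
{
    $promptCost = ($usage['promptTokens'] / 1000) * $promptRate;
    $completionCost = ($usage['completionTokens'] / 1000) * $completionRate;
    return $promptCost + $completionCost;
}

$usage = ['promptTokens' => 500, 'completionTokens' => 250, 'totalTokens' => 750];
echo estimateCostUsd($usage, 0.01, 0.03); // 0.0125
```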

@samuelgjekic
Contributor

Hey! Here is an example of how to get the usage values in LLPhant:

    public function generateText(string $prompt): string
    {
        $answer = $this->generate($prompt);
        $this->handleTools($answer);

        $usageArray = $answer->usage->toArray(); // Get the usage values and store in array
    
        // Access the values 
        $promptToken = $usageArray['prompt_tokens']; 
        $totalTokens = $usageArray['total_tokens'];
        $completionTokens = $usageArray['completion_tokens'];

        /* Do whatever with the values */ 


        return $answer->choices[0]->message->content ?? '';
    }

This is the generateText function in OpenAiChat.php

You can access the usage values from the response.

Hope this helped!
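Building on the usage array shown above (keys `prompt_tokens`, `completion_tokens`, `total_tokens`), one can accumulate consumption across several calls. A minimal sketch; the `TokenUsageTracker` class is illustrative, not part of LLPhant's API:

```php
<?php
// Illustrative usage tracker: sums the usage arrays returned per response.
// The class name and methods are hypothetical, not LLPhant code.
class TokenUsageTracker
{
    private int $promptTokens = 0;
    private int $completionTokens = 0;
    private int $totalTokens = 0;

    /** Add one response's usage array (keys as returned by OpenAI). */
    public function add(array $usage): void
    {
        $this->promptTokens += $usage['prompt_tokens'] ?? 0;
        $this->completionTokens += $usage['completion_tokens'] ?? 0;
        $this->totalTokens += $usage['total_tokens'] ?? 0;
    }

    /** Return the accumulated totals in the same key format. */
    public function summary(): array
    {
        return [
            'prompt_tokens' => $this->promptTokens,
            'completion_tokens' => $this->completionTokens,
            'total_tokens' => $this->totalTokens,
        ];
    }
}

$tracker = new TokenUsageTracker();
$tracker->add(['prompt_tokens' => 10, 'completion_tokens' => 5, 'total_tokens' => 15]);
$tracker->add(['prompt_tokens' => 20, 'completion_tokens' => 8, 'total_tokens' => 28]);
print_r($tracker->summary()); // total_tokens => 43
```

Such a tracker could be fed from `$answer->usage->toArray()` after each `generateText()` call.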

@ClicShopping
Author

ClicShopping commented Apr 27, 2024

Thank you.
Will this feature be included in the next release?

@samuelgjekic
Contributor

> Will this feature be included in the next release?

No problem, hope it helped! I would love to contribute to the library. If I get permission from the owner to do it, I will add it :)

@MaximeThoonsen
Contributor

Hey @samuelgjekic,

I would love to have a contribution from you on this!

@MaximeThoonsen added the documentation (Improvements or additions to documentation) and enhancement (New feature or request) labels on Apr 28, 2024
@ezimuel
Collaborator

ezimuel commented May 7, 2024

@ClicShopping, @samuelgjekic, @MaximeThoonsen I opened PR #116 to store the last response from OpenAI. This way we can get the token usage along with the other response objects.
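With the raw response stored, the usage block can be read straight from the OpenAI chat-completion JSON, whose documented schema includes a top-level `usage` object. A standalone sketch; how you obtain `$rawBody` from LLPhant depends on PR #116 and is not shown here:

```php
<?php
// Standalone sketch: extract the usage object from a raw OpenAI
// chat-completion response body. The JSON keys follow OpenAI's schema.
function extractUsage(string $rawBody): array
{
    $response = json_decode($rawBody, true, 512, JSON_THROW_ON_ERROR);
    return $response['usage'] ?? [];
}

$rawBody = '{"choices":[{"message":{"content":"Hi"}}],'
         . '"usage":{"prompt_tokens":9,"completion_tokens":12,"total_tokens":21}}';

print_r(extractUsage($rawBody)); // prompt_tokens => 9, total_tokens => 21
```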

@MaximeThoonsen
Contributor

It's done 🚀
