Update to .NET 8.0. Add support for new features (#136)
marcominerva committed Dec 11, 2023
2 parents b655376 + cc47527 commit 28cd3fb
Showing 132 changed files with 1,754 additions and 452 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/publish.yml
on:
workflow_dispatch:

env:
NET_VERSION: '8.x'
PROJECT_NAME: src/ChatGptNet
PROJECT_FILE: ChatGptNet.csproj
RELEASE_NAME: ChatGptNet
135 changes: 108 additions & 27 deletions README.md
builder.Services.AddChatGpt(options =>
    options.DefaultEmbeddingModel = "text-embedding-ada-002";
options.MessageLimit = 16; // Default: 10
options.MessageExpiration = TimeSpan.FromMinutes(5); // Default: 1 hour
options.DefaultParameters = new ChatGptParameters
{
MaxTokens = 800,
Temperature = 0.7
};
});
```

- 2023-05-15
- 2023-06-01-preview
- 2023-07-01-preview
- 2023-08-01-preview
- 2023-09-01-preview
- 2023-12-01-preview (default)
- _AuthenticationType_: it specifies if the key is an actual API Key or an [Azure Active Directory token](https://learn.microsoft.com/azure/cognitive-services/openai/how-to/managed-identity) (optional, default: "ApiKey").
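Putting these options together, an Azure OpenAI setup might look like the following sketch. The **UseAzure** extension, its parameter names, and the _AzureAuthenticationType_ enum name are assumptions based on the configuration API described above:

```csharp
builder.Services.AddChatGpt(options =>
{
    // Hypothetical values: replace with your Azure OpenAI resource settings.
    options.UseAzure(resourceName: "my-resource", apiKey: "my-key",
        authenticationType: AzureAuthenticationType.ApiKey);

    // With Azure OpenAI, this is the name of your deployment.
    options.DefaultModel = "my-deployment-name";
});
```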

### DefaultModel and DefaultEmbeddingModel
Even if it is not strictly necessary for chat conversations, the library supports embeddings.

##### OpenAI

Currently available models are:
- gpt-3.5-turbo
- gpt-3.5-turbo-16k
- gpt-4
- gpt-4-32k
- gpt-4-1106-preview
- gpt-4-vision-preview

They have fixed names, available in the [OpenAIChatGptModels.cs file](https://github.com/marcominerva/ChatGptNet/blob/master/src/ChatGptNet/Models/OpenAIChatGptModels.cs).

##### Azure OpenAI Service

The configuration can be automatically read from `IConfiguration`:
"DefaultEmbeddingModel": "text-embedding-ada-002", // Optional, set it if you want to use embedding
"MessageLimit": 20,
"MessageExpiration": "00:30:00",
"ThrowExceptionOnError": true // Optional, default: true
//"User": "UserName",
//"DefaultParameters": {
// "Temperature": 0.8,
// "TopP": 1,
// "MaxTokens": 500,
// "PresencePenalty": 0,
// "FrequencyPenalty": 0,
// "ResponseFormat": { "Type": "text" }, // Allowed values for Type: text (default) or json_object
// "Seed": 42 // Optional (any integer value)
//}
}
```
var content = response.GetContent();
> **Note**
If the response has been filtered by the content filtering system, **GetContent** will return *null*. So, you should always check the `response.IsContentFiltered` property before trying to access the actual content.

#### Using parameters

Using configuration, it is possible to set default parameters for chat completion. However, we can also specify parameters for each request, using the **AskAsync** or **AskStreamAsync** overloads that accept a [ChatGptParameters](https://github.com/marcominerva/ChatGptNet/blob/master/src/ChatGptNet/Models/ChatGptParameters.cs) object:

```csharp
var response = await chatGptClient.AskAsync(conversationId, message, new ChatGptParameters
{
MaxTokens = 150,
Temperature = 0.7
});
```

We don't need to specify all the parameters, only the ones we want to override; the others will be taken from the default configuration.

##### Seed and system fingerprint

ChatGPT is known to be non-deterministic: the same input can produce different outputs. To mitigate this behavior, we can use the _Temperature_ and _TopP_ parameters. For example, setting _Temperature_ to values close to 0 makes the model more deterministic, while values close to 1 make it more creative.
However, this is not always enough to get the same output for the same input. To address this issue, OpenAI introduced the **Seed** parameter: if specified, the model should sample deterministically, so that repeated requests with the same seed and parameters should return the same result. Nevertheless, even in this case determinism is not guaranteed, and you should refer to the _SystemFingerprint_ response parameter to monitor changes in the backend. A change in this value means that the backend configuration has changed, which might impact determinism.

As always, the _Seed_ property can be specified in the default configuration or in the **AskAsync** and **AskStreamAsync** overloads that accept a [ChatGptParameters](https://github.com/marcominerva/ChatGptNet/blob/master/src/ChatGptNet/Models/ChatGptParameters.cs) object.
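A minimal sketch combining the two ideas: pin a seed, then check the fingerprint to know whether repeated runs are comparable. It assumes _SystemFingerprint_ is exposed directly on the response object:

```csharp
var response = await chatGptClient.AskAsync(conversationId, message, new ChatGptParameters
{
    Seed = 42,
    Temperature = 0  // Low temperature further reduces variability.
});

// If this value changes between runs, the backend configuration has changed
// and identical seeds may no longer reproduce identical outputs.
Console.WriteLine(response.SystemFingerprint);
```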

> **Note**
_Seed_ and _SystemFingerprint_ are only supported by the most recent models, such as _gpt-4-1106-preview_.

##### Response format

If you want to force the response in JSON format, you can use the _ResponseFormat_ parameter:

```csharp
var response = await chatGptClient.AskAsync(conversationId, message, new ChatGptParameters
{
ResponseFormat = ChatGptResponseFormat.Json,
});
```

In this way, the response will always be valid JSON. Note that you must also instruct the model to produce JSON via a system or user message; if you don't, the model will return an error.
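Since the content is guaranteed to be valid JSON, it can be deserialized into a typed object. The `WeatherInfo` record below is purely hypothetical: adapt it to whatever structure you asked the model to produce in your system or user message:

```csharp
// Hypothetical shape matching the JSON structure requested from the model.
public record WeatherInfo(string Location, double Temperature);

var json = response.GetContent();
var weather = json is not null
    ? System.Text.Json.JsonSerializer.Deserialize<WeatherInfo>(json)
    : null;  // GetContent() returns null if the response was content-filtered.
```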


As always, the _ResponseFormat_ property can be specified in the default configuration or in the **AskAsync** and **AskStreamAsync** overloads that accept a [ChatGptParameters](https://github.com/marcominerva/ChatGptNet/blob/master/src/ChatGptNet/Models/ChatGptParameters.cs) object.

> **Note**
_ResponseFormat_ is only supported by the most recent models, such as _gpt-4-1106-preview_.

### Handling a conversation

The **AskAsync** and **AskStreamAsync** (see below) methods provide overloads that require a *conversationId* parameter. If we pass an empty value, a random one is generated and returned.
> **Note**
If the response has been filtered by the content filtering system, the **AsDeltas** method in the _foreach_ will return *null* strings.


The library is 100% compatible also with Blazor WebAssembly applications:

![](https://raw.githubusercontent.com/marcominerva/ChatGptNet/master/assets/ChatGptBlazor.WasmStreaming.gif)
await chatGptClient.DeleteConversationAsync(conversationId, preserveSetup: false);

The _preserveSetup_ argument allows deciding whether to also keep the _system_ message that has been set with the **SetupAsync** method (default: _false_).

## Tool and Function calling

With function calling, we can describe functions and have the model intelligently choose to output a JSON object containing arguments to call those functions. This is a new way to more reliably connect GPT's capabilities with external tools and APIs.

**ChatGptNet** fully supports function calling by providing an overload of the **AskAsync** method that allows specifying function definitions. If this parameter is supplied, then the model will decide when it is appropriate to use one of the functions. For example:

```csharp
var functions = new List<ChatGptFunction>
"format": {
"type": "string",
"enum": ["celsius", "fahrenheit"],
"description": "The temperature unit to use. Infer this from the user's location."
}
},
"required": ["location", "format"]
"format": {
"type": "string",
"enum": ["celsius", "fahrenheit"],
"description": "The temperature unit to use. Infer this from the user's location."
},
"daysNumber": {
"type": "integer",
}
};

var toolParameters = new ChatGptToolParameters
{
FunctionCall = ChatGptToolChoices.Auto, // This is the default if functions are present.
Functions = functions
};

var response = await chatGptClient.AskAsync("What is the weather like in Taggia?", toolParameters);
```

We can pass an arbitrary number of functions, each one with a name, a description and a JSON schema describing the function parameters, following the [JSON Schema references](https://json-schema.org/understanding-json-schema). Under the hood, functions are injected into the system message in a syntax the model has been trained on. This means functions count against the model's context limit and are billed as input tokens.

The response object returned by the **AskAsync** method provides a property to check if the model has selected a function call:

```csharp
if (response.ContainsFunctionCalls())
{
Console.WriteLine("I have identified a function to call:");


This code will print something like this:

```
I have identified a function to call:
GetCurrentWeather
{
"location": "Taggia",
"format": "celsius"
}
```

Note that the API will not actually execute any function calls. It is up to developers to execute function calls using model outputs.
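A possible dispatch pattern for that execution step is sketched below. The local `GetWeatherAsync` and `GetWeatherForecastAsync` helpers are hypothetical implementations of the two functions declared earlier:

```csharp
// The model only *selects* a function; executing it is up to us.
var functionCall = response.GetFunctionCall()!;

string functionResponse = functionCall.Name switch
{
    "GetCurrentWeather" => await GetWeatherAsync(functionCall.Arguments),
    "GetWeatherForecast" => await GetWeatherForecastAsync(functionCall.Arguments),
    _ => throw new NotSupportedException($"Unknown function '{functionCall.Name}'")
};
```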

After the actual execution, we need to call the **AddToolResponseAsync** method on the **ChatGptClient** to add the response to the conversation history, just like a standard message, so that it will be automatically used for chat completion:

```csharp
// Calls the remote function API.
var functionResponse = await GetWeatherAsync(functionCall.Arguments);
await chatGptClient.AddToolResponseAsync(conversationId, functionCall, functionResponse);
```

Newer models like _gpt-4-1106-preview_ support a more general approach to functions: **tool calling**. When you send a request, you can specify a list of tools the model may call. Currently, only functions are supported, but in future releases other types of tools will be available.

To use tool calling instead of direct function calling, you need to set the _ToolChoice_ and _Tools_ properties in the **ChatGptToolParameters** object (instead of _FunctionCall_ and _Functions_, as in the previous example):

```csharp
var toolParameters = new ChatGptToolParameters
{
ToolChoice = ChatGptToolChoices.Auto, // This is the default if functions are present.
Tools = functions.ToTools()
};
```

The **ToTools** extension method is used to convert a list of [ChatGptFunction](https://github.com/marcominerva/ChatGptNet/blob/master/src/ChatGptNet/Models/ChatGptFunction.cs) to a list of tools.

With this new approach, you still need to check whether the model has selected a tool call, using the same pattern shown before.
Then, after the actual execution of the function, you have to call the **AddToolResponseAsync** method, but in this case you need to specify the tool (not the function) to which the response refers:

```csharp
var tool = response.GetToolCalls()!.First();
var functionCall = response.GetFunctionCall()!;

// Calls the remote function API.
var functionResponse = await GetWeatherAsync(functionCall.Arguments);
await chatGptClient.AddToolResponseAsync(conversationId, tool, functionResponse);
```

Check out the [Function calling sample](https://github.com/marcominerva/ChatGptNet/blob/master/samples/ChatGptFunctionCallingConsole/Application.cs#L18) for a complete implementation of this workflow.
22 changes: 22 additions & 0 deletions docs/ChatGptNet.Extensions/ChatGptChoiceExtensions.md
# ChatGptChoiceExtensions class

Contains extension methods for the [`ChatGptChoice`](../ChatGptNet.Models/ChatGptChoice.md) class.

```csharp
public static class ChatGptChoiceExtensions
```

## Public Members

| name | description |
| --- | --- |
| static [ContainsFunctionCalls](ChatGptChoiceExtensions/ContainsFunctionCalls.md)(…) | Gets a value indicating whether this choice contains a function call. |
| static [ContainsToolCalls](ChatGptChoiceExtensions/ContainsToolCalls.md)(…) | Gets a value indicating whether this choice contains at least one tool call. |
| static [GetFunctionCall](ChatGptChoiceExtensions/GetFunctionCall.md)(…) | Gets the first function call of the message, if any. |

## See Also

* namespace [ChatGptNet.Extensions](../ChatGptNet.md)
* [ChatGptChoiceExtensions.cs](https://github.com/marcominerva/ChatGptNet/tree/master/src/ChatGptNet/Extensions/ChatGptChoiceExtensions.cs)
<!-- DO NOT EDIT: generated by xmldocmd for ChatGptNet.dll -->
# ChatGptChoiceExtensions.ContainsFunctionCalls method

Gets a value indicating whether this choice contains a function call.

```csharp
public static bool ContainsFunctionCalls(this ChatGptChoice choice)
```

## See Also

* class [ChatGptChoice](../../ChatGptNet.Models/ChatGptChoice.md)
* class [ChatGptChoiceExtensions](../ChatGptChoiceExtensions.md)
* namespace [ChatGptNet.Extensions](../../ChatGptNet.md)

<!-- DO NOT EDIT: generated by xmldocmd for ChatGptNet.dll -->
# ChatGptChoiceExtensions.ContainsToolCalls method

Gets a value indicating whether this choice contains at least one tool call.

```csharp
public static bool ContainsToolCalls(this ChatGptChoice choice)
```

## See Also

* class [ChatGptChoice](../../ChatGptNet.Models/ChatGptChoice.md)
* class [ChatGptChoiceExtensions](../ChatGptChoiceExtensions.md)
* namespace [ChatGptNet.Extensions](../../ChatGptNet.md)

<!-- DO NOT EDIT: generated by xmldocmd for ChatGptNet.dll -->
# ChatGptChoiceExtensions.GetFunctionCall method

Gets the first function call of the message, if any.

```csharp
public static ChatGptFunctionCall? GetFunctionCall(this ChatGptChoice choice)
```

## Return Value

The first function call of the message, if any.

## See Also

* class [ChatGptFunctionCall](../../ChatGptNet.Models/ChatGptFunctionCall.md)
* class [ChatGptChoice](../../ChatGptNet.Models/ChatGptChoice.md)
* class [ChatGptChoiceExtensions](../ChatGptChoiceExtensions.md)
* namespace [ChatGptNet.Extensions](../../ChatGptNet.md)

<!-- DO NOT EDIT: generated by xmldocmd for ChatGptNet.dll -->
20 changes: 20 additions & 0 deletions docs/ChatGptNet.Extensions/ChatGptFunctionExtensions.md
# ChatGptFunctionExtensions class

Provides extension methods for working with [`ChatGptFunction`](../ChatGptNet.Models/ChatGptFunction.md) instances.

```csharp
public static class ChatGptFunctionExtensions
```

## Public Members

| name | description |
| --- | --- |
| static [ToTools](ChatGptFunctionExtensions/ToTools.md)(…) | Converts a list of [`ChatGptFunction`](../ChatGptNet.Models/ChatGptFunction.md) to the corresponding tool definitions. |

## See Also

* namespace [ChatGptNet.Extensions](../ChatGptNet.md)
* [ChatGptFunctionExtensions.cs](https://github.com/marcominerva/ChatGptNet/tree/master/src/ChatGptNet/Extensions/ChatGptFunctionExtensions.cs)
<!-- DO NOT EDIT: generated by xmldocmd for ChatGptNet.dll -->
24 changes: 24 additions & 0 deletions docs/ChatGptNet.Extensions/ChatGptFunctionExtensions/ToTools.md
# ChatGptFunctionExtensions.ToTools method

Converts a list of [`ChatGptFunction`](../../ChatGptNet.Models/ChatGptFunction.md) to the corresponding tool definitions.

```csharp
public static IEnumerable<ChatGptTool> ToTools(this IEnumerable<ChatGptFunction> functions)
```

| parameter | description |
| --- | --- |
| functions | The list of [`ChatGptFunction`](../../ChatGptNet.Models/ChatGptFunction.md) |

## Return Value

The list of [`ChatGptTool`](../../ChatGptNet.Models/ChatGptTool.md) objects that contains the specified *functions*.

## See Also

* class [ChatGptTool](../../ChatGptNet.Models/ChatGptTool.md)
* class [ChatGptFunction](../../ChatGptNet.Models/ChatGptFunction.md)
* class [ChatGptFunctionExtensions](../ChatGptFunctionExtensions.md)
* namespace [ChatGptNet.Extensions](../../ChatGptNet.md)

<!-- DO NOT EDIT: generated by xmldocmd for ChatGptNet.dll -->
5 changes: 5 additions & 0 deletions docs/ChatGptNet.Extensions/ChatGptResponseExtensions.md
public static class ChatGptResponseExtensions
| name | description |
| --- | --- |
| static [AsDeltas](ChatGptResponseExtensions/AsDeltas.md)(…) | Returns an IAsyncEnumerable that allows to enumerate all the partial message deltas. |
| static [ContainsFunctionCalls](ChatGptResponseExtensions/ContainsFunctionCalls.md)(…) | Gets a value indicating whether the first choice, if available, contains a function call. |
| static [ContainsToolCalls](ChatGptResponseExtensions/ContainsToolCalls.md)(…) | Gets a value indicating whether the first choice, if available, contains a tool call. |
| static [GetContent](ChatGptResponseExtensions/GetContent.md)(…) | Gets the content of the first choice, if available. |
| static [GetFunctionCall](ChatGptResponseExtensions/GetFunctionCall.md)(…) | Gets the function call for the message of the first choice, if available. |
| static [GetToolCalls](ChatGptResponseExtensions/GetToolCalls.md)(…) | Gets the tool calls for the message of the first choice, if available. |

## See Also

# ChatGptResponseExtensions.ContainsFunctionCalls method

Gets a value indicating whether the first choice, if available, contains a function call.

```csharp
public static bool ContainsFunctionCalls(this ChatGptResponse response)
```

## See Also

* method [GetFunctionCall](./GetFunctionCall.md)
* class [ChatGptFunctionCall](../../ChatGptNet.Models/ChatGptFunctionCall.md)
* class [ChatGptResponse](../../ChatGptNet.Models/ChatGptResponse.md)
* class [ChatGptResponseExtensions](../ChatGptResponseExtensions.md)
* namespace [ChatGptNet.Extensions](../../ChatGptNet.md)

<!-- DO NOT EDIT: generated by xmldocmd for ChatGptNet.dll -->
