Azure OpenAI Proxy

English | 简体中文

Azure OpenAI Proxy is a tool that transforms OpenAI API requests into Azure OpenAI API requests, allowing OpenAI-compatible applications to seamlessly use Azure OpenAI.

Prerequisites

An Azure OpenAI account is required to use Azure OpenAI Proxy.

Azure Deployment

Deploy to Azure

Remember to:

  • Select the region that matches your Azure OpenAI resource for best performance.
  • If deployment fails because the 'proxywebapp' name is already taken, change the resource prefix and redeploy.
  • The proxy is deployed on a B1 pricing tier Azure App Service plan, which can be changed in the Azure Portal after deployment.

Docker Deployment

To deploy using Docker, execute the following command:

```bash
docker run -d -p 3000:3000 scalaone/azure-openai-proxy
```
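
If port 3000 is already in use on the host, you can map a different host port to the container's port 3000. A minimal sketch (the container name and host port below are arbitrary examples):

```bash
# Example only: expose the proxy on host port 8080 instead of 3000.
docker run -d --name azure-openai-proxy -p 8080:3000 scalaone/azure-openai-proxy
```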

Local Execution and Testing

Follow these steps (an end-to-end command sketch follows the test script):

  1. Install Node.js 20.
  2. Clone the repository in a command-line window.
  3. Run npm install to install the dependencies.
  4. Run npm start to start the application.
  5. Use the script below for testing. Replace AZURE_RESOURCE_ID, AZURE_MODEL_DEPLOYMENT, and AZURE_API_KEY before running. The default value for AZURE_API_VERSION is 2024-02-01 and is optional.
Test script:

```bash
curl -X "POST" "http://localhost:3000/v1/chat/completions" \
  -H 'Authorization: AZURE_RESOURCE_ID:AZURE_MODEL_DEPLOYMENT:AZURE_API_KEY:AZURE_API_VERSION' \
  -H 'Content-Type: application/json; charset=utf-8' \
  -d $'{
  "messages": [
    { "role": "system", "content": "You are an AI assistant that helps people find information." },
    { "role": "user", "content": "hi." }
  ],
  "temperature": 1,
  "model": "gpt-3.5-turbo",
  "stream": false
}'
```
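
For reference, steps 2–5 can be run end to end roughly as follows. This is a sketch only: the repository URL is inferred from the project name, and the Azure values are placeholders to replace with your own.

```bash
# Repository URL inferred from the project name scalaone/azure-openai-proxy.
git clone https://github.com/scalaone/azure-openai-proxy.git
cd azure-openai-proxy
npm install
npm start   # proxy listens on port 3000; run the curl below in a second terminal

# Placeholder values - replace with your own Azure OpenAI settings.
AZURE_RESOURCE_ID="my-openai-resource"
AZURE_MODEL_DEPLOYMENT="my-gpt-35-deployment"
AZURE_API_KEY="<your-api-key>"
AZURE_API_VERSION="2024-02-01"   # optional; this is the default

curl -sS "http://localhost:3000/v1/chat/completions" \
  -H "Authorization: ${AZURE_RESOURCE_ID}:${AZURE_MODEL_DEPLOYMENT}:${AZURE_API_KEY}:${AZURE_API_VERSION}" \
  -H 'Content-Type: application/json' \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "hi"}]}'
```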

Tested Applications

The azure-openai-proxy has been tested and confirmed to work with the following applications:

| Application Name | Docker-compose File for E2E Test |
| --- | --- |
| chatgpt-lite | docker-compose.yml |
| chatgpt-minimal | docker-compose.yml |
| chatgpt-next-web | docker-compose.yml |
| chatbot-ui | docker-compose.yml |
| chatgpt-web | docker-compose.yml |

To test locally, follow these steps (a command sketch follows the list):

  1. Clone the repository in a command-line window.
  2. Set the OPENAI_API_KEY environment variable to AZURE_RESOURCE_ID:AZURE_MODEL_DEPLOYMENT:AZURE_API_KEY, or update the OPENAI_API_KEY value directly in the docker-compose.yml file.
  3. Navigate to the directory containing the docker-compose.yml file for the application you want to test.
  4. Execute the build command: docker-compose build.
  5. Start the service: docker-compose up -d.
  6. Access the application locally using the port defined in the docker-compose.yml file. For example, visit http://localhost:3000.
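
A minimal sketch of these commands, assuming the application's docker-compose.yml reads OPENAI_API_KEY from the environment as described in step 2:

```bash
# Run from the directory containing the application's docker-compose.yml.
export OPENAI_API_KEY="AZURE_RESOURCE_ID:AZURE_MODEL_DEPLOYMENT:AZURE_API_KEY"  # replace the placeholders
docker-compose build
docker-compose up -d
# Then open the app on the port defined in docker-compose.yml, e.g. http://localhost:3000
```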

FAQs

Q: What are `AZURE_RESOURCE_ID`, `AZURE_MODEL_DEPLOYMENT`, and `AZURE_API_KEY`?

A: These can be found in the Azure management portal. See the image below for reference:

![resource-and-model](./docs/images/resource-and-model.jpg)

Q: How can I use the gpt-4 and gpt-4-32k models?

A: To use the gpt-4 and gpt-4-32k models, provide a key in the following format:

`AZURE_RESOURCE_ID:gpt-3.5-turbo|AZURE_MODEL_DEPLOYMENT,gpt-4|AZURE_MODEL_DEPLOYMENT,gpt-4-32k|AZURE_MODEL_DEPLOYMENT:AZURE_API_KEY:AZURE_API_VERSION`
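
For example, a request that targets a gpt-4 deployment through the proxy might look like the following sketch; the deployment names after each `|` are placeholders for your own Azure deployment names.

```bash
# Placeholder deployment names (gpt35-deployment, gpt4-deployment) - replace with your own.
curl -sS "http://localhost:3000/v1/chat/completions" \
  -H 'Authorization: AZURE_RESOURCE_ID:gpt-3.5-turbo|gpt35-deployment,gpt-4|gpt4-deployment:AZURE_API_KEY' \
  -H 'Content-Type: application/json' \
  -d '{"model": "gpt-4", "messages": [{"role": "user", "content": "hi"}]}'
```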

Contributing

We welcome all PR submissions.
