This API uses OpenAI's tiktoken package (openai/tiktoken, a fast BPE tokeniser for use with OpenAI's models) to encode and decode text.
To get started, you'll need to have Docker installed on your system. Once you have Docker installed, you can build and run the API using the following commands:
# Build the Docker image
docker build -t tiktoken-fastapi .
# Run the Docker container
docker run -p 8000:8000 tiktoken-fastapi
This will start the API on port 8000.
The API provides the following endpoints:
Encode

This endpoint takes a JSON payload containing a text field and returns a JSON response containing the encoded tokens.

Example request:
{
"text": "Hello, world!"
}
Example response:
{
"encoded_text": [319, 322, 5, 248, 2, 463]
}
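As a sketch, the encode request above can be issued from Python using only the standard library. The field names follow the example payloads in this section; the /encode path is an assumption about how the service is routed, so adjust it if your deployment differs:

```python
import json
import urllib.request

# Build the request body shown above; "text" is the documented field name.
payload = json.dumps({"text": "Hello, world!"}).encode("utf-8")

# The /encode path is an assumption; adjust if the service routes differently.
request = urllib.request.Request(
    "http://localhost:8000/encode",
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

def fetch_tokens(req: urllib.request.Request) -> list:
    """Send the request and return the token list (requires the container to be running)."""
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["encoded_text"]
```

Calling fetch_tokens(request) against a running container should return a list of token IDs like the example response above.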
Decode

This endpoint takes a JSON payload containing an encoded_text field and returns a JSON response containing the decoded text.

Example request:
{
"encoded_text": [319, 322, 5, 248, 2, 463]
}
Example response:
{
"decoded_text": "Hello, world!"
}
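A matching decode call completes the round trip. As before, this is a standard-library sketch: the field names come from the payloads above, while the /decode path is an assumption about the service's routing:

```python
import json
import urllib.request

# Token IDs as returned by the encode endpoint (example values from above).
payload = json.dumps({"encoded_text": [319, 322, 5, 248, 2, 463]}).encode("utf-8")

# The /decode path is an assumption; adjust if the service routes differently.
request = urllib.request.Request(
    "http://localhost:8000/decode",
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

def fetch_text(req: urllib.request.Request) -> str:
    """Send the request and return the decoded string (requires the container to be running)."""
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["decoded_text"]
```

Encoding a string and then decoding the returned token list should reproduce the original text.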