This repository has been archived by the owner on Mar 6, 2024. It is now read-only.

Prompt limit issue fixed #501

Open
wants to merge 6 commits into base: main
Changes from 4 commits
28 changes: 18 additions & 10 deletions README.md
@@ -49,14 +49,23 @@ configure the required environment variables, such as `GITHUB_TOKEN` and
`OPENAI_API_KEY`. For more information on usage, examples, contributing, and
FAQs, you can refer to the sections below.

- [Overview](#overview)
- [Professional Version of CodeRabbit](#professional-version-of-coderabbit)
- [Reviewer Features](#reviewer-features)
- [Install instructions](#install-instructions)
- [Conversation with CodeRabbit](#conversation-with-coderabbit)
- [Examples](#examples)
- [Contribute](#contribute)
- [FAQs](#faqs)
- [AI-based PR reviewer and summarizer](#ai-based-pr-reviewer-and-summarizer)
- [Overview](#overview)
- [Reviewer Features:](#reviewer-features)
- [CodeRabbit Pro](#coderabbit-pro)
- [Install instructions](#install-instructions)
- [Environment variables](#environment-variables)
- [Models: `gpt-4` and `gpt-3.5-turbo`](#models-gpt-4-and-gpt-35-turbo)
- [Prompts \& Configuration](#prompts--configuration)
- [Conversation with CodeRabbit](#conversation-with-coderabbit)
- [Ignoring PRs](#ignoring-prs)
- [Examples](#examples)
- [Contribute](#contribute)
- [Developing](#developing)
- [FAQs](#faqs)
- [Review pull requests from forks](#review-pull-requests-from-forks)
- [Inspect the messages between OpenAI server](#inspect-the-messages-between-openai-server)
- [Disclaimer](#disclaimer)


## Install instructions
@@ -208,12 +217,11 @@ Install the dependencies
```bash
$ npm install
```

Build the typescript and package it for distribution

```bash
$ npm run build && npm run package
```


## FAQs

### Review pull requests from forks
47 changes: 47 additions & 0 deletions __tests__/tokenizer.test.ts
@@ -0,0 +1,47 @@
import {splitPrompt} from '../src/tokenizer' // Import the splitPrompt function under test

describe('splitPrompt function', () => {
  it('should split a prompt into smaller pieces', async () => {
    const maxTokens = 10 // Adjust this as needed
    const prompt = 'This is a test prompt for splitting into smaller pieces.'

    const result = await splitPrompt(maxTokens, prompt)

    // Expected pieces when each chunk must stay within the maxTokens limit
    const expectedOutput = [
      'This is a',
      'test',
      'prompt for',
      'splitting',
      'into',
      'smaller',
      'pieces.'
    ]

    expect(result).toEqual(expectedOutput)
  })

  it('should handle a prompt smaller than maxTokens', async () => {
    const maxTokens = 100 // A large value
    const prompt = 'A very short prompt.'

    const result = await splitPrompt(maxTokens, prompt)

    // The prompt already fits within maxTokens, so it is returned unchanged as a single string.
    const expectedOutput = 'A very short prompt.'

    expect(result).toEqual(expectedOutput)
  })

  it('should handle an empty prompt', async () => {
    const maxTokens = 10
    const prompt = ''

    const result = await splitPrompt(maxTokens, prompt)

    // An empty prompt should come back unchanged as an empty string.
    const expectedOutput: string[] | string = ''

    expect(result).toEqual(expectedOutput)
  })
})
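
The tests above import `splitPrompt` from `src/tokenizer`, which is not part of this diff. As context only, here is a minimal sketch that would satisfy these three test cases. It is an assumption inferred from the expected outputs — treating `maxTokens` as a per-piece character budget and splitting on spaces — not necessarily the PR's actual implementation:

```typescript
// Hypothetical sketch of splitPrompt, inferred only from the tests above.
// Assumption: maxTokens acts as a character budget per piece and the prompt
// is split on single spaces; the real src/tokenizer may differ.
export async function splitPrompt(
  maxTokens: number,
  prompt: string
): Promise<string | string[]> {
  // Prompts that already fit (including the empty string) come back unchanged.
  if (prompt.length <= maxTokens) {
    return prompt
  }

  const pieces: string[] = []
  let current = ''
  for (const word of prompt.split(' ')) {
    const candidate = current ? `${current} ${word}` : word
    if (candidate.length <= maxTokens) {
      // The word still fits in the current piece.
      current = candidate
    } else {
      // The current piece is full; start a new one with this word.
      if (current) pieces.push(current)
      current = word
    }
  }
  if (current) pieces.push(current)
  return pieces
}
```

With `maxTokens = 10`, this yields `['This is a', 'test', 'prompt for', 'splitting', 'into', 'smaller', 'pieces.']`, matching the first test; shorter and empty prompts are returned as plain strings, matching the other two.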