feat: add proactiveRetry (#13)
aikoven committed Jul 14, 2023
1 parent c4c5e6e commit 8094a33
Showing 3 changed files with 170 additions and 0 deletions.
59 changes: 59 additions & 0 deletions README.md
@@ -15,6 +15,7 @@ Abortable async function primitives and combinators.
- [`forever`](#forever)
- [`spawn`](#spawn)
- [`retry`](#retry)
- [`proactiveRetry`](#proactiveretry)
- [`execute`](#execute)
- [`abortable`](#abortable)
- [`run`](#run)
@@ -450,6 +451,64 @@ Retry a function with exponential backoff.

Rethrow error from this callback to prevent further retries.

### `proactiveRetry`

```ts
function proactiveRetry<T>(
  signal: AbortSignal,
  fn: (signal: AbortSignal, attempt: number) => Promise<T>,
  options?: ProactiveRetryOptions,
): Promise<T>;

type ProactiveRetryOptions = {
  baseMs?: number;
  maxAttempts?: number;
  onError?: (error: unknown, attempt: number) => void;
};
```

Proactively retry a function with exponential backoff.

Also known as hedging.

The function will be called multiple times in parallel until it succeeds, in
which case all the other calls will be aborted.

- `fn`

A function that will be called multiple times in parallel until it succeeds.
It receives:

- `signal`

`AbortSignal` that is aborted when the signal passed to `proactiveRetry` is
aborted, or when one of the attempts succeeds.

- `attempt`

Attempt number starting with 0.

- `ProactiveRetryOptions.baseMs`

Base delay between attempts in milliseconds.

Defaults to 1000.

Example: if `baseMs` is 100, then retries will be attempted in 100ms, 200ms,
400ms etc (not counting jitter).

- `ProactiveRetryOptions.maxAttempts`

Maximum total number of attempts.

Defaults to `Infinity`.

- `ProactiveRetryOptions.onError`

Called after each failed attempt.

Rethrow error from this callback to prevent further retries.

### `execute`

1 change: 1 addition & 0 deletions src/index.ts
@@ -9,3 +9,4 @@ export * from './race';
export * from './retry';
export * from './spawn';
export * from './run';
export * from './proactiveRetry';
110 changes: 110 additions & 0 deletions src/proactiveRetry.ts
@@ -0,0 +1,110 @@
import {isAbortError, catchAbortError} from './AbortError';
import {delay} from './delay';
import {execute} from './execute';

export type ProactiveRetryOptions = {
  /**
   * Base delay between attempts in milliseconds.
   *
   * Defaults to 1000.
   *
   * Example: if `baseMs` is 100, then retries will be attempted in 100ms,
   * 200ms, 400ms etc (not counting jitter).
   */
  baseMs?: number;
  /**
   * Maximum total number of attempts.
   *
   * Defaults to `Infinity`.
   */
  maxAttempts?: number;
  /**
   * Called after each failed attempt.
   *
   * Rethrow error from this callback to prevent further retries.
   */
  onError?: (error: unknown, attempt: number) => void;
};

/**
* Proactively retry a function with exponential backoff.
*
* Also known as hedging.
*
* The function will be called multiple times in parallel until it succeeds, in
* which case all the other calls will be aborted.
*/
export function proactiveRetry<T>(
  signal: AbortSignal,
  fn: (signal: AbortSignal, attempt: number) => Promise<T>,
  options: ProactiveRetryOptions = {},
): Promise<T> {
  const {baseMs = 1000, onError, maxAttempts = Infinity} = options;

  return execute(signal, (resolve, reject) => {
    const innerAbortController = new AbortController();
    let attemptsExhausted = false;

    const promises = new Map</* attempt */ number, Promise<T>>();

    function handleFulfilled(value: T) {
      innerAbortController.abort();
      promises.clear();

      resolve(value);
    }

    function handleRejected(err: unknown, attempt: number) {
      promises.delete(attempt);

      if (attemptsExhausted && promises.size === 0) {
        reject(err);

        return;
      }

      if (isAbortError(err)) {
        return;
      }

      if (onError) {
        try {
          onError(err, attempt);
        } catch (callbackErr) {
          innerAbortController.abort();
          promises.clear();

          reject(callbackErr);
        }
      }
    }

    async function makeAttempts(signal: AbortSignal) {
      for (let attempt = 0; ; attempt++) {
        const promise = fn(signal, attempt);

        promises.set(attempt, promise);

        promise.then(handleFulfilled, err => handleRejected(err, attempt));

        if (attempt + 1 >= maxAttempts) {
          break;
        }

        // https://aws.amazon.com/ru/blogs/architecture/exponential-backoff-and-jitter/
        const backoff = Math.pow(2, attempt) * baseMs;
        const delayMs = Math.round((backoff * (1 + Math.random())) / 2);

        await delay(signal, delayMs);
      }

      attemptsExhausted = true;
    }

    makeAttempts(innerAbortController.signal).catch(catchAbortError);

    return () => {
      innerAbortController.abort();
    };
  });
}
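
The delay computed in `makeAttempts` follows the "equal jitter" strategy from the linked AWS article: half of the exponential backoff is kept fixed and half is randomized, so each delay falls uniformly in `[backoff / 2, backoff]`. A small self-contained sketch of the same formula (the `delayMs` helper here is ours, for illustration only, with the random value made an explicit parameter):

```ts
// backoff = 2^attempt * baseMs; delay is in [backoff / 2, backoff].
function delayMs(attempt: number, baseMs: number, random: number): number {
  const backoff = Math.pow(2, attempt) * baseMs;
  return Math.round((backoff * (1 + random)) / 2);
}

delayMs(0, 100, 0); // → 50   (lower bound for attempt 0)
delayMs(0, 100, 1); // → 100  (upper bound for attempt 0)
delayMs(2, 100, 0.5); // → 300  (midpoint for attempt 2: backoff is 400)
```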
