[Doc] Add API reference for offline inference (vllm-project#4710)
DarkLight1337 authored and tjohnson31415 committed May 16, 2024
1 parent f4270f2 commit c75ceb4
Showing 4 changed files with 17 additions and 5 deletions.
8 changes: 7 additions & 1 deletion docs/source/index.rst
Original file line number Diff line number Diff line change
@@ -67,6 +67,13 @@ Documentation
    getting_started/quickstart
    getting_started/examples/examples_index
 
+.. toctree::
+   :maxdepth: 1
+   :caption: Offline Inference
+
+   offline_inference/llm
+   offline_inference/sampling_params
+
 .. toctree::
    :maxdepth: 1
    :caption: Serving
@@ -101,7 +108,6 @@ Documentation
    :maxdepth: 2
    :caption: Developer Documentation
 
-   dev/sampling_params
    dev/engine/engine_index
    dev/kernel/paged_attention
    dev/dockerfile-ubi/dockerfile-ubi
6 changes: 6 additions & 0 deletions docs/source/offline_inference/llm.rst
@@ -0,0 +1,6 @@
+LLM Class
+==========
+
+.. autoclass:: vllm.LLM
+    :members:
+    :show-inheritance:
4 changes: 2 additions & 2 deletions docs/source/{dev → offline_inference}/sampling_params.rst
@@ -1,5 +1,5 @@
-Sampling Params
-===============
+Sampling Parameters
+===================
 
 .. autoclass:: vllm.SamplingParams
     :members:
4 changes: 2 additions & 2 deletions docs/source/serving/openai_compatible_server.md
@@ -48,7 +48,7 @@ completion = client.chat.completions.create(
```

### Extra Parameters for Chat API
-The following [sampling parameters (click through to see documentation)](../dev/sampling_params.rst) are supported.
+The following [sampling parameters (click through to see documentation)](../offline_inference/sampling_params.rst) are supported.

```{literalinclude} ../../../vllm/entrypoints/openai/protocol.py
:language: python
@@ -65,7 +65,7 @@ The following extra parameters are supported:
```

### Extra Parameters for Completions API
-The following [sampling parameters (click through to see documentation)](../dev/sampling_params.rst) are supported.
+The following [sampling parameters (click through to see documentation)](../offline_inference/sampling_params.rst) are supported.

```{literalinclude} ../../../vllm/entrypoints/openai/protocol.py
:language: python
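For the extra-parameter passthrough described above, a request can carry vLLM-specific fields alongside the standard OpenAI ones. As a dependency-free illustration (not part of the commit), the sketch below builds such a chat-completion request body using only the standard library, with no running server required; `top_k` and `repetition_penalty` are examples of vLLM's extra sampling parameters, and the model name is a placeholder:

```python
import json

# OpenAI-style chat request; the fields after "max_tokens" are vLLM
# extras that the server accepts beyond the standard OpenAI schema.
payload = {
    "model": "facebook/opt-125m",  # placeholder model name
    "messages": [{"role": "user", "content": "Hello!"}],
    "temperature": 0.7,
    "max_tokens": 64,
    "top_k": 20,                 # vLLM extra sampling parameter
    "repetition_penalty": 1.05,  # vLLM extra sampling parameter
}

# Serialized body that would be POSTed to /v1/chat/completions.
body = json.dumps(payload)
```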