
[Doc] Add API reference for offline inference #4710

Merged 4 commits on May 14, 2024
Changes from 3 commits
8 changes: 7 additions & 1 deletion docs/source/index.rst
@@ -67,6 +67,13 @@ Documentation
    getting_started/quickstart
    getting_started/examples/examples_index
 
+.. toctree::
+   :maxdepth: 1
+   :caption: Offline Inference
+
+   offline_inference/llm
+   offline_inference/sampling_params
+
 .. toctree::
    :maxdepth: 1
    :caption: Serving
@@ -101,7 +108,6 @@ Documentation
    :maxdepth: 2
    :caption: Developer Documentation
 
-   dev/sampling_params
    dev/engine/engine_index
    dev/kernel/paged_attention
    dev/dockerfile/dockerfile
6 changes: 6 additions & 0 deletions docs/source/offline_inference/llm.rst
@@ -0,0 +1,6 @@
+LLM Class
+=========
+
+.. autoclass:: vllm.LLM
+   :members:
+   :show-inheritance:
docs/source/offline_inference/sampling_params.rst (renamed from docs/source/dev/sampling_params.rst)
@@ -1,5 +1,5 @@
-Sampling Params
-===============
+Sampling Parameters
+===================
 
 .. autoclass:: vllm.SamplingParams
    :members:
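As a rough intuition for two of the settings this class carries, here is a simplified stand-in for temperature scaling and top-p (nucleus) filtering over a toy distribution. This is not vLLM's implementation: `vllm.SamplingParams` only holds the settings, and the engine applies them during decoding.

```python
import math

def apply_sampling_params(logits, temperature=1.0, top_p=1.0):
    """Toy illustration of temperature + top-p filtering (not vLLM's code)."""
    if temperature == 0.0:
        # Greedy decoding: all probability mass on the argmax token.
        best = max(range(len(logits)), key=lambda i: logits[i])
        return [1.0 if i == best else 0.0 for i in range(len(logits))]

    # Temperature-scaled softmax (subtract the max for numerical stability).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Top-p: keep the smallest set of tokens whose cumulative mass >= top_p,
    # then renormalize over that set.
    order = sorted(range(len(probs)), key=lambda i: -probs[i])
    kept, cum = set(), 0.0
    for i in order:
        kept.add(i)
        cum += probs[i]
        if cum >= top_p:
            break
    mass = sum(probs[i] for i in kept)
    return [probs[i] / mass if i in kept else 0.0 for i in range(len(probs))]

# Low-probability tail tokens are zeroed out, the rest renormalized.
dist = apply_sampling_params([2.0, 1.0, 0.1], temperature=0.7, top_p=0.9)
```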
4 changes: 2 additions & 2 deletions docs/source/serving/openai_compatible_server.md
@@ -48,7 +48,7 @@ completion = client.chat.completions.create(
 ```
 
 ### Extra Parameters for Chat API
-The following [sampling parameters (click through to see documentation)](../dev/sampling_params.rst) are supported.
+The following [sampling parameters (click through to see documentation)](../offline_inference/sampling_params.rst) are supported.

```{literalinclude} ../../../vllm/entrypoints/openai/protocol.py
:language: python
@@ -65,7 +65,7 @@ The following extra parameters are supported:
```

 ### Extra Parameters for Completions API
-The following [sampling parameters (click through to see documentation)](../dev/sampling_params.rst) are supported.
+The following [sampling parameters (click through to see documentation)](../offline_inference/sampling_params.rst) are supported.

```{literalinclude} ../../../vllm/entrypoints/openai/protocol.py
:language: python
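To show how these extra sampling parameters travel over the wire, here is a sketch of a Chat Completions request body for a vLLM OpenAI-compatible server. `top_k` is one example of a vLLM-specific extra; the authoritative list lives in `vllm/entrypoints/openai/protocol.py`, included above. The model name is illustrative.

```python
import json

# Standard OpenAI fields sit next to vLLM-specific extras in one flat body.
payload = {
    "model": "facebook/opt-125m",  # illustrative model name
    "messages": [{"role": "user", "content": "Hello!"}],
    # Standard sampling parameters:
    "temperature": 0.7,
    "max_tokens": 64,
    # vLLM-specific extra sampling parameter (example):
    "top_k": 40,
}
body = json.dumps(payload)

# With the official OpenAI Python client, extras are passed via extra_body:
#   client.chat.completions.create(model=..., messages=...,
#                                  extra_body={"top_k": 40})
```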