[Doc] Update max_tokens to max_completion_tokens in all docs (#6248)
### What this PR does / why we need it?
Fixes the following deprecation warning:
```
DeprecationWarning: max_tokens is deprecated in favor of the max_completion_tokens field.
```
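
This warning follows the standard pattern for renaming a keyword argument: the old name still works but emits a `DeprecationWarning`, and its value is forwarded to the new name. A minimal sketch of that pattern (the `sampling_params` function below is hypothetical, not vLLM's actual implementation):

```python
import warnings

def sampling_params(*, max_tokens=None, max_completion_tokens=None):
    # Hypothetical sketch of a keyword rename with a deprecation shim;
    # not vLLM code. The old keyword warns and is mapped to the new one.
    if max_tokens is not None:
        warnings.warn(
            "max_tokens is deprecated in favor of the "
            "max_completion_tokens field.",
            DeprecationWarning,
            stacklevel=2,
        )
        if max_completion_tokens is None:
            max_completion_tokens = max_tokens
    return {"max_completion_tokens": max_completion_tokens}

# Old spelling still works but warns; new spelling is silent.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    old = sampling_params(max_tokens=512)
new = sampling_params(max_completion_tokens=512)
```

Updating the docs to the new keyword, as this PR does, keeps the examples warning-free once the old name is eventually removed.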
- vLLM version: v0.14.1
- vLLM main: d68209402d
Signed-off-by: shen-shanshan <467638484@qq.com>
```diff
@@ -142,7 +142,7 @@ llm = LLM(
 )
 
 sampling_params = SamplingParams(
-    max_tokens=512
+    max_completion_tokens=512
 )
 
 image_messages = [
@@ -238,7 +238,7 @@ llm = LLM(
 )
 
 sampling_params = SamplingParams(
-    max_tokens=512
+    max_completion_tokens=512
 )
 
 image_messages = [
```