Commit f42c45e

restoring default in main vllm code for detokenize
maleksan85 committed Jan 18, 2025
1 parent 87d256a commit f42c45e
Showing 1 changed file with 1 addition and 1 deletion.

vllm/sampling_params.py (2 changes: 1 addition & 1 deletion)

@@ -195,7 +195,7 @@ class SamplingParams(
     # NOTE: This parameter is only exposed at the engine level for now.
     # It is not exposed in the OpenAI API server, as the OpenAI API does
     # not support returning only a list of token IDs.
-    detokenize: bool = False
+    detokenize: bool = True
     skip_special_tokens: bool = True
     spaces_between_special_tokens: bool = True
     # Optional[List[LogitsProcessor]] type. We use Any here because
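
Below is a minimal sketch of what this default means in practice, assuming vLLM's offline LLM API; the model name and prompt are placeholders chosen only for illustration and are not part of this commit. With detokenize=True restored as the default, generated outputs carry decoded text, while passing detokenize=False at the engine level skips detokenization so that only token IDs are returned.

# Minimal usage sketch (assumption: vLLM's offline LLM entry point).
# The model name and prompt are placeholders for illustration only.
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")  # placeholder model

# Default after this commit: detokenize=True, so outputs include decoded text.
params_default = SamplingParams(max_tokens=16)

# Engine-level opt-out: skip detokenization and keep only token IDs.
params_ids_only = SamplingParams(max_tokens=16, detokenize=False)

for params in (params_default, params_ids_only):
    outputs = llm.generate(["Hello, my name is"], params)
    completion = outputs[0].outputs[0]
    # Assumption: with detokenize=False, completion.text stays empty while
    # completion.token_ids is always populated.
    print(completion.token_ids, repr(completion.text))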
