vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, the vLLM backend used with the /v1/chat/completions OpenAPI endpoint fails to validate unexpected or malformed input in the "pattern" and "type" fields when the tools functionality is invoked. These inputs are not validated before being compiled or parsed, causing a crash of the inference worker with a single request. The worker will remain down until it is restarted. Version 0.9.0 fixes the issue.
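The root cause described above is that user-controlled JSON-Schema fields from a tool definition reach a regex compiler and a schema parser without any sanity checks. The sketch below is illustrative only and is not vLLM's actual code: the helper name, the allowed-type set, and the example payload are assumptions; it shows the kind of pre-validation that would reject a malformed "pattern" or "type" field before it can crash a worker.

```python
import re

# Minimal sketch (assumed, not vLLM's implementation) of validating the
# JSON-Schema fragment supplied in a tool definition before it is used.
ALLOWED_TYPES = {"object", "array", "string", "number", "integer", "boolean", "null"}

def validate_tool_parameter_schema(schema: dict) -> list[str]:
    """Return a list of problems found in a tool's JSON-Schema fragment."""
    problems = []

    type_value = schema.get("type")
    if type_value is not None and type_value not in ALLOWED_TYPES:
        problems.append(f"unsupported 'type': {type_value!r}")

    pattern = schema.get("pattern")
    if pattern is not None:
        if not isinstance(pattern, str):
            problems.append("'pattern' must be a string")
        else:
            try:
                re.compile(pattern)  # reject patterns that fail to compile
            except re.error as exc:
                problems.append(f"invalid 'pattern': {exc}")

    # Recurse into nested properties so malformed fields deeper in the
    # schema are also caught before the request reaches the worker.
    for sub in schema.get("properties", {}).values():
        if isinstance(sub, dict):
            problems.extend(validate_tool_parameter_schema(sub))
    return problems


if __name__ == "__main__":
    # Hypothetical malformed tool-parameter schema for illustration only.
    malformed = {
        "type": "object",
        "properties": {
            "name": {"type": "strng", "pattern": "[unclosed"},  # both fields malformed
        },
    }
    for problem in validate_tool_parameter_schema(malformed):
        print("rejected:", problem)  # a validating front end would return HTTP 400 here
```

Until an upgrade to 0.9.0 is possible, equivalent request screening could in principle also be performed in a proxy in front of the engine, so that a single malformed request cannot take a worker offline.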
History
Sat, 31 May 2025 03:15:00 +0000
Type | Values Removed | Values Added |
---|---|---|
References | | |
Metrics | threat_severity | threat_severity |
Fri, 30 May 2025 19:15:00 +0000
Type | Values Removed | Values Added |
---|---|---|
Metrics | | ssvc |
Fri, 30 May 2025 18:45:00 +0000
Type | Values Removed | Values Added |
---|---|---|
Description | | vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, the vLLM backend used with the /v1/chat/completions OpenAPI endpoint fails to validate unexpected or malformed input in the "pattern" and "type" fields when the tools functionality is invoked. These inputs are not validated before being compiled or parsed, causing a crash of the inference worker with a single request. The worker will remain down until it is restarted. Version 0.9.0 fixes the issue. |
Title | | vLLM Tool Schema allows DoS via Malformed pattern and type Fields |
Weaknesses | | CWE-20 |
References | | |
Metrics | | cvssV3_1 |

Status: PUBLISHED
Assigner: GitHub_M
Published: 2025-05-30T18:38:45.505Z
Updated: 2025-05-30T18:56:56.406Z
Reserved: 2025-05-28T18:49:07.582Z
Link: CVE-2025-48944

Updated: 2025-05-30T18:56:52.880Z

Status: Awaiting Analysis
Published: 2025-05-30T19:15:30.433
Modified: 2025-06-02T17:32:17.397
Link: CVE-2025-48944
