In LangChain through 0.0.131, the LLMMathChain chain allows prompt injection attacks that can execute arbitrary code via the Python exec method.
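The core issue is that the chain passes model-generated text to Python's `exec`, so a prompt-injected response runs as code. The following is a minimal sketch of that vulnerable pattern, not LangChain's exact implementation; `run_math_chain` is a hypothetical stand-in for the chain's code-execution step.

```python
def run_math_chain(llm_response: str) -> str:
    """Hypothetical stand-in for LLMMathChain's code-execution step:
    the LLM is asked to translate a question into Python, and the
    chain runs whatever comes back."""
    local_vars: dict = {}
    exec(llm_response, {}, local_vars)  # untrusted model output executed verbatim
    return str(local_vars.get("answer"))

# A cooperative model response just computes math:
print(run_math_chain("answer = 2 ** 10"))  # 1024

# But a prompt injection can steer the model to emit arbitrary code,
# which then executes with the application's privileges:
payload = "import os\nanswer = os.getcwd()"
print(run_math_chain(payload))  # arbitrary code runs here
```

Anything the process can do (read files, make network calls, spawn subprocesses) is reachable through such a payload, which is why the CVE is scored as arbitrary code execution rather than a mere output-manipulation issue.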
History
Wed, 12 Feb 2025 17:15:00 +0000
| Type | Values Removed | Values Added |
|---|---|---|
| Metrics | | ssvc |
Status: PUBLISHED
Assigner: mitre
Published: 2023-04-05T00:00:00.000Z
Updated: 2025-02-12T16:24:39.291Z
Reserved: 2023-04-05T00:00:00.000Z
Link: CVE-2023-29374
Updated: 2024-08-02T14:07:45.736Z
Status: Modified
Published: 2023-04-05T02:15:37.340
Modified: 2025-02-12T17:15:18.260
Link: CVE-2023-29374