VulnWatch
Medium · osv · GHSA-fprp-p869-w6q2

LangChain vulnerable to code injection

Published Apr 5, 2023 · CVSS 4.0
In LangChain through 0.0.131, the `LLMMathChain` chain allows prompt injection attacks that can execute arbitrary code via Python's `exec()` function.

Affected AI Products

langchain