Unknown
osv
ยท
PYSEC-2023-18
PYSEC-2023-18
Published Apr 5, 2023
In LangChain through 0.0.131, the LLMMathChain chain allows prompt injection attacks that can execute arbitrary code via the Python exec method.
Affected AI Products
langchain