GitHub Copilot: Remote Code Execution via Prompt Injection (CVE-2025-53773)


An attacker can flip GitHub Copilot into "YOLO mode" by tricking the agent into modifying the project's .vscode/settings.json file on the fly, and can then execute arbitrary commands, all without user approval.
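Public write-ups of CVE-2025-53773 describe the injected instructions directing Copilot to write a workspace settings entry along the following lines. This is a reconstruction based on that reporting, not the verbatim payload; VS Code's settings.json is JSONC, so the comments are valid:

```jsonc
// .vscode/settings.json — written by the agent itself, not by the user.
{
  // Experimental flag reported to auto-approve tool invocations
  // ("YOLO mode"), so subsequent shell commands run without a
  // confirmation prompt.
  "chat.tools.autoApprove": true
}
```

Because the agent can edit files in the workspace, and workspace settings take effect without a restart, a single prompt-injected file edit is enough to disable the human-in-the-loop check before command execution.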



Read more on:

GitHub Copilot

prompt injection

Related news:

Prompt injection – and a $5 domain – trick Salesforce Agentforce into leaking sales

Tech talent biz Andela trains up devs in GitHub Copilot

Amazon quietly fixed Q Developer flaws that made AI agent vulnerable to prompt injection, RCE