Get the latest tech news

How AI Coding Assistants Could Be Compromised Via Rules File


Slashdot reader spatwei shared this report from the cybersecurity site SC World: : AI coding assistants such as GitHub Copilot and Cursor could be manipulated to generate code containing backdoors, vulnerabilities and other security issues via distribution of malicious rule configuration files, Pil...

Slashdot reader spatwei shared this report from the cybersecurity site SC World: : AI coding assistants such as GitHub Copilot and Cursor could be manipulated to generate code containing backdoors, vulnerabilities and other security issues via distribution of malicious rule configuration files, Pillar Security researchers reported Tuesday. Hidden Unicode characters like bidirectional text markers and zero-width joiners can be used to obfuscate malicious instructions in the user interface and in GitHub pull requests, the researchers noted. Once the poisoned rules file is imported to GitHub Copilot or Cursor, the AI agent will read and follow the attacker's instructions while assisting the victim's future coding projects.

Get the Android app

Or read this on Slashdot

Read more on:

Photo of Rules

Rules

Photo of coding assistants

coding assistants

Related news:

News photo

FCC to get Republican majority and plans to “delete” as many rules as possible | Geoffrey Starks to leave FCC as new chair pushes "Delete, Delete, Delete" plan.

News photo

US appeals court rules AI generated art cannot be copyrighted

News photo

OpenAI asks White House for relief from state AI rules