AI meets Little Bobby Tables
“The attack technique developed by Pillar researchers, which they call ‘Rules File Backdoor,’ weaponizes rules files by injecting them with instructions that are invisible to a human user but readable by the AI agent.”
If xkcd taught me anything, it's to sanitize my inputs.
Quote citation: Laura French, “How AI coding assistants could be compromised via rules file,” SC Media, March 18, 2025, https://www.scworld.com/news/how-ai-coding-assistants-could-be-compromised-via-rules-file
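The quoted piece doesn't spell out the hiding trick, but the Pillar research it summarizes reportedly leans on invisible Unicode characters (zero-width joiners, bidirectional controls, Unicode tag characters) that a human code review pane renders as nothing at all. In the spirit of sanitizing inputs, here is a minimal Python sketch that scans a rules file for those code points. The character set, the find_invisible helper, and the .cursorrules example path are my illustrative assumptions, not Pillar's tooling.

```python
# Minimal sketch: flag invisible Unicode code points in a rules file.
# Assumptions (not from the article): the suspicious-character set below
# and the .cursorrules example path are illustrative choices.
import sys
import unicodedata

# Code points commonly abused to hide text from human reviewers.
SUSPICIOUS = {
    0x200B,  # ZERO WIDTH SPACE
    0x200C,  # ZERO WIDTH NON-JOINER
    0x200D,  # ZERO WIDTH JOINER
    0x2060,  # WORD JOINER
    0xFEFF,  # ZERO WIDTH NO-BREAK SPACE (BOM)
}
SUSPICIOUS |= set(range(0x202A, 0x202F))    # bidi embedding/override controls
SUSPICIOUS |= set(range(0x2066, 0x206A))    # bidi isolate controls
SUSPICIOUS |= set(range(0xE0000, 0xE0080))  # Unicode tag characters

def find_invisible(text: str) -> list[tuple[int, str]]:
    """Return (offset, character name) for each suspicious code point."""
    hits = []
    for i, ch in enumerate(text):
        if ord(ch) in SUSPICIOUS:
            # Fall back to the raw code point for unnamed characters.
            name = unicodedata.name(ch, f"U+{ord(ch):04X}")
            hits.append((i, name))
    return hits

if __name__ == "__main__":
    path = sys.argv[1]  # e.g. .cursorrules or another rules file
    hits = find_invisible(open(path, encoding="utf-8").read())
    for offset, name in hits:
        print(f"offset {offset}: {name}")
    sys.exit(1 if hits else 0)
```

Wired into a pre-commit hook or CI, a check like this surfaces exactly the characters a diff view tends to swallow, which is the whole trick the backdoor depends on.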