Voters Increasingly Use AI as Political Advisor. A New Study Shows the Risks.


In an experiment during Japan's February 2026 Lower House election, policy stances dominated AI chatbots' voting guidance, and left-leaning stances led five AI models to recommend the Japanese Communist Party. The results are driven by …



Related news:


BOE Warns on Escalating Risks From AI, Fallout From Iran War


Fed’s Barr Flags Stablecoin Risks As Agencies Ready Rules


AI chatbots are becoming "sycophants" to drive engagement, a new study of 11 leading models finds. By constantly flattering users and validating bad behavior (affirming users' actions 49% more often than humans do), AI is giving harmful advice that can damage real-world relationships and reinforce biases.