LLMs can be prompt-injected to give bad medical advice, including thalidomide to a pregnant woman


Read more on: LLMs, thalidomide, bad medical advice

Related news:

- AI in Focus in 2026, Traders Look Past LLMs
- Debug Mode for LLMs in vLLora
- llamafile: Distribute and Run LLMs with a Single File