Reduce AI Hallucinations With This Neat Software Trick
A buzzy process called retrieval augmented generation, or RAG, is taking hold in Silicon Valley and improving the outputs from large language models. How does it work?
“Any lawyer who's ever tried to use a natural language search within one of the research engines will see that there are often instances where semantic similarity leads you to completely irrelevant materials,” says Daniel Ho, a Stanford professor and senior fellow at the Institute for Human-Centered AI. The Stanford research into AI tools for lawyers broadens this definition a bit, examining whether the output is grounded in the provided data as well as whether it's factually correct: a high bar for legal professionals, who are often parsing complicated cases and navigating complex hierarchies of precedent. “So, I think RAG is going to become the staple that is used across basically every professional application, at least in the near to mid-term,” Ho says. Risk-averse executives seem excited about the prospect of using AI tools to better understand their proprietary data without having to upload sensitive info to a standard, public chatbot.
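The grounding idea behind RAG can be sketched in a few lines: retrieve the passages most relevant to the query, then instruct the model to answer only from those passages. The sketch below is purely illustrative; the word-overlap retriever is a crude stand-in for the embedding-based semantic search production systems use, and the document store, function names, and prompt wording are all assumptions, not any particular vendor's implementation.

```python
# Minimal RAG sketch. Word overlap stands in for semantic
# similarity; a real system would use an embedding model and
# a vector database, then send the prompt to an LLM.

def tokenize(text: str) -> set[str]:
    return set(text.lower().split())

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by word overlap with the query and keep the top k."""
    return sorted(
        documents,
        key=lambda doc: len(tokenize(query) & tokenize(doc)),
        reverse=True,
    )[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Ground the answer by placing retrieved passages in the prompt."""
    context = "\n".join(f"- {d}" for d in retrieve(query, documents))
    return (
        "Answer using ONLY the context below. If the context does not "
        "contain the answer, say you do not know.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

# Toy document store (hypothetical contents, for illustration only).
docs = [
    "The 2021 policy caps filing fees at $400.",
    "Precedent from 2019 governs appellate review.",
    "Office hours are Monday through Friday.",
]
prompt = build_prompt("What does the 2021 policy cap filing fees at?", docs)
print(prompt)
```

The instruction to refuse when the context lacks an answer is what targets hallucination: instead of relying on whatever the model memorized in training, the response is tied to retrieved text the user can inspect.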
Or read this on Wired