Ethically trained AI startup Pleias releases new small reasoning models optimized for RAG with built-in citations


Pleias emphasizes the models’ suitability for integration into search-augmented assistants, educational tools, and user support systems.

The company has announced the release of two open-source small-scale reasoning models designed specifically for retrieval-augmented generation (RAG), citation synthesis, and structured multilingual output. The Pleias-RAG models are described as "proto-agentic": they can autonomously assess whether a query is understandable, determine whether it is trivial or complex, and decide whether to answer, reformulate, or refuse based on the adequacy of the retrieved sources. Looking ahead, Pleias plans to expand the models' capabilities through longer context handling, tighter search integration, and personality tuning for more consistent identity presentation.
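The "proto-agentic" behavior described above, judging the query first, then choosing to answer, reformulate, or refuse based on source adequacy, can be sketched as a simple decision routine. This is an illustrative assumption of how such a flow might be wired, not Pleias's actual implementation; all names and thresholds are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Source:
    """A retrieved passage with an assumed relevance score in [0, 1]."""
    doc_id: str
    text: str
    relevance: float

def decide(query: str, sources: list[Source]) -> str:
    # 1. Is the query understandable at all? (hypothetical heuristic)
    if not query.strip() or len(query.split()) < 2:
        return "reformulate"  # ask the user for a clearer question
    # 2. Do the retrieved sources adequately ground an answer?
    adequate = [s for s in sources if s.relevance >= 0.5]
    if not adequate:
        return "refuse"  # no grounding: decline rather than hallucinate
    # 3. Otherwise, answer from the adequate sources (with citations)
    return "answer"

sources = [Source("doc1", "Pleias-RAG release notes", 0.8)]
print(decide("What did Pleias release?", sources))  # -> answer
print(decide("?", sources))                         # -> reformulate
print(decide("What did Pleias release?", []))       # -> refuse
```

In a real system each step would itself be model-driven (the release notes say the model makes these judgments autonomously); the sketch only shows the control flow the article describes.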

Read the full article on VentureBeat.

Read more on: citations, RAG, Pleias

Related news:

Transforming Your PDFs for RAG with Open Source Using Docling, Milvus, and Feast

Beyond Quacking: Deep Integration of Language Models and RAG into DuckDB

Beyond RAG: How Articul8's supply chain models achieve 92% accuracy where general AI fails