KumoRFM: A Foundation Model for In-Context Learning on Relational Data


Foundation Models have completely taken over unstructured data domains like natural language and images, delivering significant advances in performance across tasks with little to no task-specific training. Yet structured and semi-structured relational data, which represent some of the most valuable information assets, largely miss out on this AI wave. Here, we present KumoRFM, a Relational Foundation Model (RFM) capable of making accurate predictions over relational databases across a wide range of predictive tasks without requiring data or task-specific training.

To apply AI to relational data, practitioners still rely on conventional machine learning, building per-task and per-dataset models that require significant development and tuning time. KumoRFM removes this bottleneck: it is orders of magnitude faster than conventional approaches that rely on supervised training, and it provides a zero-code solution to query any entity and any target at any future point in time.
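To make the idea of "in-context learning on relational data" concrete, here is a minimal toy sketch: historical rows are turned into labeled context examples at a past cutoff time, and a new entity is scored against that context at prediction time. The tables, feature/label definitions, and the nearest-neighbour "model" are all illustrative assumptions for exposition; KumoRFM itself is a pretrained foundation model, not a k-NN.

```python
# Illustrative sketch of in-context prediction over relational data.
# Every name here (tables, features, the k-NN scorer) is an assumption
# made for exposition; it is NOT the KumoRFM API or algorithm.

# Toy relational fact table: (user_id, day, amount).
transactions = [
    (1, 1, 30.0), (1, 2, 25.0), (1, 3, 40.0),
    (2, 1, 5.0),
    (3, 2, 60.0), (3, 3, 55.0),
    (4, 1, 4.0), (4, 2, 3.0),
]

def features(user_id, as_of_day):
    """Aggregate a user's history up to a cutoff day into a feature vector."""
    amounts = [a for (u, d, a) in transactions if u == user_id and d <= as_of_day]
    return (len(amounts), sum(amounts))

def label(user_id, after_day):
    """Target: did the user transact again after the cutoff day?"""
    return any(u == user_id and d > after_day for (u, d, _) in transactions)

# In-context examples: features as of day 2, labels from what happened later.
context = [(features(u, 2), label(u, 2)) for u in (1, 2, 3, 4)]

def predict(query_features):
    """Score a query entity against the in-context examples (1-NN)."""
    def dist(example):
        f, _ = example
        return sum((a - b) ** 2 for a, b in zip(f, query_features))
    return min(context, key=dist)[1]
```

For example, a brand-new user with no history gets the feature vector `(0, 0.0)` and inherits the label of the most similar context entity; no model was trained for this particular table, target, or time frame, which is the workflow the paragraph above describes.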


