KumoRFM: A Foundation Model for In-Context Learning on Relational Data
Foundation Models have completely taken over unstructured data domains like natural language and images, delivering significant advances in performance across tasks with little to no task-specific training. Yet structured and semi-structured relational data, which represent some of the most valuable information assets, largely miss out on this AI wave. Here, we present KumoRFM, a Relational Foundation Model (RFM) capable of making accurate predictions over relational databases across a wide range of predictive tasks without requiring data or task-specific training.
To use AI on relational data, practitioners still rely on conventional machine learning approaches, building per-task and per-dataset models that require significant development and tuning time. By contrast, KumoRFM is orders of magnitude faster than conventional approaches that rely on supervised training, and it provides a zero-code solution for querying any entity and any target at any future point in time.
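To make the contrast concrete, here is a minimal sketch. The first half shows the conventional per-task workflow the text describes, using scikit-learn on synthetic, hand-flattened features; the second half shows, in comments only, the kind of single declarative predictive query the announcement refers to. The query syntax is a hypothetical illustration, not the verbatim KumoRFM interface, and the data and column names are invented for the example.

```python
# Sketch contrasting the two workflows described above.
# Assumptions: the relational data has already been flattened into a feature
# table by hand, and the predictive-query string below is illustrative only.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# --- Conventional approach: one hand-built model per task and per dataset ---
rng = np.random.default_rng(0)
X = rng.normal(size=(1_000, 8))                           # hand-engineered features per entity
y = (X[:, 0] + rng.normal(size=1_000) > 0).astype(int)    # e.g. "will the user churn in 30 days"

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)  # task-specific training and tuning
print("per-task model accuracy:", model.score(X_test, y_test))

# --- Foundation-model approach described in the text (hypothetical syntax) ---
# A single declarative query over the raw relational tables, answered in-context
# with no feature engineering and no task-specific training, e.g.:
#
#   PREDICT COUNT(orders.*, 0, 30) = 0
#   FOR EACH users.user_id
#
# The query above is a made-up illustration of a predictive query over a
# hypothetical users/orders schema, not KumoRFM's actual API.
```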