
EM-LLM: Human-Inspired Episodic Memory for Infinite Context LLMs


Code is available in the em-llm/EM-LLM-model repository on GitHub.

In this work, we introduce EM-LLM, an architecture that integrates key aspects of human episodic memory and event cognition into LLMs with no fine-tuning, enabling them to handle practically infinite context lengths while maintaining computational efficiency. EM-LLM organises sequences of tokens into coherent episodic events in an online fashion, using a combination of Bayesian surprise and graph-theoretic boundary refinement. Experiments on the LongBench and ∞-Bench benchmarks demonstrate EM-LLM's superior performance, consistently outperforming the state-of-the-art (SOTA) retrieval model InfLLM across various baseline LLMs.
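To make the segmentation step concrete, here is a minimal sketch of surprise-based event segmentation, not the authors' implementation. It assumes access to per-token log-probabilities from the base LLM; the `window`, `gamma`, and `radius` hyperparameters are illustrative, and the refinement shown is a simplified local search over key-similarity cut cost, standing in for the paper's graph-theoretic boundary refinement.

```python
import numpy as np

def surprise_boundaries(token_logprobs, window=32, gamma=1.0):
    # Bayesian surprise for token t is its negative log-likelihood
    # under the base LLM: -log p(x_t | x_<t). A boundary is placed
    # where surprise spikes above a moving threshold computed over
    # the preceding window (mean + gamma * std).
    surprise = -np.asarray(token_logprobs, dtype=float)
    boundaries = [0]
    for t in range(window, len(surprise)):
        ctx = surprise[t - window:t]
        if surprise[t] > ctx.mean() + gamma * ctx.std():
            boundaries.append(t)
    return boundaries

def refine_boundaries(boundaries, sim, radius=4):
    # `sim` is a (T, T) similarity matrix between the attention key
    # vectors of tokens; a cut is "good" when little similarity mass
    # crosses it. Each initial boundary is nudged to the nearby
    # position with the cheapest cut -- a simplified stand-in for the
    # paper's modularity-style graph refinement.
    T = sim.shape[0]
    refined = [boundaries[0]]
    for b in boundaries[1:]:
        candidates = range(max(1, b - radius), min(T - 1, b + radius) + 1)
        def cut_cost(c):
            left = slice(max(0, c - radius), c)
            right = slice(c, min(T, c + radius))
            return sim[left, right].sum()
        refined.append(min(candidates, key=cut_cost))
    return sorted(set(refined))

# Example: segment a toy stream of per-token log-probabilities.
rng = np.random.default_rng(0)
logprobs = rng.normal(-3.0, 1.0, size=512)
keys = rng.normal(size=(512, 64))        # stand-in attention key vectors
events = refine_boundaries(surprise_boundaries(logprobs), keys @ keys.T)
```

Because both steps only need quantities the model already computes during a forward pass (token log-probabilities and attention keys), segmentation can run online with no fine-tuning, which is the point of the architecture.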


