Pretraining Language Models via Neural Cellular Automata
What if the path to smarter language models doesn't require more text, but synthetic data from abstract dynamical systems? Large language models are hungry: each generation needs far more training data to keep improving, and the stock of high-quality natural-language text is projected to run out by 2028.