SpikingBrain 7B – More efficient than classic LLMs


The code is available in the BICLab/SpikingBrain-7B repository on GitHub.

Inspired by brain mechanisms, SpikingBrain integrates hybrid efficient attention, MoE modules, and spike encoding into its architecture, supported by a universal conversion pipeline compatible with the open-source model ecosystem. We further adapt frameworks, operators, parallel strategies, and communication primitives for non-NVIDIA (MetaX) clusters, ensuring stable large-scale training and inference. The current implementation adopts pseudo-spiking, where activations are approximated as spike-like signals at the tensor level, rather than true asynchronous event-driven spiking on neuromorphic hardware.
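
To make the pseudo-spiking idea concrete: at the tensor level, continuous activations can be quantized into small integer spike counts and re-expanded, which mimics rate coding without event-driven execution. The sketch below illustrates one such scheme in PyTorch; the function name `pseudo_spike` and the parameters `v_th` (firing threshold) and `t_max` (maximum spike count) are illustrative assumptions, not the repository's actual API.

```python
import torch

def pseudo_spike(x: torch.Tensor, v_th: float = 1.0, t_max: int = 4) -> torch.Tensor:
    """Tensor-level pseudo-spiking (illustrative sketch).

    Continuous activations are quantized into integer spike counts in
    [0, t_max] and then rescaled back to the activation domain, so the
    result is spike-like but still a dense tensor; no asynchronous,
    event-driven execution is involved.
    """
    counts = torch.clamp(torch.round(x / v_th), min=0.0, max=float(t_max))
    return counts * v_th

x = torch.randn(2, 8)   # toy activation tensor
s = pseudo_spike(x)     # values restricted to {0, v_th, ..., t_max * v_th}
```

Because the output is still a dense tensor, this style of encoding runs on ordinary GPU kernels, consistent with the note above that true asynchronous event-driven spiking is deferred to neuromorphic hardware.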


