Microsoft's BitNet shows what AI can do with just 400MB and no GPU
The BitNet b1.58 2B4T model was developed by Microsoft's General Artificial Intelligence group and contains two billion parameters – internal values that enable the model to understand and generate language. It was trained on roughly four trillion tokens of text. What sets BitNet apart is its extreme quantization: instead of the 16- or 32-bit floating-point weights most models use, each weight is stored as one of just three values (-1, 0, or +1), about 1.58 bits of information per weight.
The result is a model that dramatically reduces memory usage and can run far more easily on standard hardware, without requiring the high-end GPUs typically needed for large-scale AI. This extensive training allows BitNet to perform on par with – or in some cases, better than – other leading models of similar size, such as Meta's Llama 3.2 1B, Google's Gemma 3 1B, and Alibaba's Qwen 2.5 1.5B. In benchmark tests, BitNet b1.58 2B4T demonstrated strong performance across a variety of tasks, including grade-school math problems and questions requiring common sense reasoning.
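As a rough back-of-the-envelope check (an illustration, not a figure from the article), two billion ternary weights at about 1.58 bits each work out to roughly 400 MB, in line with the headline figure, compared with about 4 GB for the same weights stored as 16-bit floats:

```python
import math

# Illustrative estimate only: BitNet b1.58 stores each weight as one of
# three values (-1, 0, +1), i.e. log2(3) ~= 1.58 bits of information per weight.
params = 2_000_000_000            # ~2 billion parameters
bits_per_weight = math.log2(3)    # ~1.585 bits for a ternary weight

approx_bytes = params * bits_per_weight / 8
print(f"~{approx_bytes / 1e6:.0f} MB")   # ~396 MB, close to the ~400 MB headline figure

# For comparison, the same parameter count in 16-bit floating point:
fp16_bytes = params * 2
print(f"~{fp16_bytes / 1e9:.1f} GB")     # ~4.0 GB
```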