4 bold AI predictions for 2025
Enterprises can prepare for AI to become faster, cheaper and more capable in 2025, and take the lead in developing applications.
The memory and compute bottleneck of transformers, the main deep learning architecture used in LLMs, has given rise to a field of alternative models with linear complexity. Other promising models include liquid neural networks (LNNs), which use novel mathematical formulations to do much more with far fewer artificial neurons and compute cycles. If progress in the field continues, many simpler LLM applications could be offloaded to these models and run on edge devices or local servers, where enterprises can use bespoke data without sending it to third parties.
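To make the bottleneck concrete, here is a minimal back-of-the-envelope sketch (not from the article) contrasting the quadratic cost of transformer self-attention with a linear-time recurrent update of the kind used by linear-complexity alternatives. The shapes and cost formulas are illustrative assumptions, not any specific model's implementation.

    def attention_flops(seq_len: int, d_model: int) -> int:
        # Self-attention forms a seq_len x seq_len score matrix, so its cost
        # grows quadratically with sequence length.
        return 2 * seq_len * seq_len * d_model

    def recurrent_flops(seq_len: int, d_state: int) -> int:
        # A linear recurrence updates a fixed-size state once per token, so its
        # cost grows only linearly with sequence length.
        return 2 * seq_len * d_state * d_state

    for n in (1_024, 8_192, 65_536):
        print(n, attention_flops(n, 512), recurrent_flops(n, 512))

At short sequence lengths the two are comparable, but as context grows the attention term dominates, which is why linear-complexity models are attractive for edge devices and local servers with limited memory and compute.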
Or read this on VentureBeat