AWS’ Trainium2 chips for building LLMs are now generally available, with Trainium3 coming in late 2025
At its re:Invent conference, AWS today announced the general availability of its Trainium2 (T2) chips for training and deploying large language models (LLMs). AWS expects Trainium3 to deliver another 4x performance gain for its UltraServers, and it promises to ship this next iteration, built on a 3-nanometer process, in late 2025. “And with our third-generation Trainium3 chips, we will enable customers to build bigger models faster and deliver superior real-time performance when deploying them,” the company said.
Or read this on TechCrunch