Current AI scaling laws are showing diminishing returns, forcing AI labs to change course
AI labs traveling the road to super-intelligent systems are realizing they might have to take a detour. "AI scaling laws," the methods and expectations that labs have used to increase the capabilities of their models, are now showing signs of diminishing returns.
When researchers give machine learning systems abundant resources during pretraining – the phase in which a model identifies and stores patterns from large datasets – models have tended to get better at predicting the next word or phrase. To be clear, AI model developers will likely continue chasing larger compute clusters and bigger datasets for pretraining, and there's probably more improvement to eke out of those methods. At the same time, labs are increasingly leaning on compute spent at inference time. "OpenAI's new 'o' series pushes [chain-of-thought] further, and requires far more computing resources, and therefore energy, to do so," said famed AI researcher Yoshua Bengio in an op-ed on Tuesday.
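The "diminishing returns" the article describes are often modeled as a power law in parameters and training data: loss keeps falling as resources grow, but each doubling buys less. A minimal sketch using the Chinchilla-style fit from Hoffmann et al. (2022) – the constants are that paper's published estimates, included purely for illustration, not a description of any lab's internal curves:

```python
def pretraining_loss(n_params: float, n_tokens: float) -> float:
    """Chinchilla-style scaling law: loss = E + A/N^alpha + B/D^beta.

    Constants are the published Chinchilla fit (illustrative only).
    """
    E, A, B = 1.69, 406.4, 410.7      # irreducible loss and fit coefficients
    alpha, beta = 0.34, 0.28          # power-law exponents for params / tokens
    return E + A / n_params**alpha + B / n_tokens**beta

# Each 10x jump in parameters improves loss by less than the previous one,
# because the loss asymptotically approaches the floor E.
for n in (1e9, 1e10, 1e11, 1e12):
    print(f"{n:.0e} params -> loss {pretraining_loss(n, 1e12):.3f}")
```

Under this model, scaling never stops helping, but the marginal gain per dollar of compute shrinks – which is consistent with labs continuing to pretrain bigger models while also shifting spend to inference-time techniques like chain-of-thought.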