Scaling will never get us to AGI
A new result casts serious doubt on the viability of scaling
Neural networks (at least in the configurations that have been dominant over the last three decades) have trouble generalizing beyond the multidimensional space that surrounds their training examples. I have said this so often in so many ways, going back to 1998, that today I am going to let someone else, Chomba Bupe, a sharp-thinking tech entrepreneur/computer vision researcher from Zambia, take a shot. (Asterisk: current models aren't literally lookup tables; they can generalize to some degree, but not enough.)