The Accelerating Complexity of AI Models

“The number of parameters in a neural network model is actually increasing on the order of 10x year on year. This is an exponential that I’ve never seen before and it’s something that is incredibly fast and outpaces basically every technology transition I’ve ever seen… So 10x year on year means if we’re at 10 billion parameters today, we’ll be at 100 billion tomorrow,” he said. “Ten billion today maxes out what we can do on hardware. What does that mean?” – Naveen Rao, Intel
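The compounding implied by the quote is easy to make concrete. The sketch below simply applies the 10x year-on-year growth rate Rao describes to his 10-billion-parameter starting point; the function name and the 10x rate as a constant are illustrative assumptions, not a fitted trend.

```python
def projected_parameters(start_params: float, years: int, growth: float = 10.0) -> float:
    """Parameter count after `years` of compounding `growth`x annual increases.

    Illustrative only: the 10x default comes from the quote, not from data.
    """
    return start_params * growth ** years


if __name__ == "__main__":
    start = 10e9  # 10 billion parameters, the quote's starting point
    for year in range(4):
        print(f"year {year}: {projected_parameters(start, year):.0e} parameters")
```

At this rate, the jump from 10 billion to 100 billion parameters takes a single year, and a fourth year already implies 10 trillion, which is the scale of mismatch with hardware that the quote is pointing at.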