My time studying and engaging in philosophy has had me considering more than my fair share of different perspectives on the same underlying phenomena, but even I scratched my head upon hearing Nvidia CEO Jen-Hsun Huang's latest take on AI: that AI is generated just like electricity, which implies it's a resource or commodity.
To be more specific, in a recent earnings call (via Motley Fool) the Huangster stated: "Just like we generate electricity, we're now going to be generating AI. And if the number of customers is large, just as the number of consumers of electricity is large, these generators are going to be running 24/7."
It sounds like marketing hype, but if so then Huang is quite committed to the bit. He continued: "Today, many AI services are running 24/7, just like an AI factory. And so, we're going to see this new type of system come online, and I call it an AI factory because that's really as close to what it is."
Okay, but maybe it's just hyperbole, you know, something tha-
"When we say generative AI, we're essentially saying that these data centers are really AI factories. They're generating something."
Okay, so it's a serious claim, then. Let's give the devil his due.
The knee-jerk reaction is to say there's no way AI is similar to electricity as a kind of commodity. This reaction's probably best encapsulated by the absurd image of going to top up your AI card at the store and then popping it into your AI meter at home to get your AI back online.
But that's not really fair, is it? It's a bit of a caricature of Huang's position which, to be honest, does make at least some sense. "Resource" can mean money, materials, capital, and even people (you know, "human resources"). "Commodity" can mean anything that's traded, bought, or sold.
In this sense, pretty much anything can be a resource or commodity, and the scandalous huckster in me wants to agree: "There's a price for everything", I say, smiling to reveal a glimmer from my golden tooth.
If money's just an abstract middleman for bartering actual resources, the proverbial Huangian might claim that these resources themselves are surely no less ephemeral and up for debate. In a hypothetical world—call it "Twigland"—where people have an insatiable craving for twigs, particularly twigglesome trees would be resource number one. It all comes down to what we value.
And Huang could have his finger on the pulse regarding what we're coming to value, this being AI. Huang says that we're seeing the beginning of a "widespread" shift that sees us "moving from coding that runs on CPUs to machine learning that creates neural networks that runs on GPUs", and that "there are no companies who are not going to do machine learning".
In which case, AI compute could become so ubiquitous that it'd be to us what twigs are to Twiglanders. If CPU processing really does become second-rung to machine learning for neural network processing, such a conclusion would make sense.
But even so, would it be a resource like electricity?
If we say yes, then where do we draw the line? It seems there'd be no meaningful distinction between a "resource" like electricity and anything else we value. Surely resources have to have some productive weight behind them, you know, sit far enough back in the chain that they're used as resources for multiple other things, like how electricity powers my computer as well as the traffic lights.
But then maybe Huang would say that's exactly the point, that AI is going to be used for so many vitally necessary things: research, medical treatments, home computing, and so on.
Perhaps the only way to win the argument would be to cut off the power to the AI data centres. We'll see how AI stacks up against electricity then, Mr. Huang.