In one of my favorite movies of all time, The Matrix, humans become the power source that keeps the machines alive.
Elon Musk must have watched that movie recently, because he just pitched a similar idea. Except he wants idle machines to power the future of intelligence, not the other way around.
On Tesla’s recent third-quarter earnings call, Musk floated this wild idea:
Actually, one of the things I thought, if we’ve got all these cars that maybe are bored, while they’re sort of, if they’re bored, we could actually have a giant distributed inference fleet and say, if they’re not actively driving, let’s just have a giant distributed inference fleet.
Translation: every idle Tesla could soon act as a node in a massive AI network. Tens of millions of parked cars, thinking together.
But how would Elon’s mobile supercomputer work?
That’s where things get really interesting…
A Fleet That Thinks
Estimates vary, but as of 2024, there were around 5 million Teslas on the road worldwide.
Elon Musk has much bigger plans, predicting the fleet might eventually total 100 million cars.
Here’s what he said during Tesla’s recent earnings call:
At some point, if you’ve got tens of millions of cars in the fleet, or maybe at some point 100 million cars in the fleet, and let’s say they had at that point, I don’t know, a kilowatt of inference capability, of high-performance inference capability, that’s 100 gigawatts of inference distributed with power and cooling taken, with cooling and power conversion taken care of. That seems like a pretty significant asset.
In other words, 100 million Teslas, each capable of about one kilowatt of high-performance inference.
That works out to roughly 100 gigawatts of compute power.
To put that in perspective, 100 gigawatts is close to the combined output of 100 nuclear reactors, or enough electricity to power 75 million U.S. homes.
A single hyperscale data center from Amazon Web Services or Google Cloud can draw 50 to 100 megawatts of power. You’d need around 1,000 of those to match Musk’s theoretical 100-gigawatt network.
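If you want to check the math yourself, here is a quick back-of-envelope sketch. The numbers are simply the figures quoted above (a 100 million-car fleet, one kilowatt per car, and a 100-megawatt hyperscale data center), not measured values:

```python
# Back-of-envelope math behind the fleet-compute comparison.
# All inputs are the figures quoted in the article, not measurements.

fleet_size = 100_000_000        # cars, Musk's long-term target
inference_per_car_kw = 1        # kW of high-performance inference per car

fleet_total_gw = fleet_size * inference_per_car_kw / 1_000_000   # kW -> GW
print(f"Fleet inference capacity: {fleet_total_gw:.0f} GW")       # 100 GW

hyperscale_dc_mw = 100          # upper end of a single hyperscale data center
dc_equivalents = fleet_total_gw * 1_000 / hyperscale_dc_mw
print(f"Hyperscale data centers needed to match it: {dc_equivalents:.0f}")  # ~1,000
```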
And all that potential computing power would already be built, paid for and sitting in driveways.
Image: Tesla
Tesla’s full self-driving computer, known as Hardware 4, is designed to approach the kind of performance seen in high-end data center chips.
And a next-generation system called AI5 is in development that could deliver several times more processing power, giving every Tesla the kind of onboard compute once reserved for data centers.
What’s more, each vehicle already contains a high-performance processor and power system capable of running complex AI tasks. Each one already has a built-in thermal-management system that keeps chips cool and batteries balanced. And every car is connected to Tesla’s cloud through the same over-the-air update network that pushes new software and maps.
The difference is that, unlike a server rack, these systems spend most of their time doing nothing. Because the average car sits parked 95% of the day.
So Musk’s pitch is simple: put those idle processors to work.
If you could borrow a little bit of energy and compute from every parked Tesla, you could form a global computing grid that would make today’s cloud networks look far too centralized and inefficient by comparison.
Need to run an image-recognition model, simulate an autonomous-driving scenario or process video data?
Tesla could parcel out those jobs across millions of cars overnight, roughly as in the sketch below.
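To make that concrete, here is a minimal, purely hypothetical sketch of how a fleet scheduler might decide which parked cars get work. The Car fields, the eligibility rules and the round-robin dispatch are my own illustrative assumptions, not anything Tesla has described:

```python
# Hypothetical fleet-inference scheduler. Names and rules are placeholders
# to illustrate the idea of a distributed inference queue, nothing more.
from dataclasses import dataclass

@dataclass
class Car:
    vin: str
    parked: bool        # only parked cars accept work
    plugged_in: bool    # avoid draining the battery
    opted_in: bool      # owner consent, per the privacy concerns below

def eligible(car: Car) -> bool:
    return car.parked and car.plugged_in and car.opted_in

def dispatch(jobs: list[str], fleet: list[Car]) -> dict[str, str]:
    """Round-robin inference jobs across whichever cars are idle right now."""
    nodes = [c for c in fleet if eligible(c)]
    if not nodes:
        return {}
    return {job: nodes[i % len(nodes)].vin for i, job in enumerate(jobs)}

fleet = [Car("5YJ3-001", True, True, True), Car("5YJ3-002", False, True, True)]
print(dispatch(["classify_clip_42", "sim_scenario_7"], fleet))
```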
This could give Tesla a potential moat that no other automaker or cloud company could easily match.
After all, GM and Ford don’t have proprietary chips like the AI5 in their cars. And Amazon doesn’t have 5 million connected vehicles plugged into its cloud.
It would also help shift AI from centralized supercomputers to distributed inference. That’s the same kind of edge computing model that powers smartphones, drones and industrial robots today.
Because in this scenario, the network wouldn’t need to exist in one central place.
It would live wherever a Tesla is parked.
Here’s My Take
If Musk can actually execute on this wild idea, Tesla’s fleet could rival the largest AI compute clusters on Earth.
But there are hurdles to clear before it can become reality.
Running inference jobs on car batteries could shorten their lifespan if they aren’t managed carefully.
Some owners might refuse to let their car be used for Tesla’s compute work, even if they’re compensated. And data-privacy laws in Europe and California would require consent and transparency.
But Tesla already has experience orchestrating massive distributed systems. Every time it updates Autopilot or trains new vision models, it collects and processes video data from millions of cars worldwide.
The difference here is that Musk would want the Tesla fleet not just to train AI, but to run it.
In this future, Tesla’s cars would stop being mere vehicles and start acting as mobile computing assets. Owners might opt in through software, allowing their cars to rent out compute cycles while parked, earning them credits or cash in return.
For Tesla, it would be an entirely new revenue stream layered on top of the existing fleet. And like Musk’s robotaxi business, it would scale automatically.
Because every new car sold would grow the network’s computing power.
It’s a radical idea. And it would represent a radical shift for the company. If Tesla can pull it off, Musk could end up running the world’s most powerful, most distributed AI network…
Without ever building a data center.
Regards,
Ian King
Chief Strategist, Banyan Hill Publishing
Editor’s Note: We’d love to hear from you!
If you want to share your thoughts or suggestions about the Daily Disruptor, or if there are any specific topics you’d like us to cover, just send an email to dailydisruptor@banyanhill.com.
Don’t worry, we won’t reveal your full name in the event we publish a response. So feel free to comment away!




















