Uber has more than 20 autonomous vehicle partners, and they all need one thing: data. Now the company says it will make that available through a new division called Uber AV Labs.
Despite the name, Uber is not returning to developing its own robotaxis, which it stopped doing after one of its test vehicles killed a pedestrian in 2018. (Uber eventually sold off the division in 2020 in a complex deal with Aurora.) But it will send its own cars out into cities, fitted with sensors, to collect data for partners like Waymo, Waabi, Lucid Motors, and others, though no contracts are signed just yet.
Broadly speaking, self-driving cars are in the middle of a shift away from rules-based operation and toward relying more on reinforcement learning. As that happens, real-world driving data has become enormously valuable for training these systems.
Uber told TechCrunch that the autonomous vehicle companies that want this data the most are the ones that have already been gathering a lot of it themselves. It's a sign that, like many of the frontier AI labs, they've come to realize that "solving" the most extreme edge cases is a numbers game.
A physical limit
Right now, the size of an autonomous vehicle company's fleet creates a physical limit on how much data it can collect. And while many of these companies build simulations of real-world environments to hedge against edge cases, nothing beats driving on actual roads, and driving a lot, when it comes to finding all the strange, difficult, and flat-out unexpected situations that cars wind up in.
Waymo offers an example of this gap. The company has had autonomous vehicles in operation or in testing for a decade, and yet its current robotaxis have recently been caught illegally passing stopped school buses.
Accessing a larger pool of driving data could help robotaxi companies solve some of these problems before, or as, they crop up, Uber's chief technology officer Praveen Neppalli Naga told TechCrunch in an exclusive interview.
And Uber won't be charging for it. At least not yet.
"Our goal, primarily, is to democratize this data, right? I mean, the value of this data and having partners' AV tech advancing is far greater than the money we can make from this," he said.
Uber's VP of engineering Danny Guo said the lab has to build the basic data foundation first before it figures out product-market fit. "Because if we don't do this, we really don't believe anyone else can," Guo said. "So as someone who can potentially unlock the whole industry and accelerate the whole ecosystem, we believe we have to take on this responsibility right now."
Screws and sensors
The new AV Labs division is starting out small. So far, it has just one car (a Hyundai Ioniq 5, though Uber says it's not married to a single model), and Guo told TechCrunch that his team was still literally screwing on sensors like lidars, radars, and cameras.
"We don't know if the sensor kit will fall off, but that's the scrappiness we have," he said with a laugh. "I think it will take a while for us to, say, deploy 100 cars to the road to start collecting data. But the prototype is there."
Partners won't receive raw data. Once the Uber AV Labs fleet is up and running, Naga said the division will "have to massage and work on the data to help match to the partners." This "semantic understanding" layer is what the driving software at companies like Waymo will be pulling from to improve a robotaxi's real-time path planning.
Even then, Guo said there will likely be an interstitial step, where Uber will essentially plug a partner's driving software into the AV Labs cars to be run in "shadow mode." Any time the Uber AV Labs driver does something different from what the autonomous vehicle software does in shadow mode, Uber will flag that to the partner company.
This will not only help uncover shortcomings in the driving software, but also help train the models to drive more like a human and less like a robot, Guo said.
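Conceptually, shadow-mode mining works by comparing, at each timestep, what the human driver actually did against what the onboard software would have done, and flagging the moments where the two diverge. Below is a minimal sketch of that comparison logic in Python; the `Action` fields, tolerance values, and function name are illustrative assumptions, not Uber's or any partner's actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class Action:
    steering: float  # steering angle in radians (assumed convention)
    accel: float     # longitudinal acceleration in m/s^2

def flag_divergences(human_log, shadow_log, steer_tol=0.1, accel_tol=0.5):
    """Return the timestep indices where the shadow-mode planner's
    proposed action differs meaningfully from the human driver's.
    Tolerances are arbitrary illustrative thresholds."""
    flags = []
    for t, (human, shadow) in enumerate(zip(human_log, shadow_log)):
        if (abs(human.steering - shadow.steering) > steer_tol
                or abs(human.accel - shadow.accel) > accel_tol):
            flags.append(t)
    return flags

# Example: the planner agrees at timesteps 0 and 2, but at timestep 1
# the human steered while the shadow software would have gone straight.
human_log = [Action(0.0, 1.0), Action(0.2, 0.0), Action(0.0, -2.0)]
shadow_log = [Action(0.0, 1.0), Action(0.0, 0.0), Action(0.0, -2.0)]
print(flag_divergences(human_log, shadow_log))  # [1]
```

In a real system the flagged timesteps would point back to the full sensor recordings around those moments, which is the data a partner would actually want to inspect or train on.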
The Tesla approach
If this approach sounds familiar, it's because it's essentially what Tesla has been doing to train its own autonomous vehicle software over the last decade. Uber's approach lacks the same scale, though, as Tesla has millions of customer cars driving on roads around the world every single day.
That doesn't bother Uber. Guo said he expects to do more targeted data collection based on the needs of the autonomous vehicle companies.
"We have 600 cities that we can pick and choose [from]. If the partner tells us a specific city they're interested in, we can just deploy our [cars]," he said.
Naga said the company expects to grow the new division to a few hundred people within a year, and that Uber wants to move quickly. And while he sees a future in which Uber's whole fleet of ride-hail vehicles could be leveraged to collect even more training data, he knows the new division has to start somewhere.
"From our conversations with our partners, they're just saying: 'Give us anything that would be helpful.' Because the amount of data Uber can collect just outweighs everything that they can possibly do with their own data collection," Guo said.
