
Just How Far Ahead Is Tesla In Self-Driving?

Hamartia Antidote

https://www.forbes.com/sites/greats...-ahead-is-tesla-in-self-driving/#37f136961b24

Autonomous driving has emerged as a hot buzzword in the automotive industry over the last few years, with companies ranging from mainstream automakers such as General Motors to Silicon Valley players such as Alphabet-backed Waymo looking to make a dent in the market. However, electric vehicle pioneer Tesla (NASDAQ: TSLA) appears to have a sizable early lead in this space, both in terms of autonomous miles driven and monetization of its self-driving technology. Having delivered over 780k vehicles since its inception, most of them with pre-installed self-driving hardware that users can unlock by paying for software, the company has developed a meaningful self-driving business. In this analysis, we compare Tesla’s miles logged with rivals’ and size up the near-term revenue potential of its autonomous driving software.


Tesla Is Approaching 2 Billion Self-Driving Miles Driven


  • Tesla’s total autonomous miles logged have grown exponentially, from 0.1 billion in May 2016 to an estimated 1.88 billion as of October 2019.
  • This is a crucial metric, as self-driving algorithms are based on machine learning, and more training data typically makes the algorithms smarter.
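The second bullet can be illustrated with a toy example (hypothetical code, not from the article): averaging more noisy observations yields a more accurate estimate of the underlying quantity, which is the basic statistical reason data-hungry learning systems tend to improve with larger driving logs.

```python
# Toy illustration of why more data helps a learning system:
# estimating a fixed quantity from noisy samples gets more
# accurate as the number of samples grows.
import random

TRUE_VALUE = 5.0  # the quantity being estimated

def mean_abs_error(n_samples: int, trials: int = 200) -> float:
    """Average estimation error when fitting from n_samples noisy points."""
    rng = random.Random(42)  # fixed seed for reproducibility
    total_error = 0.0
    for _ in range(trials):
        samples = [TRUE_VALUE + rng.gauss(0.0, 1.0) for _ in range(n_samples)]
        estimate = sum(samples) / n_samples
        total_error += abs(estimate - TRUE_VALUE)
    return total_error / trials

few = mean_abs_error(10)     # small "fleet": noisy estimate
many = mean_abs_error(1000)  # large "fleet": far more accurate
```

Real self-driving training is of course far more complex than averaging, but the scaling intuition is the same: error shrinks as logged data grows.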

Tesla’s Log Of Autonomous Driving Data Is Orders Of Magnitude Higher Than Rivals

  • Over 2018, Tesla likely logged about 500 million self-driving miles across all geographies.
  • In comparison, rival autonomous driving tech companies Waymo and GM’s Cruise drove just 1.3 million and 447k miles, respectively, in California – their primary test market, which likely accounts for the bulk of their total miles logged.

Tesla’s Lead May Be Wider Still, As It Continuously Gathers Data From All Its Vehicles

  • Tesla’s autonomous driving hardware relies on mature technology – radar, ultrasonic sensors, and passive video cameras – which is cheaper than the lidar (laser-based) systems some rivals use.
  • This enables the company to fit the hardware as standard in all its vehicles, whether or not a user pays to enable the software.
  • As the company’s vehicles are estimated to have driven over 16.8 billion miles in total thus far, this could be further enhancing Tesla’s log of driving data.

Tesla Is Likely To Make Over $1.5 Billion This Year From Self-Driving Software Sales

For more details on Tesla’s self-driving software sales, view our interactive dashboard analysis.
 
Hyundai to test self-driving vehicles as shuttle service in Irvine

Hyundai is set to launch a ride-sharing program with autonomous SUVs in Irvine. (Courtesy of Hyundai)
By ALICIA ROBINSON | arobinson@scng.com | The Orange County Register
PUBLISHED: October 25, 2019 at 8:02 am | UPDATED: October 25, 2019 at 5:41 pm


Hyundai is about to begin road-testing a free, autonomous ride share service that will shuttle passengers between 13 destinations around Irvine.

Called BotRide, it’s set to launch Nov. 4 and will use a smartphone app through which people can find a nearby stop and request a ride in a self-driving Hyundai KONA electric SUV during pre-set hours of operation.

A human driver and passenger will serve as safety backups in Hyundai’s self-driving ride-share program, which will be tested in Irvine over three months. (Courtesy of Hyundai)

The three-month Hyundai pilot program, which will run through the end of January, is one of only four the state has permitted to carry passengers in autonomous vehicles, although 64 companies are allowed to test self-driving cars with a human in the driver’s seat in California.

The SUVs in Irvine will have not one, but two humans in front for safety, said Daniel Han, advanced product strategy manager for Hyundai Motor USA, in an interview Thursday, Oct. 24.


The person in the driver’s seat will be able to take control of the car at any time if needed, and the front passenger will check the surroundings and driving conditions against a tablet that reflects what the car’s various cameras and sensors are picking up.

Tech companies have been testing autonomous vehicles in California and other states for several years, but the California Public Utilities Commission first granted permission for companies to transport riders in December 2018.

Hyundai partnered with Pony.ai, which created the self-driving technology being used in Irvine, and Via, maker of the mobile app for BotRide, for its pilot ride-share service.

The commercial rollout of autonomous cars with no human backstop is probably years away, and even then Hyundai doesn’t expect people to stop owning and driving their own cars, Han said, but the company is preparing to offer options that suit a range of needs.

“We think that there’s going to be a lot more choices for consumers to get around,” Han said.


That’s one reason the company chose Irvine. BotRide is being marketed to UC Irvine students, who may not have cars or if they do, struggle to find parking, Han said. The company also is collaborating with researchers in the university’s business and engineering schools.

Nick Shaffer, director of external relations for the UCI Paul Merage School of Business, called the collaboration “an opportunity for our school and students to be on the forefront of digital transformation.”

“As we prepare our students to be leaders in the digitally driven world, this immersive experience allows them to gain first-hand insight into how technology is disrupting the business landscape.”

Irvine was also picked as an “ideal suburban setting” that also “represents a large swath of the United States,” Han said.

But the city has a reputation as one of the safest in the nation, and Irvine Councilman Michael Carroll said he feared the autonomous pilot program could put that status at risk.

“These cars are a dangerous novelty, they have caused fatal accidents, and accidents resulting in serious injury,” Carroll said after learning about the program at a City Council meeting. “I am not willing to gamble the lives of our residents and our children and our families to support a technology that nearly two-thirds of Americans say they would not even buy, according to a recent Reuters poll.”

Since 2016, news reports have documented a handful of deaths involving self-driving cars in the U.S., most of them Teslas.

Of 73 autonomous vehicle crash reports in California so far this year, two involved Pony.ai systems. Both were minor fender-benders and in neither case did the self-driving vehicle appear to be at fault.

Irvine officials noted the city has no authority in permitting autonomous cars using city streets, and companies are not required to notify communities where they plan to test vehicles.

Hyundai officials said safety is their top priority: in addition to the two humans in every car, the vehicles being tested in Irvine carry multiple redundant systems – high-definition maps, cameras, and radar – that gather real-time information. The cars will only use surface streets and won’t go on freeways.

Hyundai spokesman Miles Johnson said the 10 BotRide SUVs, which are red with yellow and white detailing, will stop at Irvine City Hall, several apartment complexes and commercial destinations such as Crossroads Plaza, Trade Food Hall and Culver Plaza.

After the Nov. 4 launch, people will be able to go online to pre-register to become riders. The company will open the program to batches of riders with the goal of getting several hundred signed up, Johnson said.

While university students and staff are the program’s target audience, Han said ultimately it will be open to anyone living in Irvine.

“We spent a lot of money and a lot of man hours to bring this project to life,” he said. “So naturally our goal is to have a lot of ridership.”

Editor’s note: This story has been updated to correct the number of crashes involving autonomous vehicles in 2019.
 
I’m still waiting for my Jetsons car
 
I’m still waiting for my Jetsons car

Why? I thought you just bust outta the ceiling and fly where you wanna.

At least go by a regular normie name like Clark Kent or something bud if you wanna disguise well. SHEESH. Amateur aliens.
 
Why? I thought you just bust outta the ceiling and fly where you wanna.

At least go by a regular normie name like Clark Kent or something bud if you wanna disguise well. SHEESH. Amateur aliens.

Even a demi-god likes to chill out :bounce:
 
https://electrek.co/2020/06/18/tesla-approach-self-driving-harder-only-way-to-scale/

Tesla admits its approach to self-driving is harder but might be only way to scale




Tesla’s head of AI admitted that the automaker’s approach to self-driving is harder than what most companies in the industry are doing, but he says it’s the only way to scale.

There are dozens of high-profile companies working on solving self-driving and almost as many different approaches, but they fall into two main camps: those who rely mainly, if not entirely, on computer vision, and those who rely on HD mapping.

Tesla falls in the former category of relying on computer vision.

Andrej Karpathy, Tesla’s head of AI and computer vision, is leading this effort.

Earlier this week, he participated in a CVPR’20 workshop on “Scalability in Autonomous Driving,” during which he gave an update on the status of Tesla’s program and talked about the scalability challenges.

During the presentation, Karpathy shared a video of Tesla’s self-driving development software demonstration doing a turn and then Waymo’s self-driving prototype doing the same.

He highlighted how the two maneuvers look exactly the same, but the decision-making powering them is completely different:

Waymo and many others in the industry use high-definition maps. You have to first drive some car that pre-maps the environment, you have to have lidar with centimeter-level accuracy, and you are on rails. You know exactly how you are going to turn in an intersection, you know exactly which traffic lights are relevant to you, you know where they are positioned and everything. We do not make these assumptions. For us, every single intersection we come up to, we see it for the first time. Everything has to be solid — just like what a human would do in the same situation.

Karpathy admits that this is a hard problem to solve.

However, the engineer explains that Tesla aims for a scalable self-driving system deployable in millions of cars on the road, and he argues that Tesla’s vision-based system is easier to scale:

Speaking of scalability, this is a much harder problem to solve, but when we do essentially solve this problem, there’s a possibility to beam this down to again millions of cars on the road. Whereas building out these lidar maps on the scale that we operate in with the sensing that it does require would be extremely expensive. And you can’t just build it, you have to maintain it and the change detection of this is extremely difficult.

The engineer described the map-based approach as a “non-scalable approach.”

He did say that Tesla also builds maps and uses “all kinds of fusions between vision and the maps,” but its maps are not centimeter-level accurate, so the cars can’t rely on them to navigate.

Tesla has to be able to handle any situation as if it were seeing it for the first time.

Karpathy explains how they accomplish that with only “a few dozen people” working on neural networks.

Everything is built around a general computer vision infrastructure, on top of which they create new tasks. While only a few dozen people work on neural networks, a “huge” team works on labeling.

In other words, they separate the core vision detection system from the individual tasks the system needs to achieve, such as detecting all types of stop signs.
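As a loose sketch of that separation (hypothetical code, not Tesla’s actual architecture), one shared feature extractor can feed any number of small task heads, so adding a new task reuses the same core computation:

```python
# Hypothetical sketch: a shared "backbone" feature extractor whose
# output feeds several independent task heads. The functions and
# weights here are made up for illustration.
from typing import Callable, Dict, List

def backbone(image: List[float]) -> List[float]:
    """Stand-in feature extractor: normalizes input values to sum to 1."""
    total = sum(image) or 1.0
    return [p / total for p in image]

def make_head(weights: List[float]) -> Callable[[List[float]], float]:
    """Each task head is a small function applied to the shared features."""
    def head(features: List[float]) -> float:
        return sum(w * f for w, f in zip(weights, features))
    return head

# Task-specific heads (illustrative names) share one backbone pass.
heads: Dict[str, Callable[[List[float]], float]] = {
    "stop_sign": make_head([0.9, 0.1, 0.0]),
    "lane_edge": make_head([0.1, 0.2, 0.7]),
}

image = [2.0, 1.0, 1.0]
features = backbone(image)  # computed once, reused by every head
scores = {name: head(features) for name, head in heads.items()}
```

In a real system both the backbone and the heads would be learned neural networks, but the structural point is the same: one shared computation, many cheap task-specific outputs.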

The engineer had some words for the competition relying on maps:

Do not assume that we can get away as an industry with HD lidar maps for global deployment of these features. I would take lidar maps, and especially the flow of all the lanes, traffic, and so on, and think about how you can predict an intersection without assuming lidar maps.

You can watch the full presentation via the link above.
 
