Tesla driver killed while using autopilot was watching Harry Potter, witness says

No matter how good a machine is, it is always the person behind the machine who has to make the final decision. Sensors, computers, and all the high-tech gadgetry can certainly help give the human the best possible information about the surroundings so that he can make the best decision. But the decision will always rest with the human. Machines can never replace human judgement.
If we give the decision power to the machines, we are doomed; it is as good as signing your own death warrant.
 
Tesla Autopilot doesn't mean you can just sleep behind the wheel; Tesla has openly said you need to be aware of the situation and intervene if necessary. The article says he went into the intersection fast, and everyone who drives a car knows it's basically impossible to stop in time if you don't slow down.
It's like setting cruise control but not braking when there is something in front of you.

If he is on Autopilot, isn't the A.I. driving instead, which means it was the one accelerating up to 130 miles per hour? Unless the driver was accelerating while leaving Autopilot engaged.
 

This open letter was announced on July 28 at the opening of the IJCAI 2015 conference.
Journalists who wish to see the press release may contact Toby Walsh.
Hosting, signature verification and list management are supported by FLI; for administrative questions about this letter, please contact Max Tegmark.

Autonomous Weapons: an Open Letter from AI & Robotics Researchers
Autonomous weapons select and engage targets without human intervention. They might include, for example, armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions. Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is — practically if not legally — feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.

http://futureoflife.org/open-letter-autonomous-weapons

You're not the only one who thinks so.
 
Yes, but my contention is that it has been allowed on the roads too early. It needs a lot more testing. For example, the incident above was not a sabotage attempt by anyone but a failure of the Autopilot due to its limited capabilities and vision. People cannot be used as lab rats; Tesla needs to do a lot more testing. A billionaire cannot just forge ahead. Such incidents will only cause a decline in the sales and popularity of not only Tesla but all autonomous cars. There is already a strong lobby against electric cars, and perhaps you know that Tesla is not allowed to be sold in many states.

The problem here was not just due to technical errors but also due to driver error. The driver was preoccupied with watching a movie, despite the fact that Tesla has made it clear from the get-go that its Autopilot is still very limited and must be monitored by the driver.

I don't find Tesla's technology completely at fault here, nor do I think it is too early for the tech to hit the road.
 
You set the speed just like cruise control; if there is an obstacle, it will slow down and keep the car on track. But that doesn't mean you can just lean back and sleep; you still need to intervene if necessary (a rough sketch of that logic is at the end of this post).

An airplane's autopilot can do everything autonomously, including takeoff and landing, but the pilots still need to stay focused, which isn't always the case; there have already been crashes because pilots trusted the autopilot too much.
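Since the question keeps coming up, here is a minimal sketch of the kind of control loop being described: a set speed like cruise control, slowing for obstacles, and the driver able to override at any time. All of the names and numbers (control_step, SAFE_GAP, and so on) are made up for illustration under those assumptions; this is not Tesla's actual software.

# Hypothetical adaptive-cruise-style control step: hold the driver's set
# speed, slow down when something is ahead, and yield to the driver.
def control_step(set_speed, current_speed, lead_distance, driver_braking):
    SAFE_GAP = 50.0  # assumed following distance, in meters

    if driver_braking:
        # The human always wins: any intervention overrides the system.
        return 0.0
    if lead_distance < SAFE_GAP:
        # Obstacle ahead: scale the target speed down as the gap closes.
        return min(set_speed, current_speed * lead_distance / SAFE_GAP)
    # Clear road: hold the speed the driver dialed in.
    return set_speed

# Example: set speed 70 mph, car 20 m ahead -> system targets 28 mph.
print(control_step(set_speed=70.0, current_speed=70.0, lead_distance=20.0, driver_braking=False))

The first branch is the whole point of this thread: the design assumes a human is watching and ready to take over, which is exactly what watching a movie defeats.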
 
Autopilot update 8.0 is coming in the next few weeks and should address these kinds of rare situations. BTW, this is the same guy who escaped an accident just a few months ago thanks to the AP! Level 4 autonomy is still at least two years away, even for Tesla.
 

Except pilots don't engage the autopilot for takeoff and landing. For the rest of the flight, though, it is best left engaged. So on the highway, with Autopilot engaged, it still has to take the planned speed as input from the driver in advance? I assumed Autopilot had the power to accelerate up to 100 miles per hour and beyond on its own. The driver thought he was getting the advertised deal, hence the risk. Even with A.I. in control, it has its own limitations, which weren't addressed until this accident brought them to light.
 
OK, but there are more teams in the SEALs than just Team 6, I presume.

All the accidents seem to be happening with Team 6, though.
 
The Tesla Autopilot is still in public beta, so Tesla won't get into much trouble.
 