The Secret Fear Of The World's Biggest Auto Companies

Hello,
Interesting, since engineers and insurance companies have long known that the leading cause of accidents is people. It's the magical, disconnected thinking of most people that makes them paranoid about not being in control.
 
timhebb said:
Excellent point, which I'd overlooked in my previous defense of Google. But it reinforces the general notion that Google's so-called "failures" have resulted from social or legal issues, not technical ones.
I think it was probably 10 to 20 years ago that people blamed users for design failures. Surely not now ;)

If someone doesn't take into account social & legal issues, I'd consider it a big design failure.

Getting back on topic, I've never questioned the "technical feasibility" of autopilot. It already happens on planes and trains, for example. We already have cruise control, lane-departure warnings, and other automation in cars.

The question really is about adoption. That is something very difficult to predict with any kind of accuracy.

BTW, if my automated car gets into an accident, who gets the ticket?
 
evnow said:
...
BTW, if my automated car gets into an accident, who gets the ticket?
And there is one of the major societal barriers to adoption, especially in the litigious U.S. The five-page Wired article I posted a link to a couple of pages back discussed this issue, among many others. It will take regulatory changes (such as we're already seeing in California: http://articles.latimes.com/2012/sep/25/business/la-fi-mo-self-driving-car-law-20120925) as well as a considerable amount of time in the courts to work out just who's liable in an accident. Is it the car company, the software company if different, or is it still the driver?
 
evnow said:
timhebb said:
...
I think it was probably 10 to 20 years ago that people blamed users for design failures. Surely not now ;)
If someone doesn't take into account social & legal issues, I'd consider it a big design failure.
...
The question really is about adoption. That is something very difficult to predict with any kind of accuracy.
It seems we're quibbling about what constitutes a "design failure." When Edison tested 1,600 materials for a light bulb filament until he found one that met his criteria (carbonized bamboo), were all the previous efforts "design failures"? Perhaps by your definition; not by mine, because they were necessary steps in the process of achieving success.

You may argue that Google's "failures" were certified as such because they represented products that "went to market," that is, were offered to the public. But I'd say it just represents a new way of doing business: Google almost certainly knew many pilot projects were doomed to "failure" - just not which ones. For them it is merely testing on the open market rather than in a closed sandbox. They even use the descriptor "Google Labs" as a header for their experimental products. It just seems to me your definition of failure is a bit stiff - and antiquated.

About autonomous vehicle adoption: over the longer term, I don't believe it will be a free consumer choice. Those of us who follow the R&D fairly closely already know that autopilot will prove significantly safer than human driving, statistically. Once the technology is legally sanctioned and widely available, insurance company actuaries will begin adjusting rates accordingly, and before long human drivers will be forced to pay a significant premium for the privilege of driving the old-fashioned way. If the disparity in safety is significant enough (I suspect it will be), insurance premiums will force the average consumer/driver to give up the wheel for good. Only the Rich and the Reckless (nice TV soap opera title) will still be cruising 20th century style, and drive-it-yourself libertines may be shunned and relegated to the same social outcast status that smokers and Hummer owners hold today.
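
To make the actuarial arithmetic concrete, here's a toy sketch in Python. Every number in it is an invented assumption for illustration (the accident rates, the claim cost, the overhead multiplier); real premiums would of course be set from actual claims data.

# Toy premium model: expected yearly loss times an overhead/profit
# multiplier. All figures below are made-up assumptions.
def annual_premium(accident_rate, avg_claim_cost, overhead=1.25):
    return accident_rate * avg_claim_cost * overhead

# Assumption: autonomous cars crash 5x less often than human drivers.
human_rate, auto_rate = 0.05, 0.01  # accidents per car per year
claim = 20_000                      # average claim cost, dollars

print(annual_premium(human_rate, claim))  # 1250.0 per year, human-driven
print(annual_premium(auto_rate, claim))   # 250.0 per year, autonomous

A 5x gap in accident rates becomes a 5x gap in premiums - exactly the kind of spread that would price the average driver out of the old-fashioned way.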

BTW, I'm not necessarily in favor of this dystopian future, but I can't see where else the rise of the autonomous vehicle leads, and I do believe its rise is inevitable.
 