
An Emperor’s New Clothes Tale about Uber and Their Autonomous Cars, Or, When I Began Driving Autonomously at Sixteen Years of Age

Uber’s March 2015 “Not At Fault” Crash

When I began driving autonomously at sixteen years of age (i.e. without my father in the passenger seat guiding me and keeping me in compliance with California’s driving laws for teens), Dad came home one night from work incredulous at the words of one of his business associates, who also had a new-to-driving kid.

Removing his tie, Dad shared with my mom, “So, Mike says to me: ‘My son at seventeen only driving eight months and already has six accidents. Six. But, none of them his fault.’ ”

My mother shook her head as my sisters and I listened in.

“Can you imagine?” my father was laughing incredulously now. “Six accidents! None of them his fault!”

 

I could not help but remember my father’s dinnertime story as I read about Uber’s latest auto-driving accident. Last week a human-driven car and an automated Uber SUV collided at an intersection in Tempe, Arizona, and the Uber vehicle wound up on its side (photo above). Within hours, all Uber autonomous cars on the roads in Pittsburgh, San Francisco, and Arizona were pulled from the streets until the accident could be reviewed.

But within just a few days all those Uber vehicles were back on the roadways, because it was determined that the other driver had failed to yield to Auto-Uber. The driver of the human-driven vehicle was also cited for causing the accident.

I’m sure that the Uber owners are dancing gleefully at having been vindicated, proven innocent in this auto-car mishap.

It wasn’t Uber’s fault.

 

How many more “none of them his fault” accidents will Uber endure before the Naked Emperor is unmasked?

Because all American drivers know the probable scenario that actually played out here:

On US roads, we human drivers are called on to yield to other drivers by obscure little right-of-way laws and by ubiquitous red (formerly yellow) “YIELD” road signs. We human drivers know that, more often than not, even if we are the Yieldee, the Yielder is not actually going to yield to us. YIELD signs and laws are a nice sentiment, but Yieldees basically ignore the protection they promise – since we know most Yielders will ignore them too, we yield anyway in favor of avoiding crashes.

Uber can build all the actual traffic laws it wants into its autonomous vehicles, but it will never catch up with the unwritten laws, and law nuances, that all of us humans know. Nor will it ever properly assimilate the cultural driving norms specific to each US city. Those variations make it very difficult even for humans to drive safely – or without annoying the heck out of other drivers – when they attempt to drive in a new city.

Take, for example, the Los Angeles rule of two cars waiting in the intersection and making their left-turn move on the yellow light. When a car sits behind the white line waiting for oncoming traffic to somehow miraculously disappear, we honk, knowing the driver must be “a damn LA foreigner.”

Some years back, one of my nieces was driving down a country road in northern California when another car pulled up to a stop sign at a small crossroad. My niece continued on, knowing that she had the “right of way.” Unfortunately, my niece was a young driver and had not yet assimilated the wisdom of her grandfather, who had warned all his descendants to always assume that every other driver on the road is stupid and about to do the unexpected. Sure enough, the other driver made the incomprehensible decision to pull onto the highway just as my niece’s car reached the crossroad; my niece and her car ended up in a tree (she was, miraculously, not injured). Can Uber and other auto-car makers build Other Driver Stupidity and Illogic into their computer code?

 

Okay, so back to Auto-Uber. Just about any American driver who read the news story about “not our fault” Uber was shaking her/his head afterward. We all know that one of the primary ways we avoid accidents on our roads is by knowing that few if any drivers will ever actually yield to us when they have a YIELD sign and/or the right of way. Get real! We all know that we must be prepared to yield to the yielder – even if we are legally the yieldee.

Auto-Uber may be within his legal rights. He may be dancing right now (can autonomous cars dance yet?), knowing he was let off the hook. But how many Auto-Uber “not his fault” accidents need to occur before we wake up and realize how truly far we are from safely sharing our human-designed roadways, laws, and cultural nuances with artificially intelligent vehicles?



Self Driving Cars: the Emperor’s New Clothes Tale of Early 21st Century?

Self Driving Car

Yes, I am going out on a limb here. And if my words find any traction beyond my little corner of the Internet, surely there will be many people coming down hard on me. For there is BIG money behind this self driving car movement. Google is getting the most publicity for their efforts, but there are dozens of other companies staking their venture money (Tesla, Uber . . .) and their future reputations (Apple) in this zone.

Why am I dubious? Everyone KNOWS that Google and Apple and all the other Silicon Valley brainiacs can pull off just about anything, right? Who am I to doubt?

Well, I guess I’d say that who I am is a woman with a little more historical experience than some of the 20- and 30-something wunderkinds of San Fran, and also a person with a bit more of a handle on the grey areas between the 1s-and-0s logic of computer circuits. For starters.

Let me begin with something I tweeted recently (@TerryMediabench). It was in response to a tweet I read about how Tesla was “discovering” (via accidents, I think it was) that human drivers cannot be counted on to stay alert in self-driving car emergencies when human intervention is called for.

Elon Musk Admits Humans Can’t Be Trusted with Tesla’s Autopilot Feature: http://www.technologyreview.com/view/543241/elon-musk-admits-humans-cant-be-trusted-with-teslas-autopilot-feature/

Here is what I responded on Twitter:

Triple A warned YEARS ago that speed control too dangerous due to drivers’ attn lapses. I suggest you read history #Tesla #Musk & #Google

I remembered the AAA warning vividly because it was sent to me by a friend who knew that I trekked up to San Francisco (hometown) from Los Angeles (climate of choice) on a regular basis and made frequent use of my car’s cruise control on the straight and boring Highway 5 that connects the two metropolitan areas. I took the warning seriously and have not used cruise control since.

If humans can’t be counted on to take back their car’s speed control in an emergency, while they are still doing the steering and managing every other automobile function, how can they possibly be expected to pay the necessary attention in any other car emergency situation after the car has taken over ALL driving functions? Let’s get real here.

And now let’s add the element of time to this scenario:

An emergency, in most cases, is something that occurs in the space of very limited time. Like milliseconds. So try to imagine this driver having time to take back control of the car after putting down her/his phone, video game, nail file, harmonica, _______________ (fill in your preferred human auto-car personal activity), when the car suddenly cries out, “Alert! Human intervention needed. Now!”

Clearly, the auto-car coders were not aware of the Triple A cruise control warning history. What other history and human illogic are they missing or ignoring as they push forward with total confidence that they can not only pull off fully self driving cars, but do it in the next few years, as so many are touting?

I’m just getting started here.

Back later.
