Artificial Intelligence and Image Recognition Challenges
Everything’s Coming Up Roses!
One of the methods image recognition algorithms use is to “see” things by noting repeated patterns in an image. This can be very effective, but it can also create problems.
In the tea animation here, watch how the AI software sees the numerous roses in the bouquet and, noting that rose pattern, assumes there must be other roses elsewhere in the image – when there are not!
So, the end of the folded napkin becomes a rose. Rose patterns are seen in the blank back wall. And even the lavender bouquet decoration on the tea pot becomes a rose!
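This failure mode can be sketched with the simplest form of pattern matching: slide a small template across an image and flag every patch whose correlation with the template clears a threshold. The following is a toy illustration only, not the software behind the animation – the 3×3 “rose” template, the scene, and both thresholds are invented for the example – but it shows how loosening the match threshold turns a napkin fold into a rose:

```python
# Toy sketch of template matching and its false positives.
# The "rose" template, scene, and thresholds are invented for illustration.
import numpy as np

def match_template(image, template, threshold):
    """Slide `template` over `image`; return (row, col, score) for every
    position whose normalized correlation exceeds `threshold`."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    hits = []
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            p_norm = np.sqrt((p ** 2).sum())
            if p_norm == 0 or t_norm == 0:
                continue  # skip featureless (e.g. all-zero) patches
            score = (p * t).sum() / (p_norm * t_norm)
            if score >= threshold:
                hits.append((y, x, score))
    return hits

# A tiny "rose" template, and a scene containing one real rose plus a
# napkin fold that merely resembles it.
rose = np.array([[0., 9., 0.],
                 [9., 5., 9.],
                 [0., 9., 0.]])
scene = np.zeros((8, 8))
scene[1:4, 1:4] = rose                 # the real rose
scene[4:7, 4:7] = [[0., 7., 0.],       # napkin fold: rose-ish, not a rose
                   [7., 7., 7.],
                   [0., 7., 0.]]

strict = match_template(scene, rose, threshold=0.97)  # only the real rose
loose  = match_template(scene, rose, threshold=0.80)  # napkin "becomes a rose"
```

At the strict threshold only the true rose location at (1, 1) is reported; at the loose threshold the napkin-fold patch at (4, 4) scores about 0.96 and is reported as a rose too – a false positive of exactly the kind the animation plays with.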
While this might make an interesting surrealist work of art, we certainly would not want our autonomous car seeing roses where, in fact, there is a dog, or other vehicle, or fallen tree – or nothing!
When I began driving autonomously at sixteen (i.e. without my father in the passenger seat guiding me and keeping me in compliance with California’s driving laws for teens), Dad came home from work one night incredulous at the words of one of his business associates, who also had a new-to-driving kid.
Removing his tie, Dad shared with my mom, “So, Mike says to me: ‘My son at seventeen only driving eight months and already has six accidents. Six. But, none of them his fault.’ ”
My mother shook her head as my sisters and I listened in.
“Can you imagine?” my father was laughing incredulously now. “Six accidents! None of them his fault!”
I could not help but remember my father’s dinnertime story as I read about Uber’s latest auto-driving accident. Last week a car in Tempe, Arizona, and an automated Uber SUV collided in an intersection, and the Uber vehicle wound up on its side (photo above). Within hours, all Uber cars on the roads in Pittsburgh, San Francisco, and Arizona were pulled from the streets until the accident could be reviewed.
But within just a few days all those Uber vehicles were back on the roadways, because it was determined that the other driver had failed to yield to Auto-Uber. The driver of the human-driven vehicle was also cited for causing the accident.
I’m sure the Uber owners are dancing gleefully at having been vindicated and proven innocent of any auto-car mishap.
It wasn’t Uber’s fault.
How many more “none of them his fault” accidents will Uber endure before the Naked Emperor is finally exposed?
Because all American drivers know the very probable scenario that actually occurred here:
On US roads, we human drivers are called on to yield to other drivers by obscure little right-of-way laws and by ubiquitous red (sometimes yellow) “YIELD” road signs. We human drivers know that, more often than not, even if we are the Yieldee, the Yielder is not actually going to yield to us. YIELD signs and laws are a nice sentiment, but we Yieldees all basically ignore them, since we know that most Yielders will too – better to give up the right of way than to collide.
Uber can build all the actual traffic laws it wants into its autonomous vehicles, but it will never catch up with the unwritten laws, and legal nuances, that all of us humans know. Nor will it ever properly assimilate the cultural driving customs specific to each city in the US. Those variations make it very difficult even for humans to drive safely – or without annoying the heck out of other drivers – when they attempt to drive in a new city.
Take, for example, the Los Angeles rule of two cars waiting in the intersection and making their left-turn move on the yellow light. When a car sits behind the white line waiting for oncoming traffic to somehow miraculously disappear, we honk, knowing the driver must be “a damn LA foreigner.”
Some years back, one of my nieces was driving down a country road in northern California when another car pulled up to a stop sign at a small crossroad. My niece continued on, knowing that she had the “right of way.” Unfortunately, my niece was a young driver and had not yet assimilated the wisdom of her grandfather, who had warned all his descendants to always assume that every other driver on the road is stupid and about to do the unexpected. Sure enough, the other driver made the incomprehensible decision to pull onto the highway just as my niece’s car reached the crossroad; my niece and her car ended up in a tree (she was, miraculously, not injured). Can Uber and other auto-car makers build Other Driver Stupidity and Illogic into their computer code?
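Grandfather’s rule can at least be caricatured in code. The sketch below is entirely hypothetical – the class, function names, and every threshold are invented for illustration, not anything Uber actually runs – but it contrasts letter-of-the-law logic with a defensive rule that refuses to trust right of way unless the other driver is observably yielding:

```python
# Hypothetical sketch of "grandfather's rule": never trust right of way;
# proceed only when the other driver is observably yielding.
# All names and thresholds below are invented for illustration.
from dataclasses import dataclass

@dataclass
class OtherVehicle:
    distance_m: float    # distance to the conflict point, meters
    speed_mps: float     # current speed, meters per second
    decelerating: bool   # is it actually braking?

def naive_should_proceed(we_have_right_of_way: bool) -> bool:
    """Letter-of-the-law logic: if we have the right of way, go."""
    return we_have_right_of_way

def defensive_should_proceed(we_have_right_of_way: bool,
                             other: OtherVehicle) -> bool:
    """Assume the other driver may do the unexpected: proceed only if
    they are clearly stopped, visibly yielding, or far away and slow."""
    clearly_stopped = other.speed_mps < 0.5
    visibly_yielding = other.decelerating and other.speed_mps < 3.0
    far_and_slow = other.distance_m > 50 and other.speed_mps < 5.0
    return we_have_right_of_way and (clearly_stopped or visibly_yielding
                                     or far_and_slow)

# The crossroad moment from the story: the other car pulls out of the
# stop sign, close and accelerating, while the right of way is ours.
other = OtherVehicle(distance_m=15.0, speed_mps=4.0, decelerating=False)
naive = naive_should_proceed(True)                 # proceeds: legal, but a crash
defensive = defensive_should_proceed(True, other)  # waits
```

The point of the toy is that the defensive rule deliberately gives up legally guaranteed right of way whenever the sensors cannot confirm the other driver is behaving; encoding that kind of pessimism, for every unwritten custom in every city, is the hard part.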
Okay, so back to Auto-Uber. Just about any American driver who read the news story about the “not our fault” Uber was shaking her/his head afterward. We all know that one of the primary ways we avoid accidents on our roads is by knowing that few, if any, drivers will ever actually yield to us when they have a YIELD sign and/or we have the right of way. Get real! We all know that we must be prepared to yield to the Yielder – even if we are legally the Yieldee.
Auto-Uber may be within his legal rights. He may be dancing right now (can autonomous cars dance yet?), knowing he was let off the hook. But how many Auto-Uber “not his fault” accidents need to occur before we wake up and realize how truly far away we are from safely sharing our human-designed roadways, laws, and cultural nuances with artificially intelligent vehicles?
by interactive new media author & artist Terry Bailey