Humans trust other humans more than they trust computers and technology. We see it daily as people struggle to interact with technology: from “failure to adopt” to “problems keeping up,” people trust people, not computers. Which to me is very, very telling. Consider the number of people you actually trust. I bet you that circle is small. I bet some of the people you say you trust, you are actually skeptical of, or keeping close because you, in fact, don’t trust them. So for a human to trust another human over a computer? Telling!

A self-driving car was involved in a fatal crash on a Sunday evening, and the Arizona governor was banning the use of self-driving cars by Monday morning. Human-driven cars were involved in 8 fatal crashes on November 8, 2016, in Arizona, but the state has yet to ban human-driven cars. As a matter of fact, it has yet to even enact distracted-driving laws, making it one of THREE states lacking such laws. Thrillist ranked Arizona the 17th most likely state in which to die in a car accident, and second only to Florida in yearly pedestrian deaths. But according to the State of Arizona, self-driving cars are less fit for the road than a distracted driver.


March 2018



Let’s consider the recent autonomous Uber accident, in which a woman was fatally struck by the autonomous vehicle while crossing the street with her bike. The vehicle did have a safety driver behind the wheel who, in the event of a malfunction or issue, could take control of the vehicle. Because again, humans trust humans more than technology. Except in this scenario, the human was unable, for whatever reason, to prevent this fatal accident. I could give you a million reasons why a human would not be able to take over from an autonomous vehicle in time to prevent an impending accident, but I’ll just say distracted driving & leave you with this link.

According to police, the woman was crossing the street mid-block when she was struck. Tempe Police Sgt. Ronald Elcock reminded citizens to use crosswalks: “None of us ever want to go through this ever again, using crosswalks will definitely limit this from happening again.” To some this may come off as strange, a bit rude to the deceased, victim blaming; but to me, it simply signals he believes following the laws will help save lives, and I would argue autonomous cars are better at following laws than humans. Computers are better at following the rules than humans; that is literally what they do.

So how did the accident happen? User error. You can only program a computer, and thus a self-driving car, to follow a certain (sometimes strict) set of rules. The rules of the roadways are set, and they say you should only cross the street in the crosswalk or yield the right-of-way to vehicles. Jaywalking has caught up with plenty of people lately, Arizona professors & pedestrians alike.
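
To make that concrete, here is a toy sketch of what “following a strict set of rules” looks like to a computer. It’s in Python purely for illustration; every name and threshold in it is made up, and a real autonomous-vehicle stack is vastly more complicated than a handful of if-statements.

```python
# A toy, illustrative sketch of rule-following logic -- not how any real
# autonomous-vehicle system works. All names and thresholds are invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Pedestrian:
    distance_m: float   # how far ahead of the car, in meters
    in_crosswalk: bool  # detected inside a marked crosswalk?

def plan_speed(current_speed_mps: float, pedestrian: Optional[Pedestrian]) -> float:
    """Pick a target speed by applying the same fixed rules, every time."""
    if pedestrian is None:
        return current_speed_mps              # rule: no hazard detected, hold speed
    if pedestrian.in_crosswalk:
        return 0.0                            # rule: always stop for a crosswalk
    if pedestrian.distance_m < 30:
        return min(current_speed_mps, 5.0)    # rule: crawl past anyone mid-block
    return current_speed_mps                  # rule: far enough away, carry on

# The computer never gets bored, distracted, or creative with these rules.
print(plan_speed(17.0, Pedestrian(distance_m=25, in_crosswalk=False)))  # -> 5.0
```

The point isn’t the code; it’s that the machine applies the same rules identically every single time, which is more than most of us can say from behind the wheel.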



So maybe humans don’t want computers taking a life. Valid concern. But you can’t be on the side of the human in this trust fall if you are a stickler for the rules. You can’t be against the autonomous car if you too are annoyed by the jaywalkers who don’t yield to your green light. Who knows, maybe a little rule-following by some, like autonomous cars, could lead to a lot of rule-following by all pedestrians & bikers on the roadway. If nothing else, it’s better than beating the jaywalking out of people!