Other Perspectives on AVs

We’ve posted many articles over the years on the emerging technology that will someday make “Autonomous Vehicles” a reality.

Most traffic safety professionals see this as a hopeful sign that traffic fatalities could be reduced dramatically, since 90% of crashes are tied to drivers’ choices, attitudes, and actions behind the wheel.

Still, not everyone is comfortable with an 80,000-pound tractor-trailer (aka a ground-based “drone”) hurtling down the highway on its own, destined for some far-distant warehouse or retail center.

It’s easy to appreciate the trepidation of folks who, in their mind’s eye, see robot trucks running amok, creating devastation and putting CDL drivers out of work.

Recently, an editorial titled “I’ll Keep My Steering Wheel, Thank You Very Much” appeared (click HERE for the full story), summarizing both the ongoing advancement in technology and the concern of “what if” the technology isn’t everything we hoped it would be. From the article:

I’ll say right now that I’m not too pleased with where these folks think things are heading, especially as a majority seem to believe critical components such as steering wheels, throttle and brake pedals are going to disappear by 2035…Hey, I WANT a steering wheel in a vehicle – and a brake pedal for that matter! “Autonomous” technology may be all well and good for buses, trains, and the like, but if I want to go somewhere in my car I actually want to be able to drive it there!

Now, I’ll be the first to acknowledge that the author was being a bit sarcastic to make a point — we may not be comfortable with the thought of what’s coming down the road in the future, but we’re not there yet, either.

Over time, we have been witnessing the gradual introduction and fine-tuning of individual technologies that “assist” drivers with safe or efficient driving (e.g., lane departure warning, forward-looking radar, nighttime vision assist, movable headlamps).  Over the next couple of decades, we’ll see these independent systems linked together and offered in all vehicles as “standard” features instead of “Buck Rogers” add-ons.

What do you think?  Are you ready for vehicles with no steering wheel?  Or are you uncomfortable with that notion?

Safety Features of Autonomous Vehicles (AVs)

Autonomous Vehicles (AVs), or “self-driving cars,” incorporate a great deal of cutting-edge technology in order to drive down the road in a safe and predictable manner.

A recent article at “Tech Page One” (link HERE) did a great job summarizing key safety features of AVs and included a wonderful “infographic” as well.

Here is a short list of those features:

  • LIDAR, an acronym for Light Detection and Ranging, uses sensors to collect data about the position of the AV in relation to other nearby objects.  From the article: “A LIDAR device consists of a laser, a scanner and a specialized GPS receiver…Typically, airplanes and helicopters use this technology because it’s known for precision and accuracy in mapping nearby targets. In driverless cars, LIDAR automatically controls the steering, power delivery and braking…”
  • Adaptive cruise control & brake assist are two functions that help AVs control their speed as they go up and down hills, around curves, enter areas of congestion (stopped traffic), and so on.  Emergency brake assist is used when the AV is in danger of hitting a stopped object, or if it is overtaking another object too quickly (a minimal sketch of this idea follows this list).
  • Specialized cameras help AVs to recognize pedestrians and cyclists in real time so that the computer can adjust speed or travel path as needed.
  • Preloaded, detailed mapping helps the AV know where it is located in relation to buildings, roads, and other fixed objects.
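
To make the emergency brake assist idea concrete, here is a minimal, hypothetical sketch (in Python) of how consecutive range readings from a forward sensor such as LIDAR might be turned into a braking decision. The function name, sampling interval, and threshold are assumptions made for illustration, not details from the article or any production system.

```python
# Hypothetical sketch: emergency brake assist from forward range data.
# Assumes a forward sensor (e.g., LIDAR or radar) reports the distance to
# the nearest object ahead at a fixed sampling interval. Names and
# thresholds are illustrative, not any manufacturer's implementation.

SAMPLE_INTERVAL_S = 0.1      # time between range readings (seconds)
BRAKE_TTC_THRESHOLD_S = 2.0  # brake if predicted collision is this close

def should_emergency_brake(prev_range_m: float, curr_range_m: float) -> bool:
    """Estimate closing speed and time-to-collision (TTC) from two
    consecutive range readings and decide whether to brake."""
    closing_speed = (prev_range_m - curr_range_m) / SAMPLE_INTERVAL_S
    if closing_speed <= 0:   # object is holding distance or pulling away
        return False
    time_to_collision = curr_range_m / closing_speed
    return time_to_collision < BRAKE_TTC_THRESHOLD_S

# Example: object 20 m ahead, and we closed 1.5 m in the last 0.1 s
# (15 m/s closing speed, TTC ~1.3 s), so the system would brake.
print(should_emergency_brake(prev_range_m=21.5, curr_range_m=20.0))  # True
```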

Self-Driving Cars — When? (Maybe 2017)

A recent update report from C/NET (click HERE) summarizes the impressive progress Volvo has made in its effort to produce (and sell) a “level 3” self-driving car by 2017.  Its ultimate goal is to produce a “nearly uncrashable car” by 2020.

A “level 3” car can navigate along a designated road by itself with a human pilot in the driver’s seat “just in case.”  A “level 4” car is so completely automated that a human pilot is not required (think curling up in the back seat for a nap).
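
As a small illustration of that distinction, here is a hypothetical sketch encoding the informal “level 3” versus “level 4” split described above. The enum, comments, and helper function are assumptions made for this example, not an official definition of the automation levels.

```python
# Hypothetical sketch of the "level 3" vs. "level 4" distinction above.
# Illustrative only; not an official definition of automation levels.
from enum import Enum

class AutomationLevel(Enum):
    LEVEL_3 = 3  # drives itself on designated roads; human must stay ready to take over
    LEVEL_4 = 4  # fully automated; no human fallback driver required

def human_fallback_required(level: AutomationLevel) -> bool:
    """Return True if a human must remain ready to take the wheel."""
    return level is AutomationLevel.LEVEL_3

print(human_fallback_required(AutomationLevel.LEVEL_3))  # True
print(human_fallback_required(AutomationLevel.LEVEL_4))  # False
```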

Of course, many people are not sure they’d be ready to yield so much trust to an Autonomous Vehicle (AV).  So why are so many car manufacturers pushing forward on this concept?  There are probably as many reasons as design teams, but as reported in the C/NET article, for Volvo it is all about safety and its desire to see a crash-free world:

“Human error is behind almost all crashes,” Anders Eugensson, Volvo Cars’ director of government affairs said. It’s at least partly responsible 95 percent of the time, either thanks to negligence (drunk driving, distracted driving, falling asleep, etc.) or simply because a driver failed to avoid a preventable accident. If you can eliminate driver error you can eliminate nearly all accidents.

According to C/NET, Volvo has a head start toward its production-model, level 3 AV:

Volvo already manufactures cars that have all the laser, radar, sonar, and visual sensing equipment needed for autonomous driving. It makes up the company’s City Safety program, currently available in the US as part of a $2,100 technology package. A forward looking camera and laser scanner are built into a pod on the windshield, tucked behind the rear-view mirror, while a radar system lives in the nose, hidden beside the company’s unapologetically masculine logo.

The sensor package that enables City Safety is just the latest of a long list of safety innovations that reach back to the beginning of the company. Laminated glass, three-point seatbelts, side-impact airbags, whiplash-preventing headrests… all things that Volvo invented or adopted as standard equipment well before the rest of the industry.

Of course, Volvo’s team acknowledges that there is a substantial (but not insurmountable) gap between deploying a level 3 and level 4 production AV.  In fact, they estimate that an additional two decades’ worth of work will need to take place.  Still, they’re on a mission — to reduce crashes.

But what if a crash does happen?  Who would be responsible?

Who’s at fault when a self-driving car crashes? While all the other manufacturers are busy shrugging their shoulders, Volvo has made its position on this quite clear: when the car is being manually driven, the driver is at fault in an accident. But, if the car is in autonomous mode and causes a crash, Eugensson said Volvo will take responsibility. “It will be difficult to sell if the driver is still liable. It gives a false promise.” One needn’t be a talking lizard to know this should result in cheaper insurance premiums.

If you’re still interested in learning more about AV technology, we’ve published multiple posts about it here at our blog site; just use the search function located at the top of the page.

Autonomous Cars and “Ethical Crashing Algorithms”

In a recent article titled “The Problem with Self-Driving Cars: They Don’t Cry” (click HERE), the author investigates the question of how an artificial intelligence would make ethical choices about crash scenarios.

Citing a study by Noah Goodall (“Ethical Decision Making During Automated Vehicle Crashes”) (click HERE), the author asks “can robot cars be taught to make empathetic, moral decisions when an accident is imminent and unavoidable?”  In the news article, the scenario presented to illustrate the concern is as follows:

Consider a bus swerving into oncoming traffic. A human driver may react differently than a sentient car, for example, if she noticed the vehicle was full of school kids. Another person may swerve differently than a robot driver to prioritize the safety of a spouse in the passenger seat.

In my mind, it’s a much simpler scenario.  A self-driving car is taking me home at twilight.  I may not see as well during low-light transition hours such as dusk or dawn, but the super car has RADAR and special sensors which detect movement on the road ahead — is it a small child crossing the road?  Is it a roving raccoon or a dancing deer?  I would want the car to avoid the crash, but…

  • Swerving to avoid a child puts me at great risk of rolling the car over and suffering extensive injuries.
  • Trying to avoid a raccoon/deer/antelope may simply mean hitting the brakes and hoping we don’t crush the animal; I come away unhurt and without extensive damage to my very expensive robot car (rolling the car is much worse than a deer strike in most cases; see the toy sketch after this list).
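
To make that trade-off concrete, here is a purely hypothetical sketch of a consequentialist-style decision rule that weighs the two maneuvers differently depending on what the sensors believe is in the road. The object classes, harm scores, and function are invented for illustration; they come from neither the article nor any real AV software.

```python
# Purely hypothetical sketch of the trade-off discussed above: weigh the
# expected harm of each maneuver by the class of the detected object.
# All object classes and harm values are invented for illustration.

# Rough "harm scores" per (object, maneuver) outcome: higher is worse.
HARM = {
    ("child",  "swerve"): 40,   # rollover/injury risk to occupants
    ("child",  "brake"):  90,   # unacceptable risk of striking a person
    ("animal", "swerve"): 40,   # rollover risk outweighs an animal strike
    ("animal", "brake"):  10,   # brake hard, possibly strike the animal
}

def choose_maneuver(detected_object: str) -> str:
    """Pick the maneuver with the lowest assumed harm for this object class."""
    return min(("swerve", "brake"), key=lambda m: HARM[(detected_object, m)])

print(choose_maneuver("child"))   # swerve
print(choose_maneuver("animal"))  # brake
```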

Will a robot car be able to distinguish between a child and a wild animal?  If it can, will it react differently?  Would most people react differently?

It is my hope that the onboard sensors enable a much earlier alert and a greater chance of avoiding either collision.  Still, the author makes an interesting point in stating “There is no obvious way to effectively encode complex human morals in software.”  Further, the analysis offers:

According to Goodall, the best options for car builders are “deontology,” an ethical approach in which the car is programmed to adhere to a fixed set of rules, or “consequentialism,” where it is set to maximize some benefit—say, driver safety over vehicle damage. But those approaches are problematic, too. A car operating in those frameworks may choose a collision path based on how much the vehicles around it are worth or how high their safety ratings are—which hardly seems fair. And should cars be programmed to save their own passengers at the expense of greater damage to those in other vehicles?
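
As a rough illustration of the two frameworks Goodall names, here is a hypothetical sketch contrasting a rule-based (deontological) check with a cost-minimizing (consequentialist) choice. The rules, harm estimates, and option names are all assumptions made up for this example.

```python
# Hypothetical contrast of the two ethical frameworks from the quote above.
# Every rule, option, and number below is invented for illustration.

# Deontological approach: a fixed set of rules the car must never break.
RULES = [
    lambda option: not option["strikes_person"],  # never choose a path that hits a person
    lambda option: not option["leaves_roadway"],  # never leave the roadway
]

def permitted_by_rules(option: dict) -> bool:
    """Deontology: an option is acceptable only if it violates no rule."""
    return all(rule(option) for rule in RULES)

# Consequentialist approach: pick whichever option minimizes expected harm.
def expected_harm(option: dict) -> float:
    return option["occupant_injury_risk"] + option["other_party_injury_risk"]

def choose(options: list) -> dict:
    """Prefer rule-permitted options; among those, minimize expected harm."""
    allowed = [o for o in options if permitted_by_rules(o)] or options
    return min(allowed, key=expected_harm)

options = [
    {"name": "brake_straight", "strikes_person": False, "leaves_roadway": False,
     "occupant_injury_risk": 0.2, "other_party_injury_risk": 0.1},
    {"name": "swerve_right", "strikes_person": False, "leaves_roadway": True,
     "occupant_injury_risk": 0.4, "other_party_injury_risk": 0.0},
]
print(choose(options)["name"])  # brake_straight
```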

The article also suggests letting the human driver override the AI’s decision making, which is an interesting idea, but most people’s reaction times in a crash are far slower than a computer’s ability to reach a logical conclusion, so it’s doubtful that a person could realistically intervene in most cases.
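
A rough back-of-the-envelope comparison shows why. The reaction times below are commonly cited ballpark figures (roughly 1.5 seconds for human perception and reaction versus an assumed 0.1-second machine decision cycle), used here only to illustrate the point:

```python
# Back-of-the-envelope comparison of how far a vehicle travels before a
# human vs. an automated system can even begin to react. The reaction
# times are rough, commonly cited figures used only for illustration.

SPEED_MPH = 65
MPH_TO_MPS = 0.44704                # miles per hour -> meters per second
speed_mps = SPEED_MPH * MPH_TO_MPS  # ~29 m/s

HUMAN_REACTION_S = 1.5              # typical perception-reaction time
MACHINE_REACTION_S = 0.1            # assumed sensing/decision cycle

print(f"At {SPEED_MPH} mph, a human covers ~{speed_mps * HUMAN_REACTION_S:.0f} m "
      f"before reacting; an automated system covers ~{speed_mps * MACHINE_REACTION_S:.0f} m.")
# At 65 mph, a human covers ~44 m before reacting; an automated system covers ~3 m.
```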

The path forward is complex, but not so difficult that it should slow or stop the development of self-driving cars.  We would certainly hope that engineers, regulators, scientists, and other affected professionals in the transportation world are thinking about how to reconcile these areas of concern.

Our brave new future of hands-free driving sounds great when we consider that the vast majority of crashes could be avoided, but it’s troubling to think about the possible crash scenarios that have ethical implications.
