Some day, hopefully soon, I’ll have a Tesla. With this thought in mind I tend to find myself contemplating Autonomous Vehicles (AVs), most often while manually driving my current car.
What really gets me thinking is when I’m driving down a road that’s a bit off the beaten path. You know those winding and hilly roads you experience as you get further from the city? The ones with minimal shoulders, poor lane markings, and nothing but nature flashing by as you drive. Add a nice nighttime rainstorm and suddenly you realize just how far we have to go before our driving experience is truly hands free.
Autonomous vehicles are at a disadvantage
If you think about how we drive today it becomes clear that humans use an assortment of tools to direct our vehicle where we want it to go. We use a variety of signs, colors, lights, and markers to find our way. Symbols and methods that have been honed over many years of practice and adaptation—standardized, socialized, and used as test criteria to prove our very ability to navigate from point A to point B.
With AVs, we are trying to teach machines how to interpret signs and environments that have been built specifically for the human experience. Their design, content, and very location have been strategically developed for human consumption and understanding.
I mean, it makes sense of course. When they placed the first stop sign at the corner of two roads in Michigan in 1915, self-driving cars weren’t top of mind. However, there’s no doubt we are putting the machines at a distinct disadvantage now as we’re asking them to understand our world.
Moving beyond vehicle-centric models
Much of the focus to date for self-driving cars has been vehicle-centric: trying to make the car smart enough to see the environment around it and make sub-second decisions based on what it believes is happening outside. Perhaps instead we need to make it a bidirectional relationship and employ the surrounding environment as an active participant.
This means augmenting and modifying the infrastructure meant to guide the human driver to now actively guide the digital driver. Put another way, we need to help the self-driving car “see” better.
Other ways to help autonomous vehicles
To establish a more efficient and effective driving environment for AVs we have to think of ways that we can help the cars better understand what’s going on around them. We need to put ourselves in the mind of the machine and try to figure out how we can make their job easier.
Why can’t we deploy signs and sensors throughout our cities and roadways that provide input and feedback to autonomous vehicles? They don’t have to be overly complex and connected Internet-of-Things (IoT) devices but rather simple transmitters or signage that can easily be read and processed by computers. Imagine what the vehicle sees when it’s trying to interpret the surrounding environment.
What if there were signs with QR codes or other symbology designed specifically to aid computers in reading and interpreting information?
What if physical landmarks or signs were specifically designed to be read by LiDAR devices, effectively creating a vehicle sensory effect similar to braille?
What if the painted lines or road surfaces had other communication methods directly embedded that allowed cars to map roadways without solely relying on observation and forms of computer vision?
What if construction projects required the installation of a simple beacon, emitting a unique ID, at the corner of every intersection? This would allow AVs to know exact parameters of the intersection instead of having to decipher it on the fly.
What if stop lights could transmit their current color and remaining time before state change?
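To make that last idea concrete, here’s a minimal sketch of the kind of message a traffic-light beacon might broadcast. Everything in it, the `SignalBeacon` class, the field names, the compact JSON encoding, is a hypothetical illustration I’ve invented for this post, not any real vehicle-to-infrastructure standard; actual deployments would more likely build on something like SAE J2735’s Signal Phase and Timing messages.

```python
# Hypothetical sketch: a message a traffic-light beacon might broadcast
# so an AV doesn't have to visually classify the light. The field names
# and structure are invented for illustration, not a real V2I standard.
import json
from dataclasses import dataclass, asdict


@dataclass
class SignalBeacon:
    intersection_id: str      # unique ID, installed at construction time
    current_phase: str        # "red" | "yellow" | "green"
    seconds_to_change: float  # remaining time in the current phase

    def to_payload(self) -> str:
        """Serialize to a compact JSON string for broadcast."""
        return json.dumps(asdict(self), separators=(",", ":"))

    @classmethod
    def from_payload(cls, payload: str) -> "SignalBeacon":
        """Parse a received broadcast back into a structured message."""
        return cls(**json.loads(payload))


# A vehicle receiving this payload can act on exact state and timing
# data instead of inferring the light's color from camera frames.
msg = SignalBeacon("MI-DET-0042", "green", 4.5)
received = SignalBeacon.from_payload(msg.to_payload())
```

The point isn’t the particular format; it’s that a few bytes of structured data replace an entire computer-vision problem.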
Where do we go from here?
While we have contemplated some of this technology in the past, and much of it remains aspirational, it’s worth thinking about how we can start to help the evolution of autonomous vehicles with complementary approaches. The barrier to entry for many of the simpler technologies is so low that it may merit further near-term analysis and testing.
Rather than standing by and hoping the digital vehicles of the future can continue to learn and adapt to our traditional ways of communication, perhaps we should reassess and think more like the machine across the spectrum, enabling a more active and collaborative relationship between the machine and its surroundings.