Tactile Mobility’s Software Helps Smart Cars Feel the Road

Tactile Mobility’s new software is one of the industry’s latest attempts to improve the accuracy and safety of AV responses on the road
Photo: Pixabay

While quite a few startups are building autonomous vehicles themselves, others exist to improve the underlying AV technologies. According to SiliconANGLE, Tactile Mobility has released a new software module that lets AVs “feel” road conditions the way a human driver does, so they can respond to bumps and other hazards more accurately and safely.


The Tactile Mobility solution

CEO Amit Nisenbaum says the startup was founded to improve AV operation and navigation, reducing fuel consumption and enhancing the software and data systems that guide a vehicle’s responses on the road. He explains how the new software module addresses a gap he and his team noticed in the vehicle telematics field: “Everybody is doing Lidar and radar and cameras. [But there is…] a neglected segment with an additional set of sensors, that sense of tactility that all of us are using when we’re driving.”

Per SiliconANGLE, the software gathers data from non-visual sensors, applies artificial intelligence models to interpret it, and uses the result to generate a real-time map of the road. The AV thus has a more detailed internal map of the road it is traversing and can give passengers a safer, smoother ride. “If you’re approaching a pothole, the vehicle would tune the suspension to become harder or softer. If the vehicle is approaching black ice, it will probably want to slow down,” said Nisenbaum.
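The pipeline Nisenbaum describes (non-visual signals in, a road-condition estimate out, then an actuator response) can be sketched in a few lines. This is purely an illustrative toy, not Tactile Mobility’s actual software: the function names, signal choices, and thresholds are all assumptions made for the example.

```python
# Hypothetical sketch of the tactile-sensing pipeline described above:
# non-visual sensor signals -> model -> road-condition estimate -> response.
# All names, signals, and thresholds are illustrative assumptions.

from dataclasses import dataclass
from statistics import stdev


@dataclass
class SuspensionCommand:
    condition: str    # estimated road condition
    stiffness: float  # 0.0 (soft) .. 1.0 (hard)


def estimate_road(vertical_accel: list, wheel_slip: float) -> SuspensionCommand:
    """Classify the road from a window of simulated chassis/wheel signals."""
    roughness = stdev(vertical_accel)  # bumpy roads show high variance
    if wheel_slip > 0.3:
        # Low traction (e.g. black ice): soften suspension; the vehicle
        # would also want to slow down, per the quote above.
        return SuspensionCommand("low_traction", 0.2)
    if roughness > 1.0:
        # Pothole or rough segment: stiffen the suspension for stability.
        return SuspensionCommand("rough", 0.8)
    return SuspensionCommand("smooth", 0.5)


# Simulated sensor windows: vertical acceleration (m/s^2), slip ratio.
print(estimate_road([0.1, -0.2, 0.15, -0.1], wheel_slip=0.05).condition)  # smooth
print(estimate_road([2.5, -3.1, 2.8, -2.9], wheel_slip=0.05).condition)   # rough
```

A production system would replace the hand-picked thresholds with a trained model, as the article indicates, but the flow of data is the same.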


Other AV tech developments

Mcity opened on July 21, 2015. Driverless vehicle technology that lets vehicles read where other driverless vehicles are located could greatly reduce traffic collisions.
Photo: UM News Service

Tactile Mobility isn’t the only company focused on improving AV tech, though. Just last year, Microsoft and MIT collaborated on a new AI-based model to help AVs make better decisions by comparing their potential responses with what a human driver would do in challenging driving scenarios. And the University of Michigan is exposing AVs to realistic driving scenarios in Mcity, a unique man-made city covering 16 acres on North Campus.
