When people first learn about autonomous cars, most of them immediately intuit the system’s heavy reliance on data.
The car needs a constant positioning fix from GPS satellites, for instance, and the ability to send and receive messages from other vehicles on the road.
Whether it’s to find a destination or swerve around an unexpected obstacle, everyone knows that self-driving cars must be constantly hoovering up data from the outside world, and often feeding that data to advanced neural network algorithms to sift meaning from it in real-time.
Still, as amazing as those outward abilities are, what few people realize is that these vehicles may actually gather just as much data from inside the car as from outside of it.
Passengers in tomorrow’s autonomous vehicles will be subject to the focused attention of an advanced vehicular AI, and in many ways the quality and safety of their ride will be dictated by the vehicle’s ability to interpret human wishes and needs.
Traditional cars may run on gas, but autonomous cars run on data, and they’ll mine that data from anywhere they can.
In the future, you will talk to your car
The most obvious form of data input for autonomous cars is intentional: voice commands.
This isn’t quite as easy as it might seem, since right now almost all speech recognition systems require a cloud connection to transcribe audio at real-time speeds.
Voice control will finally turn the car into the hands-free robot chauffeur we’ve always dreamed of, but at present it’s just too primitive to function as the primary form of vehicular control.
To fulfill the potential of the autonomous car, we’ll need to improve either the speed and reliability of mobile data connections, or the speed and price of powerful onboard vehicular computers.
Thankfully, both of those figures are improving quickly, along with the efficiency of the algorithms in question.
This means that soon owners of autonomous cars should be able to say “take me home” to their dashboard, and have it not only identify their desired destination, but drive there without further question.
More advanced versions could even recognize implied commands, such as the request to turn around hidden in the exclamation, “I forgot my wallet!”
With more advanced, modern speech recognition, cars could even manage to understand these sorts of outbursts through the slur of inebriation, making a night out on the town both safer and more convenient.
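To make the idea concrete, here is a minimal, purely illustrative sketch of the intent-mapping step that would follow speech-to-text. Real systems use trained natural-language-understanding models rather than keyword matching, and every phrase and intent name below is hypothetical:

```python
# Hypothetical sketch: mapping a transcribed utterance to a driving intent.
# A production system would use a trained NLU model; keyword matching here
# is purely for illustration.

def interpret_utterance(text: str) -> dict:
    """Return a driving intent inferred from a transcribed utterance."""
    t = text.lower()
    if "take me home" in t:
        # Direct command: navigate to a saved destination.
        return {"intent": "navigate", "destination": "home"}
    if "forgot my wallet" in t:
        # Implied command: return to the previous stop.
        return {"intent": "return_to_last_stop"}
    return {"intent": "unknown"}
```

The interesting design question is the second branch: an “implied command” is not addressed to the car at all, so the system has to decide when an overheard remark should trigger an action and when it should be ignored.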
Autonomous cars will be attentive not just to words, but to actions as well
Passengers in autonomous cars communicate through more than their intentional voice commands.
In reality, involuntary communication may have just as much to teach a car about what a passenger wants.
There are extreme examples, such as if a passenger loses consciousness and needs the car to decide on its own to head for a hospital, but the utility of an attentive robot car goes far beyond safety.
Sufficiently aware software could detect inebriation in a passenger and require self-driving to remain in effect, thus preventing any driving while under the influence of intoxicating substances — even those that can’t be detected on a breathalyzer.
A car might be able to notice a passenger’s preference for a slightly slower, smoother ride to a destination, or it could see from their constant glancing at the time that they would prefer a slightly faster, more aggressive path through traffic.
And while cars will “learn” the most about best driving practices from looking outward at the vehicles around them, the reactions of passengers inside (say, boredom at a slow turn or distress at a fast one) could inform their future behavior as well.
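One way to picture that feedback loop is a single “aggressiveness” parameter nudged by inferred passenger reactions. The reaction labels, step size, and scale below are all invented for illustration; a real system would fuse camera, audio, and seat-sensor data through trained models:

```python
# Hypothetical sketch: tuning driving style from passenger reactions.
# Reaction labels and the 0.1 step size are invented for illustration.

def adjust_aggressiveness(current: float, reaction: str) -> float:
    """Nudge a 0-to-1 driving-aggressiveness setting based on an
    inferred passenger reaction."""
    if reaction == "distress":        # e.g. bracing on a fast turn
        current -= 0.1
    elif reaction == "impatience":    # e.g. repeatedly checking the time
        current += 0.1
    return max(0.0, min(1.0, current))  # clamp to the valid range
```

The clamp matters: no amount of passenger impatience should push the car past its safety envelope, so comfort feedback only ever moves the style within pre-approved bounds.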
Autonomy means talking back, as well
Most of the communication between human and vehicle flows from the human to the vehicle — but now and again, the opposite is necessary as well.
Vehicles will mostly inform passengers of pertinent information without requiring a particular response, for instance if there is a short delay ahead due to rail cars passing.
The goal can be to simply keep passengers informed, but at other times the car could think more actively about other human needs; knowing that a passenger is on the way to a grocery store, it could suggest heading to a closer one just a block from the vehicle’s current location.
“Autonomous” cars will also sometimes need to ask their passengers for direction in the face of apples-to-oranges decision points; faced with a traffic backup due to a fallen tree, should the car go around for a quicker route home, or wait in line for lower gas consumption?
By asking such questions a few times, vehicles could build up a behavioral profile for their owners, and make such decisions more aggressively in the future.
That sort of proactive data gathering could be just as important to shaping a car’s behavior as any volume of driving data from the outside world.
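A behavioral profile of this kind could be as simple as a tally of past answers, with the car falling back to asking whenever no option has clearly won yet. The class, question keys, and threshold below are hypothetical, a sketch of the idea rather than any real vehicle API:

```python
from collections import Counter
from typing import Optional

# Hypothetical sketch: a behavioral profile built from a passenger's answers
# to trade-off questions (e.g. "quicker route" vs. "lower fuel use").
# All names and the confidence threshold are invented for illustration.

class PreferenceProfile:
    def __init__(self, confidence_threshold: int = 3):
        self.votes = Counter()                  # (question, choice) -> count
        self.threshold = confidence_threshold   # answers needed before auto-deciding

    def record_choice(self, question: str, choice: str) -> None:
        """Store one answer the passenger gave when asked."""
        self.votes[(question, choice)] += 1

    def decide(self, question: str, options: list) -> Optional[str]:
        """Return the learned answer, or None if the car should still ask."""
        best = max(options, key=lambda o: self.votes[(question, o)])
        if self.votes[(question, best)] >= self.threshold:
            return best
        return None
```

For example, after a passenger picks “quicker route” at three separate traffic backups, `decide("backup", ["quicker", "efficient"])` would start returning `"quicker"` and the car would stop interrupting them to ask.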
Turn a car ride into a short vacation
By both listening to a passenger and watching their behavior, cars should also be able to greatly improve the experience of moving through traffic by tailoring the experience.
Not every passenger will want this sort of approach, of course, but those that do could have everything from the level of outside noise cancellation to the tint of the windows to the angle of the seat-back tailored to their apparent level of stress.
A person headed home with nothing else on the daily docket might receive a suggestion to stop for a treat on the way home, for instance.
There are, of course, still questions. It seems clear that privately owned cars would get their preferences from their owners, but what about public vehicles like autonomous taxis?
If there are multiple people in a self-driving car, whose needs should motivate the car’s actions?
Will it take some sort of average, or a majority rule approach?
Different operators and even manufacturers might end up with different answers to these questions.
All these ideas require advances in the data-gathering hardware and data-sifting software that allow real understanding of a human occupant, but it does seem that there will be ample time for that development to take place.
That’s because none of the most ambitious applications for autonomy can be considered until self-driving cars can drive everywhere, without any human intervention whatsoever.
That all-important ability is still a fair distance out, meaning that for the next several years the most creative minds in tech will be laser-focused on teaching your car how to learn from you.
There’s simply no telling how far these internal data collection technologies could advance, or how autonomous car data collection will operate, by the time these vehicles truly hit the road.
Free Data Collection Resources
Looking for resources to assist with collecting autonomous car data? Check out these helpful downloads:
The Ultimate Guide to Data Collection (PDF) – Learn how to collect data for emerging technology.
Eye Gaze Sample Set (Download) – Get a sample of high-quality eye gaze data.
Road, Car, and People Dataset (Download) – Training a system that requires road image data? Download our sample dataset.
Need help developing a custom dataset? Summa Linguae Technologies provides custom video & image data collection services to train your self-driving car AI.