It’s a problem faced every winter by millions of drivers: You come out to your car in the morning, or after a long day of work, only to find that a perfectly even blanket of snow has been laid across the entire landscape. Where’s the road? Most of us can remember the approximate boundaries of our driveway or the street nearest to our homes—but what about, say, a random mile of I-70 between Indianapolis and St. Louis? And what about the next exit ramp on that freeway? When there’s no visible road marking, how do you know where the soft shoulder ends and the ramp begins?
Autonomous vehicles face the same problem, made far worse by their inability to just take a wild guess and accept whatever consequences are in store. Right now they rely on three means of location: GPS, lidar, and camera image processing. The first leg of that tripod can be cut out by weather, overpasses, or the surrounding landscape; it’s also not accurate enough at highway speeds to ensure that nothing bad ever happens. The other two legs are rendered more or less useless by heavy rain, snow, or any kind of unexpected ground cover. Even a light blanket of autumn leaves can make it impossible for an autonomous vehicle to precisely locate itself at speed.
WaveSense thinks they have the missing piece of this puzzle, courtesy of some MIT-developed military tech to which they have the exclusive rights. We sat down with WaveSense CEO Tarik Bolat and CTO Byron Stanley to learn more.
The idea behind WaveSense is simple enough in concept: Imagine driving a steel tube ten feet deep into the ground and pulling out the resulting “core.” Geologists do this all the time to get a sense of historical rock formation and possibilities for drilling. Now imagine that you have a core for every square inch of the ground beneath you. No two cores would be exactly the same, because the distribution of pavement, rock, water, and soil varies greatly even over short distances. If you could scan an area of a few square feet, you would have a “fingerprint” of the ground over which you’re traveling. You could then compare that fingerprint to a database and figure out exactly where you are.
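To make the idea concrete, here’s a toy sketch of that fingerprint-matching step. This is not WaveSense’s actual algorithm (the company hasn’t published one); it’s a minimal illustration of the concept, assuming each stretch of road is stored as a numeric “core” and a fresh scan is matched against the map by normalized correlation:

```python
# Hypothetical illustration of subsurface fingerprint matching.
# The map layout, core format, and matching method are assumptions
# for the sake of the example, not WaveSense's real system.
import numpy as np

rng = np.random.default_rng(42)

# Pretend map: one radar "core" per inch along a 1,000-inch stretch
# of road, each core holding 64 depth samples.
road_map = rng.normal(size=(1000, 64))

def locate(scan: np.ndarray, road_map: np.ndarray) -> int:
    """Return the map index whose stored core best matches `scan`,
    using normalized correlation as the similarity score."""
    a = (scan - scan.mean()) / scan.std()
    m = road_map - road_map.mean(axis=1, keepdims=True)
    m /= road_map.std(axis=1, keepdims=True)
    scores = m @ a  # correlate the scan against every stored core
    return int(np.argmax(scores))

# A noisy re-scan of position 417 should still match position 417,
# just as a snow-covered road still has the same ground beneath it.
true_pos = 417
noisy_scan = road_map[true_pos] + rng.normal(scale=0.3, size=64)
print(locate(noisy_scan, road_map))
```

Because the stored cores are effectively unique, even a fairly noisy scan correlates far better with its true position than with any other spot on the map, which is the same property that lets the real system shrug off surface snow and water.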
WaveSense uses ground-penetrating radar to do precisely that. Their test vehicles are driving through the top ten urban markets and major cross-country freeways, getting the fingerprints of the ground beneath them. Those fingerprints are then shared with autonomous vehicles using the WaveSense system.
“The first few inches of pavement can be temporary, thanks to resurfacing and whatnot,” Byron Stanley noted. “But the several feet beneath that tend to be permanent.” Tests have shown that WaveSense can pinpoint its location within about one inch while traveling at 60 mph, even when there’s snow or water on the road.
“We won’t always have data—all-steel bridges with no asphalt surface, for example, aren’t responsive to this technique—but we view our technology as filling in the gaps that lidar and camera processing can’t cover,” Tarik Bolat said. As LJK Setright once said about the turbocharged engine and the automatic transmission, one can be at work when the other is not, and in the end you get complete coverage.
WaveSense has identified two primary markets for its product: Autonomous ride-shares in urban areas, and long-haul trucking. “We aren’t Google Maps,” Bolat says. “We won’t be driving down a dirt road in the middle of nowhere.” With that said, the original tests of WaveSense occurred in Afghanistan, where nine-ton Army trucks proved perfectly capable of driving themselves across that country’s damaged infrastructure using WaveSense tech for geolocation.
In the future, your Johnny Cab might use GPS for approximate location, WaveSense for precise location, and visual systems to look out for other vehicles and pedestrians. The same would be true of headless cross-country tractor-trailers that would be expected to operate in rain or shine.
We had a few questions about WaveSense:
How powerful, or dangerous, is the radiation? “About one-tenth of a cellphone signal,” Stanley says.
How large is the required equipment? “Current test models don’t present any kind of packaging problem for any application.”
How much data is involved? Is it stored on the cloud or a hard drive? “It’s about one-eighth as much data stream as a camera feed […] you can fit the whole thing, everything you would need for your operational area, on a hard drive in the vehicle.”
Who’s interested? “We are talking to major OEMs in every market. European and American automakers are showing the most interest.”
How long until it’s in production? “You’ll see the OEMs testing in force within a year. Actual production deployment is a couple of years away.”
What I like most about WaveSense, from a practical perspective, is its inherent resistance to sabotage. GPS can be blinded; visual sensors can be taped over, covered with dirt, or shown deceptive information. But the only way to fool WaveSense would be to dig under the road and change that “fingerprint.” Sure, there are all sorts of unsavory people out there who have acquired expertise in tunneling—but they’re not going to do it as a prank, the way that tomorrow’s hooligans might find joy in skating down a line of parked autonomous vehicles with a spray can of Rust-Oleum.
The cost and hassle of deploying WaveSense into future autonomous vehicles remain to be disclosed, but the company can afford to be a little reticent: They own exclusive rights to this technology for the lifetime of several recently granted patents. Is this the missing piece of the autonomous puzzle? At first glance, it certainly looks that way.