Imagine if you could slide your finger up and down the side of your AirPods to control the volume. Or if a few taps on the aluminum shell of your toaster could set the perfect time and temperature for a toasted bagel.
This is the future envisioned by UltraSense, a U.S. startup developing a new kind of touch sensor. UltraSense believes the sensor can drastically improve the way we interact with the world, from car door handles to virtual “buttons” on a phone to smart home appliances.
That’s because the tiny sensor, smaller than a ballpoint pen tip, can turn nearly anything into a touchscreen, including surfaces that aren’t traditionally suited to the technology: metal, glass, wood, plastic, and ceramic.
Called “TouchPoint,” the sensor takes a new approach to touch sensing, using 3D ultrasound to measure how tiny sound waves move through a given surface. A finger pressing the surface changes how those waves propagate, and by reading those changes the sensor can detect the touch and discern between different kinds of touches, opening up the possibility of multifunctional gestures. (Think of a panel on the back of your phone that lets you slide your fingers to zoom in or out, or to focus and snap a picture, all with one hand.)
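To make the idea concrete, here is a deliberately simplified sketch of how touch readings could be turned into distinct gestures. UltraSense hasn't published how TouchPoint actually classifies touches, so the data format, thresholds, and function names below are invented purely for illustration: each reading is an (amplitude, position) pair, where amplitude stands in for how strongly a finger damps the ultrasound signal and position for which sensing zone it landed in.

```python
# Toy gesture classifier -- illustrative only, not UltraSense's method.
# A "sample" is (amplitude, position): amplitude rises when a finger damps
# the surface; position is the index of the sensing zone that fired.

def classify_gesture(samples, press_threshold=0.5):
    """Classify a short burst of readings as none, tap, hold, or slide."""
    # Keep only readings strong enough to count as actual contact.
    pressed = [(amp, pos) for amp, pos in samples if amp >= press_threshold]
    if not pressed:
        return "none"
    positions = [pos for _, pos in pressed]
    # Contact that travels across zones reads as a slide.
    if max(positions) - min(positions) > 1:
        return "slide"
    # Brief contact in one zone is a tap; sustained contact is a hold.
    if len(pressed) <= 3:
        return "tap"
    return "hold"

print(classify_gesture([(0.9, 2), (0.8, 2)]))                    # tap
print(classify_gesture([(0.7, 0), (0.8, 1), (0.9, 2), (0.8, 3)]))  # slide
```

The point of the sketch is simply that once a sensor can report both contact strength and contact location over time, distinguishing taps, holds, and slides becomes a small classification problem.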
Mass adoption of touch-sensing tech, whether it comes from UltraSense or another company working on novel approaches, could mean we'll no longer need mechanical switches or buttons to power our phones on and off or to change the temperature of our refrigerator. That means smart devices can get way smaller and potentially open up new capabilities altogether.
“We have seen a shift in the way we interact with our devices, where digital has replaced mechanical, and the move to virtual buttons and surface gestures is accelerating,” Mo Maghsoudnia, founder and CEO of UltraSense, said in a press statement. “The use of ultrasound in touch user interfaces has not been implemented in such a novel way until now.”
The sensors use very little power and don’t put any additional computing strain onto a system’s main processor, according to UltraSense. And even if you’re wearing gloves or a surface is covered, your touches are still detected.
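One plausible way to read that claim is that the sensor classifies gestures on its own silicon and hands the host only a tiny, pre-decoded event, so the main processor never touches raw ultrasound data. The sketch below illustrates that division of labor; the event codes, the `SensorStub` class, and the interface are all hypothetical, not UltraSense's actual protocol.

```python
# Hypothetical host-side event loop -- the sensor does the heavy signal
# processing and reports only small event codes, so the host's work is trivial.

EVENTS = {0x01: "tap", 0x02: "double_tap", 0x03: "slide_up", 0x04: "slide_down"}

class SensorStub:
    """Stands in for the sensor hardware; emits pre-decoded event codes."""
    def __init__(self, codes):
        self._codes = list(codes)

    def read_event(self):
        # Return the next pending event code, or None when the queue is empty.
        return self._codes.pop(0) if self._codes else None

def handle_events(sensor):
    """Drain pending events and map each code to a gesture name."""
    handled = []
    while (code := sensor.read_event()) is not None:
        handled.append(EVENTS.get(code, "unknown"))
    return handled

print(handle_events(SensorStub([0x01, 0x03])))  # ['tap', 'slide_up']
```

In a real system the host would typically sleep until an interrupt line signals a pending event, which is where the power savings would come from.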
UltraSense isn’t the only player in this space. In 2017, Google acquired Redux, a U.K.-based company that uses vibrations to turn device screens into speakers and haptic feedback to mimic the feel of buttons, sliders, and dials. Chicago-based Tanvas, meanwhile, is using electrostatics to create new dimensions of touch sensing, like the ability to feel the edges of keys or the swipe of a turned page on a Kindle. The Tanvas platform also takes advantage of ultrasonic sound waves, which modulate the friction of the screen under a user's finger to deliver feedback.