The touchscreen that doesn't require... touching?
Studies have shown that coronavirus can remain on plastic and glass for anywhere between two hours and a week, meaning touchscreens used in public places have to be constantly disinfected to stop transmission. To tackle that, researchers from Cambridge University have developed a ‘no-touch touchscreen’ that uses AI to predict a user’s intention before their hand reaches the display. The screen was originally designed for use in cars, but the engineers who built it say it could have widespread applications during a global outbreak.
The ‘predictive touch’ technology can be retrofitted to existing displays and could be used to prevent the spread of pathogens on touchscreens at supermarket check-outs, ATMs and ticket terminals at railway stations. “Touchscreens and other interactive displays are something most people use multiple times per day, but they can be difficult to use while in motion, whether that’s driving a car or changing the music on your phone while you’re running,” said Simon Godsill from the university’s department of engineering.
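The core idea behind predictive touch is inferring which on-screen control a user is reaching for before their finger arrives. As a rough illustration only, the sketch below extrapolates a hand's recent trajectory to the screen plane and picks the nearest button; the function and target names are hypothetical, and the actual Cambridge system uses far more sophisticated probabilistic intent inference rather than simple linear extrapolation.

```python
# Illustrative sketch of intent prediction for a "no-touch" screen.
# All names here are invented for the example; the real system uses
# Bayesian intent inference, not plain linear extrapolation.

def predict_target(samples, targets):
    """samples: list of (x, y, z) finger positions over time, where z is
    the distance to the screen; targets: dict of name -> (x, y) centres.
    Returns the name of the target nearest the extrapolated touch point,
    or None if the hand is not approaching the screen."""
    (x0, y0, z0), (x1, y1, z1) = samples[-2], samples[-1]
    dz = z0 - z1                      # how much closer the hand got
    if dz <= 0:
        return None                   # hand is stationary or receding
    t = z1 / dz                       # steps until the hand reaches z = 0
    hit = (x1 + (x1 - x0) * t, y1 + (y1 - y0) * t)
    return min(targets, key=lambda name:
               (targets[name][0] - hit[0]) ** 2 +
               (targets[name][1] - hit[1]) ** 2)

buttons = {"pay": (10.0, 5.0), "cancel": (2.0, 5.0)}
path = [(4.0, 5.0, 6.0), (6.0, 5.0, 4.0)]  # moving right, toward screen
print(predict_target(path, buttons))       # prints "pay"
```

Because the selection fires before contact, a scheme like this would let a kiosk register a choice without the user ever touching the glass.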
Other touch-free technologies include gesture control, which can be found on the latest generation of Google and Samsung smartphones, as well as on some smart TVs. Speaking of Google, the company recently received approval from US regulators to deploy a futuristic technology that enables smart devices to be controlled by hand gestures alone. The Federal Communications Commission (FCC) approved Project Soli, which was first announced by Google four years ago, saying that it would "serve the public interest" to press ahead with the development of the technology. The interactive system uses radar-based motion sensors to detect and track hand movements with millimetre accuracy. This allows people to interact with devices without the need for any kind of physical controls.
"Even though these controls are virtual, the interactions feel physical and responsive," the Project Soli website states. "Feedback is generated by the tactile sensation of fingers touching one another. Without the constraints of physical controls, these virtual tools can take on the fluidity and precision of our natural human hand motion." The idea is that people will be able to control everything from TVs to smartwatches in an intuitive, touch-free way. Familiar gestures, such as pushing a button or turning a volume knob, effectively turn a person's hand into a universal remote.
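To make the "virtual volume knob" idea concrete, here is a minimal sketch, with entirely invented function names, of how a stream of tracked fingertip positions could be turned into a volume change: accumulate the signed angle swept around the knob's centre and convert it into volume steps. This is an assumption about how such a mapping might work, not Soli's actual API.

```python
import math

# Hypothetical sketch: map a "turn the knob" gesture to a volume level.
# Fingertip positions come from a motion sensor; names are illustrative.

def knob_delta(p0, p1, centre=(0.0, 0.0)):
    """Signed rotation in degrees from fingertip p0 to p1 around centre."""
    a0 = math.atan2(p0[1] - centre[1], p0[0] - centre[0])
    a1 = math.atan2(p1[1] - centre[1], p1[0] - centre[0])
    d = math.degrees(a1 - a0)
    return (d + 180.0) % 360.0 - 180.0   # wrap into [-180, 180)

def apply_rotation(volume, points, degrees_per_step=30.0):
    """Sum the rotation along a fingertip trace and nudge the volume by
    one unit per `degrees_per_step` of turn, clamped to 0-100."""
    total = sum(knob_delta(a, b) for a, b in zip(points, points[1:]))
    return max(0, min(100, volume + round(total / degrees_per_step)))

# A fingertip tracing a quarter turn counter-clockwise around the origin:
trace = [(1.0, 0.0), (0.7071, 0.7071), (0.0, 1.0)]
print(apply_rotation(50, trace))   # 90 degrees of turn -> volume 53
```

Clamping and per-step quantisation are what make a continuous gesture feel like a detented physical knob.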
Project Soli founder Ivan Poupyrev explained his motivation behind the endeavour in a video posted in 2015. "Capturing the possibilities of the human hand was one of my passions," he said. "How could we take this incredible capability, the finesse of human actions, and apply it to the virtual world?" He explained that the technology can fit onto a single chip measuring 8mm by 10mm, allowing it to be built into a wide variety of devices. "What is really exciting is that you can shrink the entire radar and put it into a tiny chip," he said. "That's what makes this approach so promising. It's extremely reliable, there's nothing to break. There are no moving parts, there are no lenses, there is nothing; it's like a piece of sand on your board."
Haptic feedback technology also offers a way to interact with digital devices and environments, though it still needs development and has yet to see broad commercial use. “Our technology has numerous advantages over more basic mid-air interaction techniques or conventional gesture recognition because it supports intuitive interactions with legacy interface designs and doesn’t require any learning on the part of the user,” said Dr Bashar Ahmad, who led the development of the touchless screen. “It fundamentally relies on the system to predict what the user intends and can be incorporated into both new and existing touchscreens and other interactive display technologies.”
Thumbnail credit: Jaguar-LandRover