Google, a company that spends billions of dollars on research and development, still hasn’t found a good reason for everyone to own a pair of smart glasses. But an international team of researchers is rethinking how upgraded glasses could be useful—by turning the wearer’s nose into a remote control for other devices.
Computer scientists from KAIST in South Korea, the University of St. Andrews in Scotland, the Georgia Institute of Technology in the United States, and Keio University in Japan took a unique approach to the smart glasses they designed. As detailed in a new paper, ItchyNose: Discreet Gesture Interaction using EOG Sensors in Smart Eyewear, being presented today at the International Symposium on Wearable Computers, instead of trying to turn the specs into a full wearable computer with an integrated screen, camera, and touch panel, they simply added a series of electrooculography (EOG) sensors to the bridge and nose pads of the glasses.
Those EOG sensors are designed to measure electrical signals in and around the eye, and have been used for diagnosing certain medical conditions, but also as a way to record eye movements in the special-effects industry, allowing an actor’s facial performance to be captured and applied to a CG character. The sensors are being used in a similar fashion here, but instead of recording eye movements, the EOG sensors in the smart eyewear are detecting movements of the wearer’s nose.
The smart glasses, which look no different from a regular pair of specs, can apparently distinguish between the wearer flicking, holding, or rubbing their nose with a finger, and those subtle movements can be translated into remote commands for other devices.
So imagine you’re sitting in a meeting and have to look like you’re paying rapt attention to your boss’s every word, but you’d rather be browsing your email on your laptop. A simple scratch of your nose, which wouldn’t look out of place, could let you jump between messages. The glasses could also be used to remotely operate basic functions on a smartphone, like adjusting the volume or sending predetermined text message responses.
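To make the idea concrete, here is a minimal sketch of how recognized nose gestures might be mapped to device commands. The gesture names (flick, hold, rub) come from the researchers' description, but the command names and the simple dispatch table are illustrative assumptions, not the authors' actual implementation:

```python
# Hypothetical mapping from classified nose gestures to device commands.
# The gesture labels (flick, hold, rub) are the ones the glasses are said
# to recognize; the commands on the right are made-up examples matching
# the use cases described above.
GESTURE_COMMANDS = {
    "flick": "next_message",      # e.g. jump to the next email
    "rub": "adjust_volume",       # e.g. raise or lower smartphone volume
    "hold": "send_canned_reply",  # e.g. send a predetermined text reply
}

def dispatch(gesture: str) -> str:
    """Translate a recognized gesture label into a device command.

    Unrecognized input (noise, an ordinary scratch) maps to "ignore".
    """
    return GESTURE_COMMANDS.get(gesture, "ignore")
```

The real system would sit behind a signal-processing and classification stage reading the EOG sensors; this sketch only shows the last step, where a discrete gesture label becomes an action on a paired device.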
The key aspect of these smart glasses is that the gestures don’t necessarily look like you’re interacting with another device. The constant rubbing and flicking might make others wonder if you’ve got a substance abuse problem, but they certainly won’t be able to tell you’re just browsing Tinder. And this might be the direction we need to take with smart wearables. They don’t have to duplicate everything our smartphones and computers can do, but instead serve as an additional way to interact with the devices we can’t live without.
[ItchyNose: Discreet Gesture Interaction using EOG Sensors in Smart Eyewear]