Smartwatches, with a few tweaks, can detect a surprising number of things your hands are doing, from washing the dishes to chopping vegetables or petting a dog, say researchers from Carnegie Mellon University.
By making a few changes to the smartwatch’s operating system, they were able to use its accelerometer to recognise the hand motions, and in some cases the bio-acoustic sounds, associated with 25 different hand activities, at around 95 percent accuracy.
Those 25 activities (including typing on a keyboard, washing dishes, petting a dog, pouring from a pitcher or cutting with scissors) are just the beginning of what might be possible to detect, the researchers said.
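The paper itself does not publish its code, but the idea of recognising activities from windows of wrist accelerometer data can be sketched in a few lines. The snippet below is an illustrative toy, not the researchers' method: it invents two synthetic "activities" distinguished by vibration frequency, extracts simple per-axis features (mean, standard deviation, dominant frequency bin), and classifies new windows by nearest centroid. All function names and parameters here are assumptions for illustration.

```python
# Toy sketch of window-based hand-activity recognition from a 3-axis
# accelerometer. Illustrative only -- not the CMU system, which used
# richer features, bio-acoustic signals and 25 real activity classes.
import numpy as np

rng = np.random.default_rng(0)

def featurize(window):
    """Summarise a (samples, 3) accelerometer window: per-axis mean,
    standard deviation, and dominant (non-DC) frequency bin."""
    mean = window.mean(axis=0)
    std = window.std(axis=0)
    spec = np.abs(np.fft.rfft(window, axis=0))
    dominant = spec[1:].argmax(axis=0) + 1  # skip the DC component
    return np.concatenate([mean, std, dominant])

def make_windows(freq, n=20, samples=100):
    """Synthetic 3-axis windows dominated by one vibration frequency."""
    t = np.linspace(0, 1, samples, endpoint=False)
    base = np.sin(2 * np.pi * freq * t)[:, None]
    return [base + 0.1 * rng.standard_normal((samples, 3)) for _ in range(n)]

# Two toy "activities": slow motion (e.g. petting) vs fast (e.g. chopping).
slow = [featurize(w) for w in make_windows(freq=2)]
fast = [featurize(w) for w in make_windows(freq=15)]

# Nearest-centroid classifier over the feature vectors.
centroids = {"slow": np.mean(slow, axis=0), "fast": np.mean(fast, axis=0)}

def classify(window):
    f = featurize(window)
    return min(centroids, key=lambda k: np.linalg.norm(f - centroids[k]))

print(classify(make_windows(freq=2, n=1)[0]))
```

A real system replaces the hand-rolled centroids with a trained classifier over many labelled windows, which is why the researchers' 1,000-hour labelled dataset matters.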
“We envision smartwatches as a unique beachhead on the body for capturing rich, everyday activities,” said Chris Harrison, Assistant Professor in the Human-Computer Interaction Institute (HCII) at Carnegie Mellon University.
“A wide variety of apps could be made smarter and more context-sensitive if our devices knew the activity of our bodies and hands,” he added.
Just as smartphones now can block text messages while a user is driving, future devices that sense hand activity might learn not to interrupt someone while they are doing certain work with their hands.
Sensing hand activity also lends itself to health-related apps — monitoring activities such as brushing teeth, washing hands or smoking a cigarette.
“Hand-sensing also might be used by apps that provide feedback to users who are learning a new skill, such as playing a musical instrument, or undergoing physical rehabilitation,” the study noted.
Apps might alert users to typing habits that could lead to repetitive strain injury (RSI), or assess the onset of motor impairments such as those associated with Parkinson’s disease.
Harrison and his team began their exploration of hand activity detection by recruiting 50 people to wear specially programmed smartwatches for almost 1,000 hours while going about their daily activities.
More than 80 hand activities were labeled in this way, providing a unique dataset.
For now, users must wear the smartwatch on their active arm, rather than the passive (non-dominant) arm where people typically wear wristwatches, for the system to work.
Future experiments will explore what events can be detected using the passive arm.
Harrison and HCII PhD student Gierad Laput presented the findings at CHI 2019, the Association for Computing Machinery’s Conference on Human Factors in Computing Systems, in Glasgow, Scotland.