Interaction with a device is the foundation for any user experience. With the Apple Watch, user interaction usually comes in two flavors: physical touch or Siri. But Apple is developing alternatives to the typical user interaction methods.
One such alternative is thoroughly laid out in a patent that was recently awarded to Apple.
Of course, patents are not always good indicators of Apple’s future plans. Plenty of technology described in Apple patents never makes it into a final product, and there’s no way to know when, if ever, such technology will show up in a user-facing device.
Despite that, the possibilities of the patent are extremely interesting.
The patent in question is titled “motion and gesture input from a wearable device.” The application was first filed in February 2015 and published by the U.S. Patent & Trademark Office in March 2016. It credits a handful of inventors, likely Apple engineers and other staff.
It received some media coverage at the time, but just last week, the patent was finally approved and granted to Apple. Basically, Apple now owns the intellectual property (the invention) described in the patent.
That IP lays out a way to control an Apple Watch and activate or dictate certain functions through hand motions or gestures.
On the surface, that may sound like a simple concept. But the way the system works, and its possible applications, are actually fairly complicated and intriguing.
Of course, it’s written in the dense, technical language that most patents are written in. But that doesn’t mean we can’t glean some interesting details from it.
How it Works
Like many other Apple products, the platform would rely on a suite of sensors. These might include optical, inertial, or mechanical contact sensors. But the suite could also include components that detect subtle muscle movements and electrical signals within the body. These are called myoelectric sensors.
That last type is arguably the most interesting. The system could read a user’s tendon movements or muscular electrical signals to determine what motion a hand is making, or what gesture it’s forming. The brain sends electrical signals to the fingers, wrists, and hands whenever they move. The sensor would pick up those signals, or the resulting muscle movements, and translate them into usable commands.
Once the device detects a motion or gesture, it can then analyze and interpret it. Depending on which gesture was performed, the wearable will carry out the desired operation. Presumably, these operations would be tied to predetermined gestures. There might also be an opportunity to customize them.
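The detect-analyze-dispatch flow described above can be sketched in code. This is a purely illustrative mock, not Apple’s implementation: the gesture names, the amplitude thresholds, and the idea of classifying by peak signal strength are all assumptions made for the example.

```python
# Hypothetical sketch of the patent's pipeline: read sensor samples,
# classify them as a gesture, then look up the predetermined operation.
# All names and thresholds here are illustrative, not from the patent.

def classify_gesture(samples):
    """Map a window of (hypothetical) myoelectric readings to a gesture
    label using a crude peak-amplitude rule."""
    peak = max(abs(s) for s in samples)
    if peak < 0.2:
        return None               # below the noise floor: no intentional movement
    if peak < 0.6:
        return "slow_hand_down"   # gentle motion, e.g. volume control
    return "fast_hand_down"       # sharp motion, e.g. mute

# Predetermined gesture -> operation table; per the patent, these mappings
# could conceivably be customized by the user.
ACTIONS = {
    "slow_hand_down": "lower volume",
    "fast_hand_down": "mute volume",
}

def handle(samples):
    """Run one window of samples through the full pipeline."""
    gesture = classify_gesture(samples)
    return ACTIONS.get(gesture, "no action")
```

A real system would replace the peak-amplitude rule with proper signal processing and a trained classifier, but the overall shape of detect, interpret, dispatch stays the same.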
Because of the complexity of the sensors and the gestures or movements they can detect, this is a far-ranging patent.
In its simplest form, the technology could allow a user to perform basic functionality on their Apple Watch. The patent lays out a few possible examples.
- Placing your hand palm-down and pausing could automatically decline a phone call.
- Moving a hand up or down at a constant, slow speed could control music volume.
- A faster hand-down motion could mute the volume entirely.
- A hand-wave motion could allow a user to easily scroll through a website or skip to the next page in an ebook.
- Extending a thumb and little finger, while keeping other fingers fixed, could allow a user to make a phone call.
- Making a hand motion toward yourself could activate a ringer, allowing users to easily find their device (similar to the Find My iPhone functionality).
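The examples above amount to a lookup from recognized gestures to watch operations, which can be rendered as a simple dispatch table. The gesture identifiers and action strings below paraphrase the patent’s examples; how each gesture is actually detected is out of scope here.

```python
# The patent's example gestures as a gesture -> action dispatch table.
# Identifiers are invented for illustration; actions paraphrase the text.
GESTURE_ACTIONS = {
    "palm_down_pause":        "decline incoming call",
    "slow_hand_up":           "raise volume",
    "slow_hand_down":         "lower volume",
    "fast_hand_down":         "mute volume",
    "hand_wave":              "scroll website or turn ebook page",
    "thumb_and_pinky_extend": "start phone call",
    "motion_toward_self":     "ring device to locate it",
}

def dispatch(gesture):
    """Return the operation for a recognized gesture, if any."""
    return GESTURE_ACTIONS.get(gesture, "unrecognized gesture")
```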
But there are other capabilities that could come about as a result of the patented technology. For one, the patent describes how a user could unlock their car door with a gesture (provided that the car is synced to their Apple Watch or iPhone, presumably).
Apple’s ambitions don’t end there. One portion of the patent details how the system could detect a user signing in sign language. The signs could then be analyzed and converted to spoken words or transcribed as written text.
Apple’s Other Gesture-Based Systems
The Apple Watch isn’t the only device that could benefit from the addition of gesture or movement-based controls in the future.
A recent Bloomberg report revealed that Apple is developing “touchless” gesture-based commands for an upcoming iPhone. These gestures, true to their name, would not require physical touch to be used.
Users could hover a hand over an iPhone’s screen and perform a gesture to activate certain functionality. Similar features typically rely on camera-based motion sensing, but Apple’s system could instead use a sensor embedded within the smartphone’s display.
Additionally, Apple has filed earlier patents covering hand-motion controls for the Mac and even for carrying out certain functions inside a driverless car.
Presumably, these systems rely on optical-based sensors that can analyze hand movements.
But the fact that Apple is developing a system that could “read” gestures through advanced muscular and bodily electrical signals is proof enough that its ambitions stretch further than replacing the Home button.
Mike is a freelance journalist from San Diego, California.
While he primarily covers Apple and consumer technology, he has past experience writing about public safety, local government, and education for a variety of publications.
He’s worn quite a few hats in the journalism field, including writer, editor, and news designer.