Integrating Neuromuscular and Touchscreen Input for Machine Control

Abstract

Current touchscreen interfaces are unable to distinguish between individual fingers or to determine the pose of the user's hand, which limits how much information a touchscreen can recover from user input. As discussed herein, a statistical model can be trained using training data that includes sensor readings known to be associated with various hand poses and gestures. The trained statistical model can be configured to determine arm, hand, and/or finger configurations and forces (e.g., handstates) based on sensor readings obtained via a wearable device, such as a wristband with neuromuscular sensors. The statistical model can identify the input from the handstate detected by the wearable device. For example, the handstates can include an identification of the portion of the hand that is interacting with the touchscreen, a user's finger position relative to the touchscreen, an identification of which finger or fingers of the user's hand are interacting with the touchscreen, etc. The handstates can be used to control any aspect(s) of the touchscreen, or of a connected device indirectly through the touchscreen.
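The disclosure describes the approach at a high level and does not include code. As a rough illustration only, the sketch below shows one way such a pipeline could be assembled: a classifier trained on windows of wearable sensor readings labeled with the finger known to be touching the screen, then queried at touch-down time to attach a finger identity to the touch event. All names, array sizes, and features here are assumptions for illustration, not details from the disclosure.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def featurize(emg_window: np.ndarray) -> np.ndarray:
    """Collapse a (samples x channels) sensor window into simple per-channel
    features (mean absolute value and variance) - a common starting point
    for neuromuscular gesture models; the real feature set is an assumption."""
    return np.concatenate([np.abs(emg_window).mean(axis=0),
                           emg_window.var(axis=0)])

# --- Training phase: windows of wristband sensor data, each labeled with the
# --- finger known to be touching the screen when the window was recorded.
rng = np.random.default_rng(0)
n_windows, n_samples, n_channels = 500, 50, 8            # placeholder sizes
raw_windows = rng.normal(size=(n_windows, n_samples, n_channels))  # placeholder data
labels = rng.integers(0, len(FINGERS), size=n_windows)             # placeholder labels

X = np.stack([featurize(w) for w in raw_windows])
model = RandomForestClassifier(n_estimators=100).fit(X, labels)

# --- Inference phase: when the touchscreen reports a touch-down event,
# --- classify the most recent sensor window to decide which finger produced it.
def identify_touch(latest_emg_window: np.ndarray, touch_xy: tuple) -> dict:
    finger = FINGERS[int(model.predict(featurize(latest_emg_window)[None, :])[0])]
    # The (finger, position) pair can then be routed to different controls on
    # the touchscreen or on a device controlled through it.
    return {"finger": finger, "position": touch_xy}

print(identify_touch(rng.normal(size=(n_samples, n_channels)), (120, 340)))

In practice the handstate would come from a continuously running inference loop over the wearable's sensor stream rather than a single window per touch, but the basic pairing of a touch event with the concurrently inferred handstate is the same.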

This paper was published in Technical Disclosure Commons.
