Fast human motion prediction for human-robot collaboration with wearable interface

Abstract

In this paper, we propose a novel human-robot interface capable of anticipating the user's intention while performing reaching movements on a workbench, in order to plan the action of a collaborative robot. The system integrates two levels of prediction: motion intention prediction, to detect movement onset and offset, and motion direction prediction, based on a Gaussian Mixture Model (GMM) trained on IMU and EMG data following an evidence accumulation approach. Novel dynamic stopping criteria are proposed to flexibly adjust the trade-off between early anticipation and accuracy. Results show that our system outperforms previous methods, achieving a real-time classification accuracy of 94.3±2.9% after 160.0±80.0 ms from movement onset. The proposed interface can find many applications in the Industry 4.0 framework, where it is crucial for autonomous and collaborative robots to understand human movements as soon as possible to avoid accidents and injuries.
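The paper's implementation is not reproduced here; as a minimal illustrative sketch only, the snippet below shows one plausible way to combine per-direction Gaussian Mixture Models with evidence accumulation and a confidence-based dynamic stopping rule, in the spirit of the approach described in the abstract. The number of directions, mixture components, thresholds, and all function names (train_direction_models, classify_with_accumulation) are assumptions for illustration, not the authors' code, and feature extraction from the IMU/EMG streams is omitted.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Assumed hyperparameters (illustrative only, not from the paper).
N_COMPONENTS = 3          # mixture components per direction model
CONF_THRESHOLD = 0.95     # dynamic-stopping posterior threshold
MAX_WINDOWS = 20          # cap on accumulated windows (latency budget)


def train_direction_models(features_per_class):
    """Fit one GMM per reaching direction on its training feature windows.

    features_per_class: list of arrays, each (n_windows, n_features),
    holding fused IMU + EMG feature windows for one target direction.
    """
    models = []
    for X in features_per_class:
        gmm = GaussianMixture(n_components=N_COMPONENTS, covariance_type="full")
        gmm.fit(X)
        models.append(gmm)
    return models


def classify_with_accumulation(models, window_stream):
    """Accumulate per-class log-likelihood over streaming windows and stop
    early once the posterior of the best direction exceeds the threshold.

    Returns (predicted_direction, confidence, n_windows_used).
    """
    log_evidence = np.zeros(len(models))
    post = np.full(len(models), 1.0 / len(models))  # uniform prior
    t = 0
    for t, window in enumerate(window_stream, start=1):
        x = np.asarray(window).reshape(1, -1)
        # Add this window's log-likelihood under each direction model.
        log_evidence += np.array([m.score_samples(x)[0] for m in models])
        # Normalised posterior over directions (uniform prior assumed).
        post = np.exp(log_evidence - log_evidence.max())
        post /= post.sum()
        # Dynamic stopping: answer as soon as we are confident enough,
        # or when the latency budget is exhausted.
        if post.max() >= CONF_THRESHOLD or t >= MAX_WINDOWS:
            break
    return int(post.argmax()), float(post.max()), t
```

In such a scheme, lowering CONF_THRESHOLD (or MAX_WINDOWS) would trade classification accuracy for earlier anticipation, which is the trade-off the dynamic stopping criteria in the paper are designed to adjust.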
