
SmartEAR: Smartwatch-based Unsupervised Learning for Multi-modal Signal Analysis in Opportunistic Sensing Framework

Abstract

Wrist-bands such as smartwatches have become an unobtrusive interface for collecting physiological and contextual data from users. Smartwatches are being used for smart healthcare, telecare, and wellness monitoring. In this paper, we used data collected through the AnEAR framework, which leverages smartwatches to gather and store physiological data from patients in naturalistic settings. This data included temperature, galvanic skin response (GSR), acceleration, and heart rate (HR). In particular, we focused on HR and acceleration, as these two modalities are often correlated. Since the data was unlabeled, we relied on unsupervised learning for multi-modal signal analysis. We propose using k-means clustering, Gaussian mixture model (GMM) clustering, and neural-network-based self-organizing maps (SOMs) to group the multi-modal data into homogeneous clusters. This strategy helped in discovering latent structures in our data.
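
As a rough illustration of the clustering step described in the abstract, the sketch below groups two synthetic features (mean heart rate and mean acceleration magnitude per window) with k-means and a GMM using scikit-learn. The feature choice, window definition, cluster count, and data are assumptions for illustration only, not the authors' pipeline; a self-organizing map could be applied to the same standardized features.

# Minimal sketch (not the authors' code): clustering multi-modal smartwatch
# features with k-means and a Gaussian mixture model via scikit-learn.
# The features and synthetic data below are illustrative assumptions.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Stand-in for windowed features extracted from smartwatch recordings:
# mean heart rate (bpm) and mean acceleration magnitude (g) per window.
hr = np.concatenate([rng.normal(65, 5, 200), rng.normal(110, 10, 200)])
acc = np.concatenate([rng.normal(1.0, 0.05, 200), rng.normal(1.6, 0.2, 200)])
X = np.column_stack([hr, acc])

# Standardize so both modalities contribute comparably to the distance metric.
X_std = StandardScaler().fit_transform(X)

# k-means: hard assignment of each window to one of k homogeneous clusters.
kmeans_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_std)

# GMM: probabilistic (soft) assignment, useful when clusters overlap.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X_std)
gmm_labels = gmm.predict(X_std)

print("k-means cluster sizes:", np.bincount(kmeans_labels))
print("GMM cluster sizes:    ", np.bincount(gmm_labels))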

This paper was published in DigitalCommons@URI.
