SHOW ME WHAT YOU MEAN: Gestures and drawings on physical objects as means for remote collaboration and guidance

Abstract

This thesis presents findings from a study of remote projected interaction and guidance on physical objects. First, the results draw on the literature and previous research in the fields of ubiquitous computing and environments, augmented reality, and remote collaboration and guidance. Second, the results are based on user tests of projector technology for remote interaction and guidance, conducted with the help of a prototype. Previous studies indicate that guidance on physical objects is seen as valuable and that, in such interaction, the focus should shift to the actual object. This thesis contributes to previous research and suggests better integration of hand gestures and drawings into remote collaboration and guidance. The projected interaction model described in this thesis enhances the feeling of togetherness between remote users (expert and novice) and provides critical help with conversational grounding in remote collaboration and guidance involving physical objects.

This paper was published in Aaltodoc Publication Archive.
