
Mobile Robotic Painting of Texture

Abstract

Robotic painting is well-established in controlled factory environments, but there is now potential for mobile robots to perform functional painting tasks in the everyday world. An obvious first target for such robots is painting a uniform single color. A step further is the painting of textured images. Texture involves a varying appearance and requires that paint be delivered accurately onto the physical surface to produce the desired effect. Robotic painting of texture is relevant in architecture and themed environments. A key challenge for robotic painting of texture is to take a desired image as input and generate paint commands that reproduce the desired appearance as closely as possible, within the robot's capabilities. This paper describes a deep learning approach that takes an input ink map of a desired texture and infers robotic paint commands to produce that texture. We analyze the trade-offs between quality of reconstructed appearance and ease of execution. Our method is general across different kinds of robotic paint delivery systems, but the emphasis here is on spray painting. More generally, the framework can be viewed as an approach for solving a specific class of inverse imaging problems.
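The paper does not include code; the following is a minimal sketch of the kind of image-to-command mapping the abstract describes, assuming the ink map is a single-channel image and the output is a per-location spray command map (here, two hypothetical channels: nozzle activation and flow rate). The architecture, layer sizes, and command encoding are illustrative assumptions, not the authors' model.

```python
# Hypothetical illustration only: a small convolutional encoder-decoder that
# maps an ink map to per-location spray commands. All sizes and the
# two-channel command encoding are assumptions for this sketch.
import torch
import torch.nn as nn

class InkMapToSprayCommands(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: downsample the ink map into a compact feature representation.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: upsample back to the ink-map resolution, predicting a
        # spray-activation channel and a flow-rate channel per location.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 2, kernel_size=4, stride=2, padding=1),
            nn.Sigmoid(),  # both command channels normalized to [0, 1]
        )

    def forward(self, ink_map: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(ink_map))

# Usage: a batch of one 128x128 ink map -> a (1, 2, 128, 128) command map.
model = InkMapToSprayCommands()
commands = model(torch.rand(1, 1, 128, 128))
print(commands.shape)
```

In a setup like this, such a network would be trained against a renderer or simulator of the paint delivery process, so that the predicted commands, when executed, reproduce the target texture; that is the sense in which the abstract frames the task as an inverse imaging problem.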

Full text: Infoscience - École polytechnique fédérale de Lausanne
