Generation and Rendering of Interactive Ground Vegetation for Real-Time Testing and Validation of Computer Vision Algorithms

Abstract

In recent years, the realistic rendering of natural outdoor sceneries has become increasingly important in many application areas. In the development of new algorithms for computer vision applications, testing and evaluation in real outdoor environments are time-consuming or, in many cases, hardly feasible. As a result, artificial testing environments are used, which differ from real-world environments, especially with regard to realistically reacting ground vegetation. Developers therefore try to simulate natural environments in real-time virtual reality applications, commonly known as Virtual Testbeds. Since the first basic use of Virtual Testbeds several years ago, the image quality of virtual environments has approached photorealism, even in real time, owing to new rendering approaches and the increasing processing power of current graphics hardware. Thus, Virtual Testbeds can now be applied in areas such as computer vision that strongly rely on realistic scene representations. In this article, we introduce a novel ground vegetation rendering approach that is capable of generating large sceneries with realistic appearance and excellent performance. Our approach features wind animation as well as object-to-grass interaction and delivers realistic-looking grass and shrubs at all distances and from all viewing angles. This greatly improves immersion and acceptance, especially in virtual training applications. At the same time, the rendered results fulfill important requirements for the computer vision aspect, such as a plausible geometric representation of the vegetation and its consistency throughout the entire simulation. Feature detection and matching algorithms are applied to our approach in localization scenarios of mobile robots in natural outdoor environments. We show how the quality of feature matching results is influenced by violating the static scene constraint in the Virtual Testbed, as observed in highly unstructured, real-world outdoor scenes with wind and object-to-vegetation interaction.
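
As an illustration of the evaluation described in the abstract, the following minimal sketch shows how a feature-matching comparison between a static and a wind-animated rendering of the same camera view could be set up. It uses ORB features and brute-force matching from OpenCV; the file names, feature count, and distance threshold are placeholder assumptions for illustration and are not taken from the paper's actual pipeline.

    import cv2

    # Hypothetical frame pair rendered from the same camera pose in the
    # Virtual Testbed: once with static vegetation, once with wind and
    # object-to-grass interaction enabled (file names are placeholders).
    img_static = cv2.imread("frame_static.png", cv2.IMREAD_GRAYSCALE)
    img_moving = cv2.imread("frame_wind.png", cv2.IMREAD_GRAYSCALE)

    # Detect ORB keypoints and compute binary descriptors in both frames.
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(img_static, None)
    kp2, des2 = orb.detectAndCompute(img_moving, None)

    # Brute-force Hamming matching with cross-check, followed by a simple
    # descriptor-distance filter; fewer surviving matches indicate a stronger
    # violation of the static scene assumption by the animated vegetation.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    good = [m for m in matches if m.distance < 40]

    print(f"keypoints: {len(kp1)} / {len(kp2)}, good matches: {len(good)}")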

This paper was published in the Directory of Open Access Journals (DOAJ).
