
A Debiasing Variational Autoencoder for Deforestation Mapping

Abstract

Deep Learning (DL) algorithms provide numerous benefits in different applications, and they usually yield successful results in scenarios with enough labeled training data and similar class proportions. However, the labeling procedure is a costly and time-consuming task. Furthermore, numerous real-world classification problems present a high level of class imbalance, as the numbers of samples from the classes of interest differ significantly. Such conditions tend to promote the creation of biased systems, which negatively impacts their performance. Designing unbiased systems has been an active research topic, and recently some DL-based techniques have demonstrated encouraging results in that regard. In this work, we introduce an extension of the Debiasing Variational Autoencoder (DB-VAE) for semantic segmentation. The approach is based on an end-to-end DL scheme and employs the learned latent variables to adjust the individual sampling probabilities of data points during the training process. For that purpose, we adapted the original DB-VAE architecture for dense labeling in the context of deforestation mapping. Experiments were carried out on a region of the Brazilian Amazon, using Sentinel-2 data and the deforestation map from the PRODES project. The reported results show that the proposed DB-VAE approach learns to identify under-represented samples and selects them more frequently in the training batches, consequently delivering superior classification metrics.
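The abstract does not spell out the resampling step, but the mechanism it refers to, using the learned latent variables to adjust per-sample selection probabilities, is commonly realized by estimating each sample's density in latent space and weighting rare samples more heavily, as in the original DB-VAE formulation. Below is a minimal, hypothetical Python sketch of that step using a histogram-based density estimate; the function name and the num_bins and alpha parameters are illustrative, not taken from this paper.

import numpy as np

def debias_sampling_probabilities(latent_means, num_bins=10, alpha=0.01):
    """Map latent-space density to per-sample selection probabilities.

    latent_means: (N, D) array of latent means from the VAE encoder.
    alpha: smoothing constant that caps how strongly rare samples are boosted.
    """
    z = np.asarray(latent_means)
    n, d = z.shape
    weights = np.ones(n)
    for j in range(d):
        # Histogram-based density estimate along one latent dimension.
        hist, edges = np.histogram(z[:, j], bins=num_bins, density=True)
        idx = np.clip(np.digitize(z[:, j], edges) - 1, 0, num_bins - 1)
        # Samples falling in sparsely populated bins receive larger weights.
        weights *= 1.0 / (hist[idx] + alpha)
    # Normalize so the weights form a valid sampling distribution.
    return weights / weights.sum()

In a PyTorch training pipeline, the resulting probabilities could then drive batch construction through a weighted sampler (variable names below are again illustrative):

import torch

probs = debias_sampling_probabilities(encoder_means)  # (N,) probabilities
sampler = torch.utils.data.WeightedRandomSampler(
    weights=probs, num_samples=len(probs), replacement=True)
loader = torch.utils.data.DataLoader(train_set, batch_size=16, sampler=sampler)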

Full text: Institutionelles Repositorium der Leibniz Universität Hannover

Last updated on 20/08/2023
