
High Dimensional Low Rank Plus Sparse Matrix Decomposition

Abstract

This paper addresses the problem of low-rank plus sparse matrix decomposition for big data. Conventional decomposition algorithms operate on the entire data matrix and solve optimization problems whose complexity scales with the dimension of the data, which limits their scalability. Furthermore, existing randomized approaches mostly rely on uniform random sampling, which is quite inefficient for many real-world data matrices that exhibit additional structure (e.g., clustering). In this paper, a scalable subspace-pursuit approach that transforms the decomposition problem into a subspace learning problem is proposed. The decomposition is carried out using a small data sketch formed from sampled columns/rows. Even when the data are sampled uniformly at random, it is shown that the sufficient number of sampled columns/rows is roughly O(rμ), where μ is the coherency parameter and r is the rank of the low-rank component. In addition, adaptive sampling algorithms are proposed to address the problem of sampling columns/rows from structured data. We provide an analysis of the proposed method with adaptive sampling and show that adaptive sampling makes the required number of sampled columns/rows invariant to the distribution of the data. The proposed approach is amenable to online implementation, and an online scheme is presented.
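The core idea described above, namely decomposing a small sketch of sampled columns to learn the column subspace of the low-rank component and then recovering the full low-rank part by projection, can be illustrated with a minimal sketch. Everything below (dimensions, rank, corruption level, and the plain SVD applied to the sketch) is an illustrative assumption, not the paper's algorithm; the actual method applies a robust decomposition to the sketch so the sparse component does not bias the learned subspace, and it supports adaptive sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic test data: low-rank L plus sparse S (all sizes are illustrative).
n, m, r = 500, 500, 5
L = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))
S = np.zeros((n, m))
mask = rng.random((n, m)) < 0.01             # ~1% sparse corruptions
S[mask] = 3.0 * rng.standard_normal(mask.sum())
D = L + S

# Step 1: uniform column sampling -- work on a small sketch, not all of D.
k = 50                                       # roughly O(r * mu) columns
cols = rng.choice(m, size=k, replace=False)
sketch = D[:, cols]

# Step 2: estimate the column subspace of L from the sketch.
# (Plain SVD here as a simplified stand-in for the paper's robust
#  decomposition of the sketch.)
U = np.linalg.svd(sketch, full_matrices=False)[0][:, :r]

# Step 3: recover the low-rank component by projecting D onto the learned
# subspace; the sparse component is the residual.
L_hat = U @ (U.T @ D)
S_hat = D - L_hat

rel_err = np.linalg.norm(L_hat - L) / np.linalg.norm(L)
print(f"relative error of L_hat: {rel_err:.3f}")
```

The point of the sketch step is that the expensive subspace computation touches only a 500 × 50 submatrix rather than the full 500 × 500 matrix; the full data appear only in the cheap final projection.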

University of Central Florida (UCF): STARS (Showcase of Text, Archives, Research & Scholarship)

Last updated: 18/10/2022
