Sparsity and Manifold Methods in Image and Higher Dimensional Data Representation

Thesis, posted on 2018-11-27, authored by Lingdao Sha
One of the most important concepts underlying recent developments in image and signal processing is sparsity: in most applications, the signals or images of interest can be well approximated by a linear combination of a few active elements of a dictionary. The dictionary, or basis, is an overcomplete set of elements used to represent the internal structure of the signals. Exploiting sparsity can simplify the processing, storage, and representation of signals and images. However, obtaining a good sparse representation, and recovering signals efficiently from sparse data, can be very difficult, so great effort has been devoted to finding sparse representations and recovery algorithms suited to different applications. This thesis is dedicated to the study of sparse representation in image processing (denoising, deblurring), image representation (clustering, classification), tensor compressed sensing, and medical image processing.

We first present a graph Laplacian regularized sparse coding method for image representation and restoration. Most existing sparse coding approaches fail to account for the fact that high-dimensional data naturally reside on a geometric structure within the data space. In this thesis, we propose a generalized framework for image restoration and representation that combines sparse coding with graph Laplacian regularization, and we show that adding this structural information as a regularization term boosts the performance of sparse representation in image processing and representation tasks.

We then present a Kronecker least-angle regression (Kron-LARS) algorithm that generalizes the classical vector least-angle regression (LARS) algorithm for solving underdetermined linear systems. We demonstrate that Kron-LARS can efficiently solve compressed sensing problems for high-dimensional data (tensors), where vector LARS becomes a bottleneck in both memory and computational complexity. We also propose a more efficient N-dimensional block-sparse LARS (NBS-LARS) that exploits the block sparsity of high-dimensional data.

Third, we present a graph regularized sparse non-negative matrix factorization algorithm for medical image color normalization. We show that the proposed unsupervised color normalization algorithm outperforms popular existing algorithms such as ICA, PCA, NMF, and SNMF, both qualitatively and quantitatively.

Finally, we present a locally linear embedded sparse coding algorithm for image representation. To solve the resulting sparse coding problem, we propose an efficient modified online dictionary learning algorithm that converges faster than existing algorithms for graph regularized sparse coding.
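As a rough sketch of the kind of objective behind the first contribution, graph Laplacian regularized sparse coding is commonly written in the form below; the exact objective, weights, and notation used in the dissertation are not given in this abstract, so this formulation is only an illustrative assumption:

    \min_{D,\,S} \; \|X - DS\|_F^2 \;+\; \lambda \sum_i \|s_i\|_1 \;+\; \alpha\, \operatorname{Tr}(S L S^\top)

Here the columns of X are the data vectors, D is the dictionary, the columns s_i of S are the sparse codes, and L is the graph Laplacian of a nearest-neighbor graph built on the data. The l1 term enforces sparsity, while the trace term encourages neighboring data points to receive similar codes, which is how the geometric structure of the data enters the model.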
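Likewise, Kron-LARS is not spelled out in this abstract; its memory argument rests on the standard Kronecker-structured measurement model for tensor compressed sensing, sketched here as an assumption about the setting rather than the dissertation's exact formulation:

    \operatorname{vec}(\mathcal{Y}) \;=\; (\Phi_N \otimes \cdots \otimes \Phi_1)\, \operatorname{vec}(\mathcal{X}),
    \qquad \mathcal{Y} \;=\; \mathcal{X} \times_1 \Phi_1 \times_2 \Phi_2 \cdots \times_N \Phi_N

where \Phi_n is the sensing or dictionary matrix along mode n. A vector solver such as LARS must materialize the full Kronecker product, whose size grows with the product of all mode dimensions, whereas a Kronecker-aware solver can operate with the small per-mode matrices \Phi_n, which is the source of the memory and complexity savings referred to above.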

History

Advisor

Schonfeld, Dan; Wang, Jing

Chair

Schonfeld, Dan; Wang, Jing

Department

Electrical and Computer Engineering

Degree Grantor

University of Illinois at Chicago

Degree Level

  • Doctoral

Committee Member

Gann, Peter; Zhou, Joe; Soltanalian, Mojtaba

Submitted date

August 2018

Issue date

2018-07-30