BHOI-THESIS-2019.pdf (3.66 MB)
Invariant Kernels for Few-shot Learning
Thesis
Posted on 2019-08-06, authored by Amlaan Bhoi

Recent advances in few-shot learning focus on the development of meta-learning or improvements to distance-based algorithms. However, most of these approaches do not consider the robustness of the model to data augmented by various transforms. Transformations such as rotation, scaling, shifting, shearing, and other affine transforms can negatively impact the performance of a trained model. While robustness to such variations can be instilled through random augmentation at training time, the computational cost of training models this way is significant.
In this thesis, we propose a general framework for inducing a variety of invariances, such as rotation, shifting, and shearing, in few-shot learning by extending orbit embeddings to patch-wise image representations, preserving the spatially invariant structure required by convolutional neural networks. However, computing these kernels can be expensive over the millions of patches/samples in a dataset. We therefore use Nyström's method to form a low-rank approximation of the kernel matrix by sampling a smaller number of data points. Finally, the transformed features are passed as input to a modified Relation Network for few-shot learning. Analyzing the performance of these models under heavy test-time data augmentation, we present results that empirically demonstrate the effectiveness of our approach compared to baseline methods.
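The low-rank step mentioned above can be illustrated in isolation. Below is a minimal NumPy sketch of Nyström's method, not the thesis's implementation: it samples `m` landmark points, builds the landmark kernel block, and produces an explicit feature map whose inner products approximate the full kernel matrix. The RBF kernel, `gamma`, and landmark count are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.01):
    # Gaussian (RBF) kernel between rows of X and rows of Y.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def nystrom_features(X, m=100, gamma=0.01, seed=0):
    """Rank-m Nystrom feature map Phi such that K ~ Phi @ Phi.T."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    landmarks = X[idx]
    W = rbf_kernel(landmarks, landmarks, gamma)   # m x m landmark block
    C = rbf_kernel(X, landmarks, gamma)           # n x m cross block
    # W^{-1/2} via eigendecomposition (clamp eigenvalues for stability).
    vals, vecs = np.linalg.eigh(W)
    vals = np.maximum(vals, 1e-12)
    W_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return C @ W_inv_sqrt                         # n x m explicit features

# Toy data standing in for flattened image patches.
X = np.random.default_rng(1).normal(size=(500, 16))
Phi = nystrom_features(X, m=100)
K_approx = Phi @ Phi.T
err = np.linalg.norm(rbf_kernel(X, X) - K_approx) / np.linalg.norm(rbf_kernel(X, X))
```

In the thesis's setting, the rows of `X` would be (orbit-embedded) patch representations, and `Phi` is what gets passed downstream, so the full n-by-n kernel matrix is never materialized during training.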
Advisor: Zhang, Xinhua
Chair: Zhang, Xinhua
Department: Computer Science
Degree Grantor: University of Illinois at Chicago
Degree Level: Masters
Committee Members: Berger-Wolf, Tanya; Sun, Xiaorui
Submitted: May 2019
Issued: 2019-04-12