Out-of-distribution Detection and Generalized Class Discovery for Open-world Classification
Thesis
Posted on 2023-08-01, authored by Sepideh Esmaeilpourcharandabi
Despite the remarkable success of deep classification models on fine-grained visual recognition tasks, these models are typically trained under a restrictive assumption: that test samples belong to the same classes as the training data. This contrasts with the dynamic and unpredictable nature of the real world. Deploying classification models in real-world settings therefore requires detecting and learning from samples of unknown or out-of-distribution classes without relying on external supervision.
In this dissertation, we focus on training open-world classification models that can 1) detect out-of-distribution (OOD) samples at test time and 2) discover the classes of these samples in an unlabeled pool. In our first work, we propose an augmentation-based similarity learning method for OOD detection that outperforms costly generative techniques. In our second work, we introduce the problem of zero-shot OOD detection. Whereas previous methods in the literature rely on in-distribution samples to train their models, our approach trains no classifier at all; instead, we generate supporting evidence for OOD detection by exploiting the multi-modal foundation model CLIP. The proposed method outperforms state-of-the-art supervised baselines by a large margin. Our third work addresses the task of generalized class discovery: by iteratively detecting OOD (novel) samples and optimizing a weakly-supervised contrastive loss, we mitigate the risk of overfitting to the old classes and improve the accuracy of novel class discovery.
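To make the CLIP-based zero-shot OOD detection idea more concrete, the sketch below scores a test image against text prompts for the known in-distribution class names together with a set of candidate out-of-distribution labels, and treats the probability mass assigned to the candidate labels as an OOD score. This is a simplified illustration under assumed details, not the exact procedure developed in the dissertation: the prompt template, the hard-coded class and candidate-label lists, and the file name test.jpg are placeholders.

```python
# Minimal sketch of CLIP-based zero-shot OOD scoring (illustrative assumptions,
# not the dissertation's exact method).
import torch
import clip  # pip install git+https://github.com/openai/CLIP.git
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

id_classes = ["airplane", "automobile", "bird", "cat"]         # known (in-distribution) classes
candidate_labels = ["bridge", "castle", "forest", "mountain"]  # assumed candidate OOD labels

# Build one prompt per class name; the template is a placeholder choice.
prompts = [f"a photo of a {name}" for name in id_classes + candidate_labels]
text_tokens = clip.tokenize(prompts).to(device)

# "test.jpg" is a placeholder path for the test image.
image = preprocess(Image.open("test.jpg")).unsqueeze(0).to(device)

with torch.no_grad():
    image_feat = model.encode_image(image)
    text_feat = model.encode_text(text_tokens)
    # Normalize so the dot product is a cosine similarity, as CLIP expects.
    image_feat = image_feat / image_feat.norm(dim=-1, keepdim=True)
    text_feat = text_feat / text_feat.norm(dim=-1, keepdim=True)
    logits = 100.0 * image_feat @ text_feat.T   # scaled cosine similarities
    probs = logits.softmax(dim=-1).squeeze(0)   # distribution over all prompts

# OOD score: probability mass assigned to the candidate (non-ID) labels.
ood_score = probs[len(id_classes):].sum().item()
print(f"OOD score: {ood_score:.3f}")
```

In the dissertation's setting, the supporting evidence for OOD detection is generated by exploiting CLIP rather than taken from a fixed list as above; the hard-coded candidate labels here only keep the example self-contained.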