
Distributionally Adversarial Learning

Thesis
Posted on 2021-08-01, authored by Zhan Shi
Adversarial machines, in which a learner competes against an adversary, have regained considerable interest in machine learning. This thesis starts with distributionally robust optimization (DRO), introducing our work on adversarial prediction. Adversarial multivariate prediction is often computationally intractable due to the exponential number of variables. We first reduce these exponentially sized problems to polynomially sized convex-concave saddle-point problems. We then develop a new stochastic variance-reduced algorithm to solve them efficiently; it accommodates any Bregman divergence as the proximal function and achieves linear convergence rates. Building on the theory of DRO with the Wasserstein distance, we also show that adversarial training can be upper bounded by Lipschitz-regularized empirical risk minimization. However, tightly enforcing the Lipschitz constant of neural networks is known to be difficult. Interestingly, by focusing on kernel machines, we show that the Lipschitz constant of functions in the reproducing kernel Hilbert space induced by product kernels can be tightly enforced in polynomial time. Lastly, we investigate certification of robustness in the context of graph convolutional networks (GCNs). Specifically, we develop efficient topological attack algorithms for GCNs applied to graph classification, and then propose the first algorithm for certifying the robustness of GCNs to topological attacks. Our method is based on Lagrange dualization and the convex envelope, which yield tight approximation bounds that can be computed efficiently by dynamic programming. When used in conjunction with robust training, it allows an increased number of graphs to be certified as robust.
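
To make the Lipschitz-regularization bound above concrete, the following is a minimal NumPy sketch, not the thesis implementation: for a linear model f(x) = w @ x the Lipschitz constant is exactly ||w||_2, so the upper bound on Wasserstein adversarial training reduces to norm-regularized empirical risk minimization. The hinge loss, toy data, and radius eps are illustrative assumptions rather than choices taken from the thesis.

    import numpy as np

    # Minimal sketch (not the thesis code): Lipschitz-regularized empirical risk
    # minimization as an upper bound on Wasserstein adversarial training,
    # illustrated with a linear model whose Lipschitz constant is ||w||_2.

    def hinge_loss(w, X, y):
        """Average hinge loss of a linear classifier; labels y are in {-1, +1}."""
        margins = 1.0 - y * (X @ w)
        return np.maximum(margins, 0.0).mean()

    def lipschitz_regularized_risk(w, X, y, eps):
        """Empirical risk plus eps times the model's Lipschitz constant (||w||_2)."""
        return hinge_loss(w, X, y) + eps * np.linalg.norm(w)

    # Toy usage: crude subgradient descent on the regularized objective.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
    y = np.sign(X @ w_true + 0.1 * rng.normal(size=200))
    w, eps, lr = np.zeros(5), 0.1, 0.05
    for _ in range(500):
        active = (1.0 - y * (X @ w)) > 0                       # examples with positive hinge loss
        grad_risk = -(X * (active * y)[:, None]).mean(axis=0)  # subgradient of the hinge term
        grad_reg = eps * w / (np.linalg.norm(w) + 1e-12)       # subgradient of eps * ||w||_2
        w -= lr * (grad_risk + grad_reg)
    print("Lipschitz-regularized risk:", lipschitz_regularized_risk(w, X, y, eps))

The point of the sketch is the objective, not the optimizer: for richer function classes such as neural networks or the kernel machines studied in the thesis, the empirical risk term stays the same and the difficulty moves to bounding or enforcing the Lipschitz constant.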

History

Advisor

Zhang, Xinhua

Chair

Zhang, Xinhua

Department

Computer Science

Degree Grantor

University of Illinois at Chicago

Degree Level

  • Doctoral

Degree name

PhD, Doctor of Philosophy

Committee Member

  • Kash, Ian
  • Reyzin, Lev
  • Ziebart, Brian
  • Yu, Yaoliang

Submitted date

August 2021

Media type

application/pdf

Language

  • en
