University of Illinois Chicago

Data Distillation for Enhancing Neural Frameworks in Learning Graph Distance Functions

Thesis posted on 2024-12-01, authored by Jaspal Rana Jannu
Graph Edit Distance (GED) is a widely used measure of similarity between two graphs, with applications in domains such as molecular comparison and social network analysis. However, computing GED exactly is NP-hard, making exact calculation prohibitively expensive on large datasets. Neural network-based approaches have recently emerged to approximate GED, but these methods are typically supervised and require large training sets labeled with exact GED values, which are themselves costly to compute. To reduce the computational resources needed for training, data distillation techniques have been proposed for graph and node classification tasks, but no comparable methods exist for GED prediction. In this work, we explore data distillation techniques designed specifically for the GED task. One approach leverages computational trees to reduce the size of the training dataset required by GED prediction models. Such approaches can improve the efficiency and scalability of GED prediction models, with potential applications across a range of graph-based domains.
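To illustrate why exact GED labels are so expensive to produce, a minimal brute-force sketch is shown below for tiny unlabeled, undirected simple graphs with unit edit costs. This is illustrative only and is not the method of the thesis: the function name `exact_ged` and the padding-based search are my own choices, and the factorial-time enumeration over node bijections is exactly what makes exact computation infeasible at scale.

```python
from itertools import permutations

def exact_ged(n1, edges1, n2, edges2):
    """Brute-force exact GED for tiny unlabeled simple graphs, unit costs.

    n1, n2: node counts; edges1, edges2: iterables of (u, v) pairs with
    nodes numbered 0..n-1. The smaller graph is padded with dummy nodes so
    that every bijection over the padded node sets corresponds to one edit
    path (dummy pairings encode node insertions/deletions). Runtime is
    O(n!), which is why this is only feasible for toy graphs.
    """
    n = max(n1, n2)
    e1 = {frozenset(e) for e in edges1}
    e2 = {frozenset(e) for e in edges2}
    best = float("inf")
    for perm in permutations(range(n)):
        # Node i in graph 1 maps to node perm[i] in graph 2; indices >= n1
        # (resp. >= n2) are dummies. A real<->dummy pairing is one node op.
        cost = sum(1 for i in range(n) if (i < n1) != (perm[i] < n2))
        mapped = set()
        for e in e1:
            u, v = tuple(e)
            if perm[u] < n2 and perm[v] < n2:
                mapped.add(frozenset((perm[u], perm[v])))
            else:
                cost += 1  # edge deleted along with a deleted endpoint
        # Symmetric difference counts edge deletions and insertions
        # between the mapped edge set and graph 2's edge set.
        cost += len(mapped ^ e2)
        best = min(best, cost)
    return best

# Example: a triangle and a 3-node path differ by one edge deletion.
triangle = [(0, 1), (1, 2), (0, 2)]
path = [(0, 1), (1, 2)]
print(exact_ged(3, triangle, 3, path))  # → 1
```

Even at this scale the search space is n! node mappings, which motivates both the neural approximators discussed above and the data distillation techniques that shrink how many exact labels must be computed for training.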

History

Advisor

Sourav Medya

Department

Computer Science

Degree Grantor

University of Illinois Chicago

Degree Level

  • Masters

Degree name

MS, Master of Science

Committee Member

Xinhua Zhang, Bing Liu

Format

application/pdf

Language

  • en
