posted on 2024-05-01, authored by Jaykumar Kakkad
Graph Neural Networks (GNNs) have demonstrated outstanding performance across various graph-based tasks. However, the resource-intensive nature of large GNN models and the need for more efficient deployment have motivated the development of graph distillation methods. In this study, we systematically evaluate and compare a range of graph distillation methods to assess their effectiveness and efficiency on node classification tasks. Additionally, we compare the distillation methods against other graph reduction methods, specifically coreset-based techniques. We employ a standardized benchmark comprising diverse graph datasets and conduct rigorous experiments to measure the impact of graph distillation on node classification performance. Our novel benchmarking framework further analyzes the distillation methods' ability to preserve explanations and their robustness to noise. We also identify the Pareto-optimal methods that exhibit superior performance, consistent explanations, and stability in the presence of noise. Our findings reveal key insights into the optimal choice of graph distillation technique for specific use cases. This study serves as a valuable resource for researchers and practitioners in selecting the most suitable graph distillation approach for real-world applications.