Exploring Time Granularity on Temporal Graphs for Dynamic Link Prediction in Real-world Networks

Abstract

Dynamic Graph Neural Networks (DGNNs) have emerged as the predominant approach for processing dynamic graph-structured data. However, the influence of temporal information on model performance and robustness remains insufficiently explored, particularly regarding how models handle prediction tasks at different time granularities. In this paper, we investigate the impact of time granularity when training DGNNs on dynamic graphs through extensive experiments. We examine graphs derived from various domains and compare three different DGNNs against a baseline model across four time granularities. We focus on the interplay among time granularities, model architectures, and negative sampling strategies to draw general conclusions. Our results reveal that a sophisticated memory mechanism and a proper choice of time granularity are crucial for a DGNN to deliver competitive and robust performance on the dynamic link prediction task. We also discuss drawbacks of the considered models and datasets and propose promising directions for future research on the time granularity of temporal graphs.
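For intuition, varying the time granularity can be pictured as re-bucketing the timestamps of a temporal edge stream before training. The minimal sketch below (a hypothetical helper, not code from the paper) floors UNIX timestamps to hour-level or day-level buckets, so that all interactions within one bucket share a coarsened timestamp:

```python
import numpy as np

def coarsen_timestamps(timestamps: np.ndarray, granularity_s: int) -> np.ndarray:
    """Map fine-grained UNIX timestamps onto a coarser granularity by
    flooring each timestamp to the start of its bucket (in seconds)."""
    return (timestamps // granularity_s) * granularity_s

# Example: second-level interaction times viewed at hourly and daily granularity.
ts = np.array([1_600_000_000, 1_600_000_050, 1_600_086_400])
hourly = coarsen_timestamps(ts, 3_600)        # hour-level buckets
daily = coarsen_timestamps(ts, 24 * 3_600)    # day-level buckets
```

Under this view, the same temporal graph yields different event orderings and batch compositions at each granularity, which is what the experiments in the paper vary.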

Publication
In the Temporal Graph Learning Workshop (TGL) at NeurIPS 2023
Xiangjian Jiang
PhD Student in Computer Science

My research interests include explainable AI and data mining, with a particular focus on low-sample-size regimes.