Academic Research · Nov–Dec 2019

Temporal Graph Learning for Link Prediction

Research internship at the National University of Singapore with Prof. Bryan Hooi — developed a temporal attention model for link prediction on dynamic graphs, achieving 86% AUC on the College Messages dataset.

Problem

Existing graph ML approaches treated graphs as static — losing temporal dynamics that are critical for predicting which connections will form next in evolving networks.

Outcome

86% AUC on College Messages dataset — top performance using temporal attention over node2vec, TMF, CTDNE, and BANE baselines.

Overview

During my undergraduate studies at IIT Bombay, I completed a research internship at the National University of Singapore with Prof. Bryan Hooi’s group. The focus was temporal graph representation learning — specifically, using attention mechanisms to capture how relationships evolve over time in dynamic networks, and using that representation to predict which edges will form next.

This was my first exposure to cutting-edge ML research, and it shaped how I think about representation learning and temporal modeling.

The Problem

Most graph ML work at the time treated graphs as static — you had a fixed set of nodes and edges, and learned embeddings from that structure. But real-world networks evolve. Social connections form and dissolve. Financial transactions create temporal patterns. Information spreads through a network in time-ordered sequences.

For link prediction — predicting which connections will form in the future — the temporal dynamics are not noise, they’re signal. A model that ignores when interactions happened is throwing away critical information.

Why It Mattered

Temporal link prediction has applications in social network recommendation, fraud detection (identifying unusual connection patterns), academic collaboration prediction, and dynamic knowledge graph completion. The research contributed to the broader question of how to represent time-evolving graphs in a way that is useful for downstream prediction tasks.

Data & Inputs

We worked with the College Messages dataset of time-stamped interactions. The temporal nature of the data was the key feature: the timestamp on each interaction was the signal we were trying to leverage.
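As an illustration, a timestamped edge list can be loaded and split chronologically rather than randomly, so the model never trains on the future. The sample below mimics the "source target unix-timestamp" layout of the public CollegeMsg release; file contents and the 75/25 split are illustrative assumptions, not the project's actual pipeline.

```python
import io
import pandas as pd

# Hypothetical sample rows in "src dst unix_timestamp" form (illustrative data).
raw = io.StringIO("""\
1 2 1082040961
3 4 1082155839
1 4 1082414391
2 1 1082439619
""")

edges = pd.read_csv(raw, sep=" ", names=["src", "dst", "ts"])
edges = edges.sort_values("ts").reset_index(drop=True)  # time order is the signal

# Chronological split: train on the earliest 75% of interactions and
# predict links in the remaining 25%. A random split would leak the future.
cut = int(0.75 * len(edges))
train, test = edges.iloc[:cut], edges.iloc[cut:]
print(len(train), len(test))  # 3 1
```

A chronological split is the standard protocol for temporal link prediction, since the task is precisely to predict edges that form after the training window.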

Approach

I implemented and benchmarked multiple temporal graph representation methods as baselines: node2vec, TMF, CTDNE, and BANE.

The key contribution was developing a temporal attention model that could:

  1. Encode the sequence of interactions a node has participated in
  2. Weight recent interactions more heavily using an attention mechanism
  3. Combine temporal interaction history with structural graph features for link prediction

The attention mechanism was the critical design choice — it let the model learn which past interactions were most predictive for future connections, rather than assuming recency was always most important.
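The three steps above can be sketched in PyTorch. This is a minimal illustration of the idea (attend over a node's past interactions, with time injected into the keys, so the model can learn which history matters); the class name, dimensions, and architecture are assumptions for exposition, not the original model.

```python
import torch
import torch.nn as nn

class TemporalAttentionEncoder(nn.Module):
    """Sketch: summarize a node's interaction history with learned attention.
    Illustrative only; not the original model's architecture."""

    def __init__(self, dim: int):
        super().__init__()
        self.time_proj = nn.Linear(1, dim)        # encode time since each interaction
        self.key = nn.Linear(dim, dim)
        self.query = nn.Parameter(torch.randn(dim))

    def forward(self, hist_emb: torch.Tensor, dt: torch.Tensor) -> torch.Tensor:
        # hist_emb: (T, dim) embeddings of past neighbors; dt: (T,) time deltas
        h = hist_emb + self.time_proj(dt.unsqueeze(-1))  # inject recency information
        scores = self.key(h) @ self.query                # (T,) attention logits
        attn = torch.softmax(scores, dim=0)              # learned weights, not a fixed recency decay
        return (attn.unsqueeze(-1) * h).sum(dim=0)       # (dim,) temporal summary of the node

enc = TemporalAttentionEncoder(dim=16)
z = enc(torch.randn(5, 16), torch.tensor([0.1, 0.5, 1.0, 2.0, 3.0]))
print(z.shape)  # torch.Size([16])
```

For link prediction, two such node summaries can be combined (e.g. a dot product or a small MLP, optionally concatenated with structural features) to score a candidate edge.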

Engineering & Implementation

The implementation discipline — same data splits, same negative sampling, same evaluation metric — was essential for meaningful comparisons.
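That discipline can be made concrete with a tiny harness: fix the negative samples with a seed so every method is scored against the identical positive and negative edge sets, then compute AUC. The helpers below are a sketch under those assumptions (at scale one would use `sklearn.metrics.roc_auc_score`), not the project's actual code.

```python
import random

def auc_score(pos_scores, neg_scores):
    """AUC as pairwise comparisons: P(random positive is ranked above random negative).
    Fine for small sets; equivalent to roc_auc_score on the pooled scores."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

def sample_negatives(nodes, pos_edges, k, seed=0):
    """Sample k node pairs absent from the positive edge set."""
    rng = random.Random(seed)  # fixed seed: every method sees the same negatives
    pos = set(pos_edges)
    negs = []
    while len(negs) < k:
        u, v = rng.sample(nodes, 2)
        if (u, v) not in pos:
            negs.append((u, v))
    return negs

# Toy check: a scorer that ranks every positive above every negative gets AUC 1.0.
print(auc_score([0.9, 0.8], [0.2, 0.1]))  # 1.0
```

Holding the splits, negatives, and metric constant is what makes a reported gap between methods attributable to the model rather than to the evaluation setup.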

Results & Impact

The temporal attention model reached 86% AUC on the College Messages dataset, the best result among the methods tested, ahead of the node2vec, TMF, CTDNE, and BANE baselines.

Limitations & What I’d Do Differently

The attention model was trained end-to-end on the link prediction objective — it might benefit from pre-training on a self-supervised temporal prediction task before fine-tuning on link prediction.

The model assumed that all nodes had sufficient temporal interaction history to learn meaningful representations. Nodes that are new to the network (cold start) needed special handling that wasn’t part of this implementation.

Stack

Python, PyTorch, NetworkX, NumPy, Pandas, Matplotlib


Let's collaborate!

Whether you need a quantitative researcher, a machine-learning systems builder, or a technical advisor — I'm available for select consulting engagements.

Get in Touch →