KDD '19: Heterogeneous Graph Neural Network

🤗 Recommendation system paper challenge (28/50)

paper link

🤔 What problem do they solve?

They aim to learn node embeddings for heterogeneous graphs that capture both the graph structure information and the heterogeneous content information of each node.

😮 What are the challenges?

Few existing models can jointly and effectively consider the heterogeneous structural (graph) information and the heterogeneous content information of each node. In particular:

  1. A node can carry unstructured, multi-modal content (e.g., text, images, attributes).
  2. Different types of neighbors contribute differently to a node's embedding.

😎 Overview of the models: HetGNN

They propose HetGNN, a heterogeneous graph neural network built from three components (C1–C3) that address these challenges.

Sampling Heterogeneous Neighbors (C1)

Most other GNN models have two issues:

  1. They are weakened by varying neighbor sizes (e.g., hub nodes vs. sparsely connected nodes).
  2. They are not suitable for aggregating heterogeneous neighbors, which have different content features.

HetGNN instead samples neighbors with a random walk with restart (RWR), so that (see the sketch below):

  1. the sampled neighbor size of each node is fixed and the most frequently visited neighbors are selected;
  2. neighbors of the same type (having the same content features) are grouped, so that type-based aggregation can be deployed.
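As a concrete illustration, here is a minimal sketch of an RWR-style sampler in plain Python. The graph is given as adjacency lists plus a node-to-type map; `restart_p`, `walk_len`, and `top_k_per_type` are illustrative parameters, not the paper's exact settings.

```python
import random
from collections import Counter, defaultdict

def sample_neighbors_rwr(adj, node_type, start,
                         restart_p=0.5, walk_len=100, top_k_per_type=5):
    """Random walk with restart from `start`: repeatedly step to a random
    neighbor, jumping back to `start` with probability `restart_p`, then
    keep the most frequently visited nodes, grouped by node type."""
    visits = Counter()
    cur = start
    for _ in range(walk_len):
        if random.random() < restart_p or not adj[cur]:
            cur = start                    # restart at the origin node
        else:
            cur = random.choice(adj[cur])  # uniform step to a neighbor
        if cur != start:
            visits[cur] += 1
    # type-based grouping: the top-k most visited nodes of each type,
    # so every node ends up with a fixed-size neighbor set per type
    by_type = defaultdict(list)
    for node, _count in visits.most_common():
        t = node_type[node]
        if len(by_type[t]) < top_k_per_type:
            by_type[t].append(node)
    return dict(by_type)

# toy usage on a tiny author/paper graph
adj = {"a1": ["p1", "p2"], "a2": ["p1"], "p1": ["a1", "a2"], "p2": ["a1"]}
node_type = {"a1": "author", "a2": "author", "p1": "paper", "p2": "paper"}
print(sample_neighbors_rwr(adj, node_type, "a1"))
```

Because the walk keeps returning to the origin, visit counts concentrate on strongly correlated nearby nodes, which is what makes the per-type top-k a sensible fixed-size neighbor set.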

Encoding Heterogeneous Contents (C2)

Given a node, it carries heterogeneous content features (e.g., text, image, and attribute features). HetGNN extracts each feature with a type-appropriate encoder (e.g., pre-trained text embeddings, CNN features for images) and fuses them with a Bi-LSTM followed by mean pooling, producing one content embedding per node.
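A minimal PyTorch sketch of this fusion step, assuming the per-feature extractors have already projected every content feature to a shared dimension (the module and variable names here are mine, not the paper's):

```python
import torch
import torch.nn as nn

class ContentEncoder(nn.Module):
    """Fuse a node's pre-extracted content features (text, image,
    attribute embeddings, already projected to a shared dimension)
    with a Bi-LSTM followed by mean pooling."""
    def __init__(self, dim):
        super().__init__()
        # bidirectional LSTM with hidden size dim // 2, so the
        # concatenated forward/backward output has dimension dim
        self.lstm = nn.LSTM(dim, dim // 2, bidirectional=True,
                            batch_first=True)

    def forward(self, feats):
        # feats: (batch, num_content_features, dim)
        out, _ = self.lstm(feats)   # (batch, num_content_features, dim)
        return out.mean(dim=1)      # mean-pool into one node embedding

encoder = ContentEncoder(dim=128)
feats = torch.randn(4, 3, 128)      # 4 nodes, 3 content features each
print(encoder(feats).shape)         # torch.Size([4, 128])
```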

Aggregating Heterogeneous Neighbors (C3)

From C1, each node has a fixed-size set of typed neighbors; thanks to C2, each neighbor is encoded into one content embedding. HetGNN first aggregates the embeddings of same-type neighbors (one aggregated embedding per neighbor type), then mixes these per-type embeddings with the node's own content embedding via attention, since different neighbor types contribute differently (a sketch follows below).
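Here is a hedged PyTorch sketch of the attention-based types mixing. It assumes the per-type embeddings and the node's own content embedding are already computed and share one dimension; the LeakyReLU-scored attention follows the paper's general form, but the exact parameterization is illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TypeAttentionAggregator(nn.Module):
    """Mix the node's own content embedding with one aggregated
    embedding per neighbor type, weighted by learned attention."""
    def __init__(self, dim):
        super().__init__()
        self.attn = nn.Linear(2 * dim, 1)  # scores [self || candidate]

    def forward(self, self_emb, type_embs):
        # self_emb: (batch, dim); type_embs: (batch, num_types, dim)
        cands = torch.cat([self_emb.unsqueeze(1), type_embs], dim=1)
        pairs = torch.cat(
            [self_emb.unsqueeze(1).expand_as(cands), cands], dim=-1)
        scores = F.leaky_relu(self.attn(pairs))   # (batch, T + 1, 1)
        weights = torch.softmax(scores, dim=1)    # attention over types
        return (weights * cands).sum(dim=1)       # (batch, dim)

agg = TypeAttentionAggregator(dim=128)
out = agg(torch.randn(4, 128), torch.randn(4, 3, 128))  # 3 neighbor types
print(out.shape)  # torch.Size([4, 128])
```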

Objective and Model Training

The model is trained with a graph context loss, a skip-gram style objective with negative sampling over node pairs that co-occur on random walks, optimized by mini-batch gradient descent.
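A compact sketch of such a skip-gram style loss with one negative sample per positive pair, assuming the final node embeddings are already computed (batching and the sampling of walk co-occurrences are omitted):

```python
import torch
import torch.nn.functional as F

def graph_context_loss(center, pos, neg):
    """Skip-gram style loss with negative sampling: pull together the
    embeddings of nodes that co-occur on random walks, push a sampled
    negative node away. All inputs: (batch, dim) final embeddings."""
    pos_score = (center * pos).sum(dim=-1)  # dot-product similarity
    neg_score = (center * neg).sum(dim=-1)
    return -(F.logsigmoid(pos_score) + F.logsigmoid(-neg_score)).mean()

# toy usage
c, p, n = torch.randn(8, 128), torch.randn(8, 128), torch.randn(8, 128)
print(graph_context_loss(c, p, n))  # scalar loss for a mini-batch step
```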

🥴 What else in this paper?

The paper also reports experiments organized around four research questions.

(RQ1) How does HetGNN perform vs. state-of-the-art baselines for various graph mining tasks, such as link prediction (RQ1–1), personalized recommendation (RQ1–2), and node classification & clustering (RQ1–3)?

Better than the baselines on all three tasks.

(RQ2) How does HetGNN perform vs. state-of-the-art baselines for inductive graph mining tasks, such as inductive node classification & clustering?

Better than the baselines, also in the inductive setting.

(RQ3) How do different components, e.g., the node heterogeneous contents encoder or the heterogeneous neighbors aggregator, affect the model performance?

  • HetGNN performs better than No-Neigh in most cases, demonstrating that aggregating neighbor information is effective for generating better node embeddings.
  • HetGNN outperforms Content-FC, indicating that the Bi-LSTM based content encoding captures “deep” content feature interactions better than a “shallow” encoder like an FC layer.
  • HetGNN achieves better results than Type-FC, showing that self-attention is better than an FC layer for capturing node type impact.

(RQ4) How do various hyper-parameters, e.g., the embedding dimension or the size of the sampled heterogeneous neighbor set, impact the model performance?

🙃 Other related blogs:

KDD '19: Applying Deep Learning To Airbnb Search

🤩 Conference

KDD: ACM SIGKDD Conference on Knowledge Discovery and Data Mining