dgl.nn (PyTorch)
Conv Layers
Graph convolutional layer from Semi-Supervised Classification with Graph Convolutional Networks

This module normalizes positive scalar edge weights on a graph following the form in GCN. 

Relational graph convolution layer from Modeling Relational Data with Graph Convolutional Networks 

Topology Adaptive Graph Convolutional layer from Topology Adaptive Graph Convolutional Networks 

Graph attention layer from Graph Attention Networks

GATv2 from How Attentive are Graph Attention Networks? 

Graph attention layer that handles edge features, from Rossmann-toolbox (see supplementary data)

Graph attention layer with edge features from SCENE 

EdgeConv layer from Dynamic Graph CNN for Learning on Point Clouds 

GraphSAGE layer from Inductive Representation Learning on Large Graphs 

SGC layer from Simplifying Graph Convolutional Networks 

Approximate Personalized Propagation of Neural Predictions layer from Predict then Propagate: Graph Neural Networks meet Personalized PageRank 

Graph Isomorphism Network layer from How Powerful are Graph Neural Networks? 

Graph Isomorphism Network with Edge Features, introduced by Strategies for Pre-training Graph Neural Networks

Gated Graph Convolution layer from Gated Graph Sequence Neural Networks 

Gated graph convolutional layer from Benchmarking Graph Neural Networks 

Gaussian Mixture Model Convolution layer from Geometric Deep Learning on Graphs and Manifolds using Mixture Model CNNs 

Chebyshev Spectral Graph Convolution layer from Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering 

Attention-based Graph Neural Network layer from Attention-based Graph Neural Network for Semi-Supervised Learning

Graph Convolution layer from Neural Message Passing for Quantum Chemistry 

Atomic Convolution Layer from Atomic Convolutional Networks for Predicting Protein-Ligand Binding Affinity

CFConv from SchNet: A continuous-filter convolutional neural network for modeling quantum interactions

Apply the dot-product version of self-attention from Graph Attention Networks

Convolution combined with iteratively reweighted least squares, from Graph Neural Networks Inspired by Classical Iterative Algorithms

Combines propagation and attention in a single module.

Graph Convolutional Network via Initial residual and Identity mapping (GCNII) from Simple and Deep Graph Convolutional Networks 

Heterogeneous graph transformer convolution from Heterogeneous Graph Transformer 

Grouped reversible residual connections for GNNs, as introduced in Training Graph Neural Networks with 1000 Layers 

Equivariant Graph Convolutional Layer from E(n) Equivariant Graph Neural Networks 

Principal Neighbourhood Aggregation Layer from Principal Neighbourhood Aggregation for Graph Nets 

Directional Graph Network Layer from Directional Graph Networks 
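To illustrate the first entry in this list: the graph convolutional layer of Kipf & Welling computes H' = sigma(D^-1/2 (A + I) D^-1/2 H W). The dense NumPy sketch below (omitting the nonlinearity) is illustrative only; it is not DGL's sparse, message-passing implementation.

```python
import numpy as np

def gcn_layer(adj, feats, weight):
    """One GCN propagation step: H' = D^-1/2 (A + I) D^-1/2 H W.

    Dense NumPy sketch of the rule from "Semi-Supervised Classification
    with Graph Convolutional Networks", not the DGL implementation.
    """
    a_hat = adj + np.eye(adj.shape[0])             # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))  # D^-1/2 of A + I
    norm = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return norm @ feats @ weight

# 3-node path graph 0-1-2, one-hot features, identity weights
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
out = gcn_layer(adj, np.eye(3), np.eye(3))
```

With identity features and weights, `out` is just the symmetrically normalized adjacency, so it is symmetric and its diagonal holds each node's self-loop weight 1/deg.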
CuGraph Conv Layers
An accelerated relational graph convolution layer from Modeling Relational Data with Graph Convolutional Networks that leverages the highly-optimized aggregation primitives in cugraph-ops.

Graph attention layer from Graph Attention Networks, with the sparse aggregation accelerated by cugraph-ops.

An accelerated GraphSAGE layer from Inductive Representation Learning on Large Graphs that leverages the highly-optimized aggregation primitives in cugraph-ops.
Dense Conv Layers
Graph Convolutional layer from Semi-Supervised Classification with Graph Convolutional Networks

GraphSAGE layer from Inductive Representation Learning on Large Graphs 

Chebyshev Spectral Graph Convolution layer from Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering 
Global Pooling Layers
Apply sum pooling over the nodes in a graph. 

Apply average pooling over the nodes in a graph. 

Apply max pooling over the nodes in a graph. 
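The three readout modules above reduce the node features of each graph in a batch to a single vector. A plain NumPy sketch of the idea (illustrative only; DGL's modules operate on batched DGLGraph objects):

```python
import numpy as np

def readout(feats, graph_ids, op="sum"):
    """Pool node features into one vector per graph.

    `graph_ids[i]` says which graph node i belongs to; `op` selects
    sum, mean, or max pooling over each graph's nodes.
    """
    n_graphs = graph_ids.max() + 1
    out = []
    for g in range(n_graphs):
        chunk = feats[graph_ids == g]
        if op == "sum":
            out.append(chunk.sum(axis=0))
        elif op == "mean":
            out.append(chunk.mean(axis=0))
        else:  # max
            out.append(chunk.max(axis=0))
    return np.stack(out)

feats = np.array([[1., 0.], [2., 1.], [3., 3.]])
graph_ids = np.array([0, 0, 1])  # nodes 0,1 in graph 0; node 2 in graph 1
pooled = readout(feats, graph_ids, "sum")
```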

Sort Pooling from An End-to-End Deep Learning Architecture for Graph Classification

Compute importance weights for atoms and perform a weighted sum. 

Global Attention Pooling from Gated Graph Sequence Neural Networks 

Set2Set operator from Order Matters: Sequence to sequence for sets 

The Encoder module from Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks

The Decoder module from Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks
Score Modules for Link Prediction and Knowledge Graph Completion
Predictor/score function for pairs of node representations 

Similarity measure from Translating Embeddings for Modeling Multi-relational Data

Similarity measure from Learning Entity and Relation Embeddings for Knowledge Graph Completion
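The translational score from Translating Embeddings for Modeling Multi-relational Data rates a triple (head, relation, tail) by -||h + r - t||, so a perfect translation h + r = t gets the best possible score of 0. A NumPy sketch of that scoring function (parameter names are illustrative, not the DGL API):

```python
import numpy as np

def transe_score(head, rel, tail, p=1):
    """TransE similarity: -||h + r - t||_p.

    Higher scores indicate a more plausible (head, relation, tail)
    triple; p selects the L1 or L2 norm.
    """
    return -np.linalg.norm(head + rel - tail, ord=p, axis=-1)

h = np.array([1.0, 0.0])
r = np.array([0.0, 1.0])
t_good = np.array([1.0, 1.0])  # h + r == t, ideal tail
t_bad = np.array([5.0, 5.0])
```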
Heterogeneous Learning Modules
A generic module for computing convolution on heterogeneous graphs. 

Apply linear transformations on heterogeneous inputs. 

Create a heterogeneous embedding table. 

Linear transformation according to types. 
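The generic heterogeneous-convolution pattern above runs one submodule per relation and then aggregates results that land on the same destination node type. A plain-Python sketch of that dispatch-and-aggregate idea (the data structures and names here are hypothetical; DGL's module works on DGLGraph objects and nn.Module submodules):

```python
def hetero_conv(edges_by_rel, feats, mods):
    """Apply a per-relation message function over each relation's edges,
    summing per-relation results for every destination node type.

    edges_by_rel: {(src_type, rel, dst_type): [(u, v), ...]}
    feats:        {node_type: {node_id: float}}
    mods:         {rel: message function f(src_feat) -> float}
    """
    out = {}
    for (stype, rel, dtype), edges in edges_by_rel.items():
        f = mods[rel]
        for u, v in edges:
            dst = out.setdefault(dtype, {})
            dst[v] = dst.get(v, 0.0) + f(feats[stype][u])
    return out

edges = {
    ("user", "follows", "user"): [(0, 1)],
    ("user", "plays", "game"): [(0, 0), (1, 0)],
}
feats = {"user": {0: 1.0, 1: 2.0}}
mods = {"follows": lambda x: x * 10, "plays": lambda x: x + 1}
res = hetero_conv(edges, feats, mods)
```

Game node 0 receives messages from both "plays" edges, which are summed; user node 1 receives only the "follows" message.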
Utility Modules
A sequential container for stacking graph neural network modules 

Basis decomposition from Modeling Relational Data with Graph Convolutional Networks 

Layer that transforms one point set into a graph, or a batch of point sets with the same number of points into a batched union of those graphs. 

Layer that transforms one point set into a graph, or a batch of point sets with different numbers of points into a batched union of those graphs.

Layer that transforms one point set into a bidirected graph connecting neighbors within a given distance.
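The point-set-to-graph layers above connect each point to its nearest neighbors in feature space. A NumPy sketch of the k-NN edge construction (edge direction and batching conventions here are simplified; DGL's layers return DGLGraph objects):

```python
import numpy as np

def knn_edges(points, k):
    """Build directed k-NN edges (src -> dst) from a point set.

    Computes all pairwise Euclidean distances, excludes each point
    itself, and links every point to its k nearest neighbors.
    """
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)           # exclude self as a neighbor
    nbrs = np.argsort(d, axis=1)[:, :k]   # k nearest per node
    src = np.repeat(np.arange(len(points)), k)
    dst = nbrs.ravel()
    return src, dst

pts = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
src, dst = knn_edges(pts, k=1)  # each point links to its nearest neighbor
```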

The Jumping Knowledge aggregation module from Representation Learning on Graphs with Jumping Knowledge Networks 

Class for storing node embeddings. 

GNNExplainer model from GNNExplainer: Generating Explanations for Graph Neural Networks 

GNNExplainer model from GNNExplainer: Generating Explanations for Graph Neural Networks, adapted for heterogeneous graphs 

SubgraphX from On Explainability of Graph Neural Networks via Subgraph Explorations (https://arxiv.org/abs/2102.05152)

SubgraphX from On Explainability of Graph Neural Networks via Subgraph Explorations, adapted for heterogeneous graphs 

PGExplainer from Parameterized Explainer for Graph Neural Network (https://arxiv.org/pdf/2011.04573)

PGExplainer from Parameterized Explainer for Graph Neural Network, adapted for heterogeneous graphs 

Label Propagation from Learning from Labeled and Unlabeled Data with Label Propagation 
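Label propagation iteratively diffuses known labels along edges while re-clamping the labeled nodes each step, i.e. Y <- alpha * D^-1/2 A D^-1/2 Y + (1 - alpha) * Y0. A NumPy sketch of that iteration (parameter names are illustrative, not the DGL API):

```python
import numpy as np

def label_propagation(adj, labels, mask, num_iters=50, alpha=0.9):
    """Propagate soft labels over a graph, clamping known labels.

    adj:    dense adjacency matrix
    labels: (num_nodes, num_classes) soft-label matrix Y0
    mask:   boolean array marking nodes with known labels
    """
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, 1.0 / np.sqrt(deg), 0.0)
    norm = adj * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    y0 = labels.copy()
    y = labels.copy()
    for _ in range(num_iters):
        y = alpha * (norm @ y) + (1 - alpha) * y0
        y[mask] = y0[mask]  # re-clamp the known labels each step
    return y

# path graph 0-1-2; node 0 is class 0, node 2 is class 1, node 1 unknown
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
labels = np.array([[1., 0.], [0., 0.], [0., 1.]])
mask = np.array([True, False, True])
y = label_propagation(adj, labels, mask)
```

By symmetry, the middle node ends up with equal evidence for both classes.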
Network Embedding Modules
DeepWalk module from DeepWalk: Online Learning of Social Representations 

metapath2vec module from metapath2vec: Scalable Representation Learning for Heterogeneous Networks 
Utility Modules for Graph Transformer
Degree Encoder, as introduced in Do Transformers Really Perform Bad for Graph Representation? 

Laplacian Positional Encoder (LPE), as introduced in GraphGPS: General Powerful Scalable Graph Transformers 

Path Encoder, as introduced in Edge Encoding of Do Transformers Really Perform Bad for Graph Representation? 

Spatial Encoder, as introduced in Do Transformers Really Perform Bad for Graph Representation? 

3D Spatial Encoder, as introduced in One Transformer Can Understand Both 2D & 3D Molecular Data 

Dense Multi-Head Attention Module with Graph Attention Bias.

Graphormer Layer with Dense Multi-Head Attention, as introduced in Do Transformers Really Perform Bad for Graph Representation?

EGTLayer for the Edge-augmented Graph Transformer (EGT), as introduced in Global Self-Attention as a Replacement for Graph Convolution (https://arxiv.org/pdf/2108.03348.pdf)
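The degree encoder at the top of this section captures the idea from Graphormer of looking up a learned embedding per node degree and adding it to the node features. A NumPy sketch of that lookup (the table is random here but learned in practice; names are illustrative, not the DGL API):

```python
import numpy as np

rng = np.random.default_rng(0)
max_degree, hidden = 8, 4
# stands in for a learned embedding table of shape (max_degree + 1, hidden)
degree_table = rng.normal(size=(max_degree + 1, hidden))

def degree_encoding(degrees):
    """Look up one embedding per node degree, clipping degrees that
    exceed `max_degree` to the last row of the table."""
    idx = np.clip(degrees, 0, max_degree)
    return degree_table[idx]

deg = np.array([1, 3, 3])
enc = degree_encoding(deg)  # nodes 1 and 2 share the same degree embedding
```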