LapPE
- class dgl.transforms.LapPE(k, feat_name='PE', eigval_name=None, padding=False)[source]
Bases: BaseTransform
Laplacian Positional Encoding, as introduced in Benchmarking Graph Neural Networks.
This module only works for homogeneous bidirected graphs.
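For a directed input you can first obtain a bidirected version with dgl.to_bidirected. A minimal example (assuming a simple graph without self-loops):

>>> import dgl
>>> from dgl import LapPE
>>> g = dgl.graph(([0, 1], [1, 2]))  # directed edges 0->1 and 1->2
>>> bg = dgl.to_bidirected(g)        # adds the reverse edges 1->0 and 2->1
>>> bg = LapPE(k=2)(bg)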
- Parameters:
k (int) – Number of smallest non-trivial eigenvectors to use for the positional encoding; a sketch of the underlying computation follows this parameter list.
feat_name (str, optional) – Name to store the computed positional encodings in ndata. Default: 'PE'.
eigval_name (str, optional) – If None, store the Laplacian eigenvectors only. Otherwise, the name under which to store the corresponding Laplacian eigenvalues in ndata. Default: None.
padding (bool, optional) – If False, raise an exception when k >= n, where n is the number of nodes in the given graph. Otherwise, pad the end of the eigenvectors with zeros and the end of the eigenvalues with 'nan' when k >= n. Default: False.
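Conceptually, the transform eigendecomposes the symmetric normalized graph Laplacian L = I - D^{-1/2} A D^{-1/2} and keeps the eigenvectors of the k smallest non-trivial eigenvalues. Below is a minimal dense NumPy sketch of that computation for illustration; lap_pe_sketch is a hypothetical helper, not DGL's implementation (which works with sparse matrices), and eigenvector signs may differ from the transform's output since eigenvectors are only defined up to sign.

import numpy as np
import torch

def lap_pe_sketch(g, k, padding=False):
    # Dense sketch: build the adjacency matrix of the bidirected graph.
    # Assumes no isolated nodes (all degrees must be positive).
    n = g.num_nodes()
    src, dst = (t.numpy() for t in g.edges())
    A = np.zeros((n, n))
    A[src, dst] = 1.0
    # Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    L = np.eye(n) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    # eigh returns the eigenvalues of a symmetric matrix in ascending order.
    eigval, eigvec = np.linalg.eigh(L)
    # Drop the trivial eigenpair (eigenvalue 0, constant eigenvector).
    eigval, eigvec = eigval[1:k + 1], eigvec[:, 1:k + 1]
    if eigvec.shape[1] < k:
        if not padding:
            raise ValueError("k must be smaller than the number of nodes")
        pad = k - eigvec.shape[1]
        eigvec = np.pad(eigvec, ((0, 0), (0, pad)))                 # zero padding
        eigval = np.pad(eigval, (0, pad), constant_values=np.nan)   # 'nan' padding
    return (torch.as_tensor(eigvec, dtype=torch.float32),
            torch.as_tensor(eigval, dtype=torch.float32))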
Example
>>> import dgl
>>> from dgl import LapPE
>>> transform1 = LapPE(k=3)
>>> transform2 = LapPE(k=5, padding=True)
>>> transform3 = LapPE(k=5, feat_name='eigvec', eigval_name='eigval', padding=True)
>>> g = dgl.graph(([0,1,2,3,4,2,3,1,4,0], [2,3,1,4,0,0,1,2,3,4]))
>>> g1 = transform1(g)
>>> print(g1.ndata['PE'])
tensor([[ 0.6325,  0.1039,  0.3489],
        [-0.5117,  0.2826,  0.6095],
        [ 0.1954,  0.6254, -0.5923],
        [-0.5117, -0.4508, -0.3938],
        [ 0.1954, -0.5612,  0.0278]])
>>> g2 = transform2(g)
>>> print(g2.ndata['PE'])
tensor([[-0.6325, -0.1039,  0.3489, -0.2530,  0.0000],
        [ 0.5117, -0.2826,  0.6095,  0.4731,  0.0000],
        [-0.1954, -0.6254, -0.5923, -0.1361,  0.0000],
        [ 0.5117,  0.4508, -0.3938, -0.6295,  0.0000],
        [-0.1954,  0.5612,  0.0278,  0.5454,  0.0000]])
>>> g3 = transform3(g)
>>> print(g3.ndata['eigval'])
tensor([[0.6910, 0.6910, 1.8090, 1.8090,    nan],
        [0.6910, 0.6910, 1.8090, 1.8090,    nan],
        [0.6910, 0.6910, 1.8090, 1.8090,    nan],
        [0.6910, 0.6910, 1.8090, 1.8090,    nan],
        [0.6910, 0.6910, 1.8090, 1.8090,    nan]])
>>> print(g3.ndata['eigvec'])
tensor([[ 0.6325, -0.1039,  0.3489,  0.2530,  0.0000],
        [-0.5117, -0.2826,  0.6095, -0.4731,  0.0000],
        [ 0.1954, -0.6254, -0.5923,  0.1361,  0.0000],
        [-0.5117,  0.4508, -0.3938,  0.6295,  0.0000],
        [ 0.1954,  0.5612,  0.0278, -0.5454,  0.0000]])
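Note that Laplacian eigenvectors are only defined up to sign, which is why g1.ndata['PE'] and g3.ndata['eigvec'] above differ by column-wise signs. A common training-time augmentation from the Benchmarking Graph Neural Networks paper is therefore to randomly flip each eigenvector's sign. A minimal sketch, where random_sign_flip is a hypothetical helper and not part of DGL:

import torch

def random_sign_flip(pe):
    # Independently flip the sign of each eigenvector (column)
    # with probability 0.5.
    sign = (torch.rand(pe.shape[1]) > 0.5).to(pe.dtype) * 2 - 1
    return pe * sign

For example, g1.ndata['PE'] = random_sign_flip(g1.ndata['PE']) can be applied at each training step.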