Checklist of features we are missing compared to the Deep Graph Library (DGL).
PRs are welcome!
Conv Layers
- GraphConv (called `GCNConv` here; see the sketch after this list)
- EdgeWeightNorm
- RelGraphConv
- TAGConv
- GATConv
- EdgeConv
- SAGEConv
- SGConv
- APPNPConv
- GINConv
- GatedGraphConv
- GMMConv (Added GMMConv #147)
- ChebConv
- AGNNConv
- NNConv
- AtomicConv
- CFConv
- DotGatConv
- TWIRLSConv
- TWIRLSUnfoldingAndAttention
- GCN2Conv
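To illustrate the first mapping above: a minimal sketch of DGL's GraphConv counterpart, assuming the `GCNConv(in => out, activation)` constructor and the `rand_graph` helper for a toy graph (the graph and feature sizes are made up for illustration).

```julia
using GraphNeuralNetworks, Flux

g = rand_graph(10, 30)        # toy graph: 10 nodes, 30 edges
x = rand(Float32, 3, 10)      # 3 input features per node

# DGL's GraphConv is called GCNConv here: in_features => out_features, activation.
layer = GCNConv(3 => 8, relu)
y = layer(g, x)               # 8 x 10 matrix of output node features
```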
Dense Conv Layers
- DenseGraphConv
- DenseSAGEConv
- DenseChebConv
Global Pooling Layers
- SumPooling (`GlobalPooling(+)` here)
- AvgPooling (`GlobalPooling(mean)` here; see the sketch after this list)
- MaxPooling (`GlobalPooling(max)` here)
- SortPooling
- WeightAndSum
- GlobalAttentionPooling
- Set2Set
- SetTransformerEncoder
- SetTransformerDecoder
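For the pooling mappings above, a minimal sketch of a graph-level mean readout. The constructor is written `GlobalPooling(mean)` in the list; some package versions export it as `GlobalPool`, so adjust to whatever the installed version provides.

```julia
using GraphNeuralNetworks, Flux
using Statistics: mean

g = rand_graph(10, 30)
x = rand(Float32, 3, 10)

# DGL's AvgPooling: average node features into one vector per graph.
# The layer name may be GlobalPooling or GlobalPool depending on the version.
pool = GlobalPool(mean)
y = pool(g, x)                # 3 x 1, one pooled column per graph in the batch
```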
Batching and Reading Out Ops
https://docs.dgl.ai/en/0.6.x/api/python/dgl.html#batching-and-reading-out-ops
- batch. Use `Flux.batch` or `SparseArrays.blockdiag` (see the sketch after this list)
- unbatch
- readout_nodes (called `reduce_nodes` here)
- readout_edges (called `reduce_edges` here)
- sum_nodes # use reduce_nodes(+, g, x)
- sum_edges # use reduce_edges(+, g, x)
- mean_nodes
- mean_edges
- max_nodes
- max_edges
- softmax_nodes
- softmax_edges
- broadcast_nodes
- broadcast_edges
- topk_nodes
- topk_edges
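A minimal sketch of the batching and readout replacements listed above (`Flux.batch` and `reduce_nodes`); the toy graphs and feature sizes are made up.

```julia
using GraphNeuralNetworks, Flux
using Statistics: mean

g1 = rand_graph(4, 8)
g2 = rand_graph(6, 12)

# DGL's dgl.batch: merge several graphs into one block-diagonal batched graph.
gbatch = Flux.batch([g1, g2])

x = rand(Float32, 3, gbatch.num_nodes)

# DGL's sum_nodes / mean_nodes: per-graph reductions of node features.
xsum  = reduce_nodes(+, gbatch, x)      # 3 x 2, one column per graph
xmean = reduce_nodes(mean, gbatch, x)   # mean_nodes equivalent
```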
Adjacency Related Utilities
- khop_adj
- laplacian_lambda_max
nn.functional
https://docs.dgl.ai/api/python/nn.functional.html
- edge_softmax (`softmax_edge_neighbors` here; see the sketch below)
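A minimal sketch of the `edge_softmax` replacement, assuming `softmax_edge_neighbors(g, e)` normalizes edge scores over the incoming edges of each destination node (check the docstring of the installed version).

```julia
using GraphNeuralNetworks

g = rand_graph(10, 30)
e = rand(Float32, 1, 30)            # one attention score per edge

# DGL's edge_softmax: scores of edges sharing a destination node sum to 1.
α = softmax_edge_neighbors(g, e)    # same size as e
```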
optim
https://docs.dgl.ai/api/python/dgl.optim.html
- Sparse Adam
- Sparse AdaGrad
nn Utility Modules
- Sequential (`GNNChain` here; see the sketch after this list)
- WeightBasis
- KNNGraph
- SegmentedKNNGraph
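To illustrate the Sequential mapping above: a minimal sketch of `GNNChain`, which threads the graph through graph layers and plain Flux layers alike (layer sizes are illustrative).

```julia
using GraphNeuralNetworks, Flux

# DGL's nn.Sequential is GNNChain here.
model = GNNChain(GCNConv(3 => 8, relu),
                 GCNConv(8 => 8, relu),
                 Dense(8, 2))

g = rand_graph(10, 30)
x = rand(Float32, 3, 10)
y = model(g, x)                     # 2 x 10 node-level predictions
```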
nn NodeEmbedding Module
- NodeEmbedding
Sampling and Stochastic training
.....
Distributed Training
....