object GraphAttention
- Companion: class
Type members
Value members
Concrete methods
def apply[S : Sc](nodeDim: Int, edgeDim: Int, attentionKeyHiddenDimPerHead: Int, attentionNumHeads: Int, valueDimPerHead: Int, dropout: Double, tOpt: STenOptions, dotProductAttention: Boolean, nonLinearity: Boolean): GraphAttention
Graph Attention Network (https://arxiv.org/pdf/1710.10903.pdf). The non-linearity of Eq. 4 and dropout are not applied to the final vertex activations.
Self edges must already be present in the graph.
- Returns:
the next node representation (without ReLU or dropout) and a tensor with the original node and edge features aligned as [N_i, N_j, E_ij]
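
A minimal usage sketch of the `apply` factory above. The parameter names and types come from the signature in this page; the `Scope.root` block, the `lamp.nn.graph` package path, and `STenOptions.d` (CPU, double precision) are assumptions about the surrounding lamp API, not guaranteed by this page:

```scala
import lamp._                     // assumed: Scope, STenOptions
import lamp.nn.graph.GraphAttention // assumed package path

// Scope provides the implicit Sc context required by apply[S : Sc].
Scope.root { implicit scope =>
  val layer = GraphAttention(
    nodeDim = 64,                      // dimension of input node features
    edgeDim = 16,                      // dimension of input edge features
    attentionKeyHiddenDimPerHead = 8,  // key size per attention head
    attentionNumHeads = 4,             // number of attention heads
    valueDimPerHead = 16,              // value size per attention head
    dropout = 0.1,                     // dropout on attention weights
    tOpt = STenOptions.d,              // assumed: CPU double-precision tensor options
    dotProductAttention = false,       // false: additive attention as in the paper
    nonLinearity = true                // apply the Eq. 4 non-linearity internally
  )
  // layer can then be used as a module on a graph whose self edges
  // are already present, per the note above.
}
```

Since the returned activations carry no ReLU or dropout, a caller stacking several of these layers is expected to add those between layers itself.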