N1H111SM's Miniverse

2020/02/28


# Architecture

## Discussion and Future Work

The time complexity of a single GAT attention head computing $F^\prime$ features may be expressed as $O(|V|FF^\prime + |E|F^\prime)$, where $F$ is the number of input features and $|V|$ and $|E|$ are the numbers of nodes and edges. Since this is the cost of just one attention head, the overall complexity of a multi-head layer is still relatively high.
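To see where the two terms in the complexity bound come from, here is a minimal numpy sketch of one attention head (variable names are illustrative, not from the paper's reference code): projecting all $|V|$ node feature vectors costs $O(|V|FF^\prime)$, and computing one attention coefficient per edge costs $O(|E|F^\prime)$.

```python
import numpy as np

rng = np.random.default_rng(0)
V, F, F_out = 5, 4, 3                      # |V| nodes, F input features, F' output features
X = rng.normal(size=(V, F))                # node features
W = rng.normal(size=(F, F_out))            # shared linear projection
a = rng.normal(size=(2 * F_out,))          # attention vector
edges = [(0, 1), (1, 0), (1, 2), (2, 1), (3, 4), (4, 3)]
edges += [(i, i) for i in range(V)]        # self-loops, as in GAT

H = X @ W                                  # projection: O(|V| F F')

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)

# Unnormalized attention logit, one per edge: O(|E| F')
logits = {(i, j): leaky_relu(a @ np.concatenate([H[i], H[j]]))
          for i, j in edges}

# Softmax-normalize over each node's neighborhood, then aggregate
H_new = np.zeros_like(H)
for i in range(V):
    nbrs = [j for (u, j) in edges if u == i]
    e = np.array([logits[(i, j)] for j in nbrs])
    alpha_ij = np.exp(e - e.max())
    alpha_ij /= alpha_ij.sum()
    H_new[i] = sum(w * H[j] for w, j in zip(alpha_ij, nbrs))

print(H_new.shape)  # (5, 3)
```

A multi-head layer repeats this $K$ times with independent $W$ and $a$, which is why the per-head cost matters.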

The authors point to several directions for future work:

• Overcoming the practical problems of handling larger batch sizes.
• Taking advantage of the attention mechanism to perform a thorough analysis of the model's interpretability.
• Extending the model to incorporate edge features (possibly indicating relationships among nodes).
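On the last point, one possible way to fold edge features in (a sketch of my own, not the paper's method) is to concatenate a per-edge feature vector with the projected endpoint features before applying the attention vector:

```python
import numpy as np

rng = np.random.default_rng(1)
F_out, F_edge = 3, 2
# Attention vector now also sees the edge features
a = rng.normal(size=(2 * F_out + F_edge,))

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)

def edge_logit(h_i, h_j, e_ij):
    """Unnormalized attention score for edge (i, j), conditioned on its edge feature e_ij."""
    return leaky_relu(a @ np.concatenate([h_i, h_j, e_ij]))

h_i, h_j = rng.normal(size=F_out), rng.normal(size=F_out)
e_ij = rng.normal(size=F_edge)             # e.g. an encoded relation type
score = float(edge_logit(h_i, h_j, e_ij))
```

The rest of the head (neighborhood softmax and aggregation) is unchanged; only the logit computation grows by the edge-feature dimension.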