A Plain-Language Introduction to GNNs (Graph Neural Networks)
2022-06-26 01:37:00 【Ming Qi】
Applications: node classification and graph classification.
Spatial domain: spatial models work directly with the graph structure, i.e., they consider the geometric relationship between the target node and other nodes (whether an edge connects them).
Representative model: GAT (Graph Attention Networks)
GAT uses an attention mechanism to compute a weighted sum of neighboring node features. The weights on the neighbors' features depend entirely on the node features themselves and are independent of the graph structure.
(Pooling in convolutional neural networks can be viewed as a special, equally-weighted attention mechanism; in other words, attention is a general pooling method that assigns preferences over its inputs, i.e., a pooling method with parameters.)
Figure 1: GAT network diagram and update formulas
Some notes on the update formulas:
Formula (1), z_i^(l) = W^(l) h_i^(l), applies a linear transformation to the layer-l node embedding h_i^(l); W^(l) is the trainable parameter matrix of the transformation.
Formula (2), e_ij^(l) = LeakyReLU(a^(l)T (z_i^(l) || z_j^(l))), computes the raw attention score between a pair of nodes. It first concatenates the z embeddings of the two nodes (note that || denotes concatenation here), then takes the dot product of the concatenated vector with a learnable weight vector a^(l), and finally applies a LeakyReLU activation. This form of attention is usually called additive attention, as opposed to the dot-product attention used in the Transformer.
Formula (3), α_ij^(l) = exp(e_ij^(l)) / Σ_{k∈N(i)} exp(e_ik^(l)), applies a softmax over the raw attention scores of all edges incident to a node, yielding the normalized attention weights.
Formula (4), h_i^(l+1) = σ(Σ_{j∈N(i)} α_ij^(l) z_j^(l)), is a node-feature update rule shaped like GCN's: an attention-weighted sum over the features of all neighboring nodes.
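As a minimal illustrative sketch (not the original post's code; the function and variable names are my own), formulas (1)–(4) for a single attention head can be written in NumPy over a dense adjacency matrix:

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_head(H, A, W, a):
    """One GAT attention head (dense NumPy sketch).

    H: (N, F) node features; A: (N, N) adjacency with self-loops;
    W: (F, F2) weight matrix; a: (2*F2,) attention vector.
    Returns the updated features and the attention matrix.
    """
    Z = H @ W                        # formula (1): linear transform
    F2 = Z.shape[1]
    # formula (2): e_ij = LeakyReLU(a^T [z_i || z_j]); splitting a into
    # two halves avoids materializing every concatenated pair
    src = Z @ a[:F2]                 # contribution of z_i
    dst = Z @ a[F2:]                 # contribution of z_j
    e = leaky_relu(src[:, None] + dst[None, :])
    # formula (3): softmax restricted to actual neighbors
    e = np.where(A > 0, e, -np.inf)
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)
    # formula (4): attention-weighted sum of neighbor features
    return alpha @ Z, alpha
```

Note how the weights alpha depend only on the node features; the adjacency matrix merely masks out non-neighbors, matching the claim above that the weights are independent of the graph structure.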
Spectral domain:
Representative model: GCN (Graph Convolutional Network)
Advantages: fewer parameters.
Disadvantages: does not easily extend to dynamic graphs; within a neighborhood of the same order, every neighbor is assigned exactly the same weight (different nodes in the neighborhood cannot receive different weights).
A graph convolution operation computes a normalized sum of neighboring node features:
h_i^(l+1) = σ( Σ_{j∈N(i)} (1/c_ij) W^(l) h_j^(l) )
where N(i) is the set of neighbors at distance 1 from node i. A self-loop edge connecting node i to itself is usually added so that i is also included in N(i);
c_ij = sqrt(|N(i)| |N(j)|) is a normalization constant based on the graph structure;
σ is an activation function (GCN uses ReLU);
W^(l) is the weight matrix of the node-feature transformation, shared by all nodes.
Because c_ij is tied to the structure of the graph, a GCN model learned on one graph is hard to apply directly to another graph.
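A minimal NumPy sketch of this update rule (my own illustrative code), using the symmetric normalization c_ij = sqrt(|N(i)||N(j)|):

```python
import numpy as np

def gcn_layer(H, A, W):
    """One GCN layer: h_i' = ReLU(sum_j (1/c_ij) W h_j). Dense NumPy sketch."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops so i is in N(i)
    d = A_hat.sum(axis=1)                     # degrees |N(i)|
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    # D^{-1/2} A_hat D^{-1/2} divides edge (i, j) by sqrt(|N(i)||N(j)|) = c_ij
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
    return np.maximum(0.0, A_norm @ H @ W)    # sigma = ReLU; W shared by all nodes
```

Because A_norm is computed from the degrees of this particular graph, the layer's behavior is baked to that graph's structure, which is exactly the transfer limitation mentioned above.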
Common steps (of spectral methods):
- Process the graph adjacency matrix
- Eigendecompose the graph adjacency matrix to obtain its eigenvalues and eigenvectors
- The core difference between methods lies in how they collect and aggregate the feature representations of neighbor nodes at distance 1
- Treat the eigenvectors as constants and let the convolution kernel act on the eigenvalues
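The steps above can be sketched in NumPy as follows (illustrative only; the kernel g and the names are my assumptions, and the graph matrix must be symmetric for `eigh`):

```python
import numpy as np

def spectral_conv(H, A, g):
    """Spectral graph convolution sketch.

    Eigendecompose the (symmetric) graph matrix A; the eigenvectors U are
    treated as constants fixed by the graph, while the learnable kernel g
    acts only on the eigenvalues.
    """
    lam, U = np.linalg.eigh(A)               # eigendecomposition of A
    return U @ np.diag(g(lam)) @ U.T @ H     # filter in the spectral domain
```

With the identity kernel g(λ) = λ this reduces to ordinary multiplication by A, since U diag(λ) Uᵀ = A.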
GAT replaces the fixed normalization in graph convolution with an attention mechanism: the original normalization constant c_ij is replaced by a neighbor-feature aggregation that uses attention weights.
Multi-head attention
Analogous to multiple channels in a convolutional neural network, GAT introduces multi-head attention to enrich the model's capacity and stabilize training. Each attention head has its own parameters. There are two common ways to merge the outputs of the attention heads:
Concatenation: h_i^(l+1) = ||_{k=1}^{K} σ( Σ_{j∈N(i)} α_ij^(k) W^(k) h_j^(l) )
Averaging: h_i^(l+1) = σ( (1/K) Σ_{k=1}^{K} Σ_{j∈N(i)} α_ij^(k) W^(k) h_j^(l) )
Here K is the number of attention heads. The authors suggest concatenation for intermediate layers and averaging for the last layer.
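A small sketch of the two merge strategies (illustrative; the function name and interface are my own, and the final activation σ is omitted):

```python
import numpy as np

def combine_heads(head_outputs, mode="concat"):
    """Merge K attention-head outputs, each of shape (N, F2).

    'concat' (suggested for intermediate layers) stacks the heads' features
    side by side; 'mean' (suggested for the final layer) averages them.
    """
    if mode == "concat":
        return np.concatenate(head_outputs, axis=1)   # shape (N, K*F2)
    return np.mean(np.stack(head_outputs), axis=0)    # shape (N, F2)
```

Concatenation grows the feature dimension by a factor of K, which is why it is reserved for hidden layers; the output layer averages so the dimension matches the number of classes.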