
Self-organizing map neural network (SOM)

2022-06-23 08:23:00 calm-one

First, let's understand from a diagram what a self-organizing map (SOM, Self-Organising Map) neural network is.
[Figure: SOM network structure diagram; original image not preserved]

Network structure: an input layer plus an output layer. The number of input-layer neurons equals the feature dimension of a sample, while the size of the output layer is chosen by the designer. Looking at the diagram first: a sample is mapped to one node of the output layer, and that node "radiates" its influence to the surrounding nodes, which, simply put, become similar to it. Samples of the same class are thus mapped to the same region of the output layer, which achieves the clustering effect.

1. Background

The self-organizing map (Self-Organizing Map, SOM) is an important class of unsupervised learning methods. It can be used for clustering, high-dimensional visualization, data compression, feature extraction, and more. It incorporates many of the signal-processing mechanisms of neurons in the human brain and has a distinctive structure.

The model was proposed in 1981 by Teuvo Kohonen, a professor at the University of Helsinki in Finland, so it is also called a Kohonen network. Kohonen's view was that when a neural network receives an external input pattern, it divides itself into different corresponding regions, each region responding to the input pattern with its own characteristics, and that this process happens automatically. Based on this view he proposed self-organizing feature mapping, whose behavior resembles the self-organizing character of the human brain.

A self-organizing map is essentially a two-layer neural network, consisting of an input layer and an output layer (the competitive layer). The input layer plays the role of the retina sensing external input, and the output layer plays the role of the cerebral cortex responding to it. **The number of output-layer neurons is usually the number of clusters, one for each class to be formed.** Training uses "competitive learning": each input sample finds the output-layer node that matches it best, called the activated node or winning neuron; the parameters of the activated node are then updated by stochastic gradient descent, and the nodes adjacent to it also update their parameters, to a degree that depends on their distance from the activated node. This competition can be realized through lateral inhibitory connections (negative feedback paths) between the neurons. The output-layer nodes of a SOM have a topological relationship, which is chosen according to the task: for a one-dimensional model, the nodes form a "one-dimensional linear array"; for a two-dimensional topology, a "two-dimensional planar array", as shown in the diagrams below. Higher-dimensional topologies, such as a "3D grid array", also exist but are uncommon.
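The text above does not write out the update rule explicitly. As a sketch, the standard Kohonen formulation of the competition and neighborhood update is the following, where $x$ is the current input sample, $w_i$ is the weight vector of output node $i$, $r_i$ is that node's coordinate on the grid, and $\eta(t)$ and $\sigma(t)$ are a learning rate and a neighborhood radius that typically decay over time $t$:

$$
c = \arg\min_i \lVert x - w_i \rVert, \qquad
w_i \leftarrow w_i + \eta(t)\, h_{c,i}(t)\, (x - w_i), \qquad
h_{c,i}(t) = \exp\!\left(-\frac{\lVert r_c - r_i \rVert^2}{2\sigma(t)^2}\right)
$$

Nodes close to the winner $c$ on the grid receive a large $h_{c,i}$ and move strongly toward the sample, while distant nodes barely move, which is exactly the "radiation" effect described at the beginning.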

In short: first, a SOM usually uses a two-layer network structure to model the input space, with each sample corresponding to one activated node; second, the neighborhood relationship among the output-layer nodes means that nodes within a neighborhood correspond to similar samples.

The figures below show one-dimensional and two-dimensional SOM network structures.

2. Algorithm flow

Suppose the input space is $d$-dimensional, the sample set is $\{x_i \mid x_i \in \mathbb{R}^{d},\ i = 1, 2, \cdots, n\}$, where $n$ is the number of samples, and the connection weights are $\{w_i \mid w_i \in \mathbb{R}^{d},\ i = 1, 2, \cdots, N\}$, where $N$ is the number of output-layer nodes.

[Figure: SOM algorithm flow; original image not preserved]
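Since the figure with the concrete algorithm steps did not survive, here is a minimal Python sketch of a typical SOM training loop, using the notation above and the update rule sketched earlier. The function name `train_som`, the rectangular grid, and the exponential decay schedules for the learning rate and neighborhood radius are illustrative assumptions, not the author's original code.

```python
import numpy as np

def train_som(X, rows=10, cols=10, n_iters=1000, eta0=0.5, sigma0=None, seed=0):
    """Train a SOM on data X of shape (n, d); returns weights of shape (rows, cols, d)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    if sigma0 is None:
        sigma0 = max(rows, cols) / 2.0          # initial neighborhood radius (assumed > 1)
    # Initialize the weight vectors randomly within the data range.
    W = rng.uniform(X.min(axis=0), X.max(axis=0), size=(rows, cols, d))
    # Grid coordinates of each output node, used by the neighborhood function.
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
    tau = n_iters / np.log(sigma0)              # decay constant (a common choice)

    for t in range(n_iters):
        x = X[rng.integers(n)]                  # pick one sample at random
        # 1) Competition: find the best-matching unit (winning neuron).
        dists = np.linalg.norm(W - x, axis=-1)                  # (rows, cols)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)
        # 2) Cooperation: Gaussian neighborhood around the BMU on the grid.
        sigma = sigma0 * np.exp(-t / tau)
        eta = eta0 * np.exp(-t / tau)
        grid_dist2 = np.sum((grid - np.array(bmu)) ** 2, axis=-1)
        h = np.exp(-grid_dist2 / (2.0 * sigma ** 2))            # (rows, cols)
        # 3) Adaptation: move the BMU and its neighbors toward the sample.
        W += eta * h[..., None] * (x - W)
    return W
```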

How to understand the learning: the relationships among the data samples are fitted by the two-dimensional grid, so that samples of one class are mapped to one node. SOM is therefore an adaptive clustering algorithm, and it also achieves dimensionality reduction: samples that are close in the high-dimensional space end up mapped to nodes that are close on the grid. That is only an intuitive picture; the core of the method is to make the weight vector of an output node agree as closely as possible with the samples it represents, i.e., to learn a prototype for each class of samples.

[Figure: weight update; original image not preserved]

The weights are updated by a gradient-descent-style rule, which should be easy to follow: it carries out the optimization of the node parameters.
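To make the clustering interpretation concrete, here is a hypothetical usage of the `train_som` sketch above on toy data: after training, each sample is assigned to the output node whose weight vector it is closest to, and that node index serves as its cluster label.

```python
import numpy as np

# Toy data: three Gaussian blobs in 2-D (purely illustrative).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(100, 2))
               for c in ([0, 0], [5, 5], [0, 5])])

W = train_som(X, rows=5, cols=5, n_iters=2000)     # from the sketch above
flat_W = W.reshape(-1, X.shape[1])                 # (N, d) prototype vectors
# Cluster label of each sample = index of its best-matching output node.
labels = np.argmin(np.linalg.norm(X[:, None, :] - flat_W[None, :, :], axis=-1), axis=1)
print(labels[:10])
```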

Specific details:

[Figure: further details; original image not preserved]

Reference links:

  1. Paper link
  2. 《百面机器学习》 (Baimian Machine Learning)
  3. Images from: "Python and Artificial Intelligence - SOM - Self-Organizing Map Networks - 1" (Bilibili)

Copyright notice
This article was created by [calm-one]; when reposting, please include a link to the original. Thanks.
https://yzsam.com/2022/174/202206230756386507.html