Self organizing map neural network (SOM)
2022-06-23 08:23:00 【calm-one】
First, let us understand what a self-organizing map (SOM, Self-Organising Map) neural network is from a diagram.

[Figure: SOM network structure (original image unavailable)]
Network structure: an input layer plus an output layer. The number of input-layer neurons equals the feature dimension of a sample, while the size of the output layer is defined by the user. Reading the diagram: a sample is mapped to one node of the output layer, and that node radiates its influence to its surrounding nodes. In short, samples of the same class are mapped to the same specific node (or neighborhood) of the output layer, which achieves a clustering effect.
1. Background
The self-organizing map neural network (Self-Organizing Map, SOM) is an important class of unsupervised learning methods. It can be used for clustering, high-dimensional visualization, data compression, feature extraction, and more. It incorporates many of the signal-processing mechanisms of human brain neurons and has unique structural characteristics.
The model was proposed in 1981 by Teuvo Kohonen, a professor at the University of Helsinki in Finland, so it is also called the Kohonen network. Kohonen believed that when a neural network receives external input patterns, it divides itself into different corresponding regions, each region having its own response characteristics for the input pattern, and that this process happens automatically. Based on this view he proposed self-organizing feature mapping, whose characteristics resemble the self-organization of the human brain.
The self-organizing map is essentially a two-layer neural network, consisting of an input layer and an output layer (the competitive layer). The input layer simulates the retina sensing external input; the output layer simulates the response of the cerebral cortex. **The number of output-layer neurons is usually the number of clusters, one for each class to be formed.** Training uses "competitive learning": each input sample finds the output-layer node that best matches it, called the activated node or winning neuron; the parameters of the activated node are then updated by stochastic gradient descent, and at the same time the nodes adjacent to the activated node also update their parameters by an amount that depends on their distance from it. This competition can be realized through lateral inhibitory connections (negative feedback paths) between neurons.

The output-layer nodes of a self-organizing map have a topological relationship, which is chosen according to the task: for a one-dimensional model, the nodes form a "one-dimensional linear array"; for a two-dimensional topology, a "two-dimensional planar array", as shown in Figure 5.8. Higher-dimensional topologies such as a "three-dimensional grid array" also exist but are uncommon.
To summarize: first, a SOM is usually a two-layer network whose structure mirrors the input space, with each sample corresponding to one activated node; second, the topological relationship between output-layer nodes concentrates similar samples on nodes within the same neighborhood, so a neighborhood corresponds to one class of similar samples.
The figure below shows SOM network diagrams in one and two dimensions.
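The "competition" step described above can be sketched in code. This is a minimal illustration, not the article's own implementation: the grid size, feature dimension, and random initialization are all assumptions made for the example.

```python
import numpy as np

# Hypothetical sizes: a 10x10 output grid, 3-dimensional input features.
GRID_H, GRID_W, DIM = 10, 10, 3

rng = np.random.default_rng(0)
# One weight vector per output-layer node, same dimension as a sample.
weights = rng.random((GRID_H, GRID_W, DIM))

def best_matching_unit(x, weights):
    """Return the (row, col) of the winning neuron: the output node
    whose weight vector is closest (Euclidean distance) to sample x."""
    dists = np.linalg.norm(weights - x, axis=-1)   # shape (GRID_H, GRID_W)
    return np.unravel_index(np.argmin(dists), dists.shape)

x = rng.random(DIM)
bmu = best_matching_unit(x, weights)
```

The winning neuron is simply the nearest weight vector on the grid; the grid coordinates returned here are what give the output layer its topological structure.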
2. Algorithm flow
Suppose the input space is $d$-dimensional, with sample set $\{x_i \mid x_i \in \mathbb{R}^d,\ i = 1, 2, \cdots, n\}$, where $n$ is the number of samples. The connection weights are $\{w_i \mid w_i \in \mathbb{R}^d,\ i = 1, 2, \cdots, N\}$, where $N$ is the number of output-layer nodes.
[Figure: SOM algorithm steps (original image unavailable)]
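Since the figure with the algorithm steps did not survive, here is a minimal sketch of a standard SOM training loop. The Gaussian neighborhood function, the linear decay schedules, and all hyperparameter values are assumptions for illustration, not taken from the source.

```python
import numpy as np

def train_som(X, grid_h=8, grid_w=8, n_iters=500, lr0=0.5, sigma0=3.0, seed=0):
    """Train a SOM on X of shape (n_samples, d).
    Returns the weight grid of shape (grid_h, grid_w, d)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    weights = rng.random((grid_h, grid_w, d))
    # Grid coordinates of every output node, used by the neighborhood term.
    coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                                  indexing="ij"), axis=-1)       # (h, w, 2)
    for t in range(n_iters):
        x = X[rng.integers(n)]                       # pick a random sample
        # 1) Competition: find the winning neuron (BMU).
        dists = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)
        # 2) Decay the learning rate and neighborhood radius over time.
        frac = t / n_iters
        lr = lr0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 1e-3
        # 3) Gaussian neighborhood around the BMU on the output grid.
        grid_d2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
        h = np.exp(-grid_d2 / (2.0 * sigma ** 2))    # (grid_h, grid_w)
        # 4) Cooperation: pull the BMU and its neighbors toward the sample,
        #    with strength falling off with grid distance.
        weights += lr * h[..., None] * (x - weights)
    return weights
```

The update in step 4 is exactly the "adjacent nodes also update, by an amount depending on distance" behavior described above.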
Intuitive understanding: the features of the data samples are fitted by the two-dimensional grid, so that each class of samples is mapped to one node. SOM is therefore an adaptive clustering algorithm that also achieves dimensionality reduction: samples that are close in the high-dimensional space remain close after mapping. This is only an intuitive picture; the core essence is to make the weight of each output node as consistent as possible with the samples it represents, i.e. each node learns one class of samples.
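The clustering effect described here can be made concrete: once the weights are trained, each sample is assigned to its best-matching node, and samples sharing a node fall into the same cluster. A minimal sketch (this assignment convention is an assumption, not spelled out in the source):

```python
import numpy as np

def som_cluster_labels(X, weights):
    """Assign each sample in X the flat index of its best-matching
    output node; samples sharing a node form one cluster."""
    h, w, d = weights.shape
    flat = weights.reshape(h * w, d)                          # (N, d)
    # Distance of every sample to every node's weight vector.
    dists = np.linalg.norm(X[:, None, :] - flat[None, :, :], axis=-1)
    return np.argmin(dists, axis=1)                           # (n_samples,)
```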
[Figure: SOM weight-update rule (original image unavailable)]
The weights are updated by a gradient-descent-style rule, which should be easy to understand: it optimizes the node parameters toward the input samples.
Specific details:
[Figure: further algorithm details (original image unavailable)]
References:

- Paper link
- 《百面机器学习》 (Baimian Machine Learning)
- Images from: "Python and Artificial Intelligence – SOM self-organizing map networks, part 1", Bilibili