
Non-local mean filtering / attention mechanisms

2022-07-23 20:04:00 xx_ xjm

One: Non-local means

Mean filtering: centered on a target pixel x, the pixels within a radius r are weighted, summed, and averaged, and the result is taken as the filtered value of x.
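The description above can be sketched as a simple box filter. This is a minimal NumPy illustration (zero-padded borders, equal weights), not an optimized implementation:

```python
import numpy as np

def mean_filter(img, r):
    """Box (mean) filter: each output pixel is the average of the
    (2r+1) x (2r+1) window centered on it (borders zero-padded)."""
    k = 2 * r + 1
    padded = np.pad(img.astype(float), r, mode="constant")
    out = np.zeros(img.shape, dtype=float)
    # Accumulate every shifted copy of the image, then divide by the window size
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

img = np.array([[1, 2, 3],
                [4, 5, 6],
                [7, 8, 9]], dtype=float)
print(mean_filter(img, 1)[1, 1])  # center pixel: mean of all 9 values = 5.0
```

Note that every pixel in the window contributes the same weight 1/(k*k), which is exactly the uniform weighting the next paragraph criticizes.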

Non-local mean filtering: mean filtering takes a weighted sum of the pixels in a window around the target pixel x, but the weights are set by hand, typically all equal to 1. That means every pixel in the window influences the center x equally, which is clearly not right. So how should the weight of each pixel relative to the center x be set? Non-local mean filtering computes, for each pixel position, its influence weight on the center x, and then takes the weighted average. How is that weight computed?

First refer to: Non-local mean filtering (1) _ Shallow Yumo's blog - CSDN Blog _ Non-local mean filtering

That article explains the mechanics clearly, but I don't think the example is well chosen: it sets the image size to 7*7 and the search box also to 7*7, which is hard to follow. Let me add a clarification here.

Suppose instead the image is 256*256, the search box is 7*7, and the neighborhood block is 3*3. For each target pixel x, we need to compute the influence weights of the 49 pixels inside the 7*7 search box on x, the center of the search box. Each weight is obtained by comparing the 3*3 neighborhood block of that point with the 3*3 neighborhood block of the center point: the more similar the two patches, the larger the weight.
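The weight computation just described can be sketched per pixel as follows. This is a minimal, unoptimized NumPy sketch; the smoothing parameter h and the edge handling (skipping candidates whose patch would leave the image) are my assumptions, not part of the original text:

```python
import numpy as np

def nlm_pixel(img, y, x, search_r=3, patch_r=1, h=10.0):
    """Non-local means estimate for pixel (y, x): weight every pixel q in
    the (2*search_r+1)^2 search box by how similar its neighborhood block
    is to the neighborhood block around (y, x), then average."""
    H, W = img.shape

    def patch(cy, cx):
        return img[cy - patch_r:cy + patch_r + 1,
                   cx - patch_r:cx + patch_r + 1].astype(float)

    p_ref = patch(y, x)
    weights, values = [], []
    for qy in range(y - search_r, y + search_r + 1):
        for qx in range(x - search_r, x + search_r + 1):
            # skip candidates whose neighborhood block falls outside the image
            if not (patch_r <= qy < H - patch_r and patch_r <= qx < W - patch_r):
                continue
            d2 = np.mean((p_ref - patch(qy, qx)) ** 2)  # patch distance
            weights.append(np.exp(-d2 / h ** 2))        # similar patch -> large weight
            values.append(img[qy, qx])
    w = np.array(weights)
    return float(np.dot(w / w.sum(), values))           # normalized weighted average

rng = np.random.default_rng(0)
noisy = rng.integers(0, 256, size=(16, 16)).astype(float)
print(nlm_pixel(noisy, 8, 8))
```

On a constant image every patch matches perfectly, all weights are equal, and the output reduces to the plain mean, i.e. the pixel value itself.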

Two: Attention mechanisms

There are two kinds of attention mechanisms: soft attention and hard attention. A simple way to understand the difference is that soft attention learns continuous probability values in [0,1], while hard attention can only learn 0 or 1. In practice we generally use soft attention; hard attention usually requires reinforcement learning to train.
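The [0,1]-versus-{0,1} distinction can be shown in a few lines. This toy sketch uses made-up relevance scores; the softmax/one-hot pairing is a common illustration of the two regimes, not taken from the original post:

```python
import numpy as np

scores = np.array([2.0, 1.0, 0.5])  # assumed raw relevance scores

# Soft attention: softmax yields continuous weights in (0, 1) that sum to 1
soft = np.exp(scores) / np.exp(scores).sum()

# Hard attention: a one-hot selection -- exactly one element gets weight 1
hard = np.zeros_like(scores)
hard[np.argmax(scores)] = 1.0

print(soft)  # roughly [0.63, 0.23, 0.14]: every entry strictly between 0 and 1
print(hard)  # [1., 0., 0.]
```

Because the one-hot selection is non-differentiable, gradients cannot flow through it, which is why hard attention is typically trained with reinforcement learning rather than backpropagation.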

Referenced from: Attention Mechanism - Jianshu (jianshu.io)

Classification of attention mechanisms | Soft Attention and Hard Attention _ Ftwhale's blog - CSDN Blog _ Classification of attention mechanisms

A summary of attention mechanisms in CV _ A cat who has always been a maverick 1994's blog - CSDN Blog _ attention in CV

Soft attention mechanism: for example, computing the attention relationships between channels, with SENet as the representative work; other examples include attention between regions, channel attention, spatial attention, and so on.
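The SENet channel-attention idea can be sketched in NumPy: squeeze each channel to a scalar by global average pooling, pass it through two small fully connected layers, and rescale the channels by the resulting sigmoid weights. The layer sizes and random weights here are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

def se_block(feat, w1, w2):
    """SENet-style channel attention (sketch) on a (C, H, W) feature map."""
    z = feat.mean(axis=(1, 2))               # squeeze: one scalar per channel, (C,)
    s = np.maximum(z @ w1, 0)                # excitation: FC + ReLU (reduced dim)
    s = 1.0 / (1.0 + np.exp(-(s @ w2)))      # FC + sigmoid -> weight in (0,1) per channel
    return feat * s[:, None, None]           # scale each channel by its weight

rng = np.random.default_rng(0)
feat = rng.standard_normal((8, 4, 4))        # toy (C, H, W) feature map
w1 = rng.standard_normal((8, 2))             # assumed reduction 8 -> 2
w2 = rng.standard_normal((2, 8))             # expand back 2 -> 8
out = se_block(feat, w1, w2)
print(out.shape)  # (8, 4, 4)
```

The sigmoid keeps every channel weight strictly inside (0, 1), which is exactly what makes this a soft attention mechanism.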

Hard attention mechanism: computing point-to-point attention, with the paper Non-local Neural Networks as the representative work. It is a typical application of self-attention, and its idea derives from the non-local mean filtering discussed above.
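The core of a non-local block is point-to-point attention: every position attends to every other position, replacing the hand-computed patch similarity of non-local means with a learned embedded similarity. This is a minimal sketch of that computation; the projection sizes and random inputs are assumptions for illustration:

```python
import numpy as np

def non_local_attention(x, theta, phi, g):
    """Non-local block core (sketch): x is (N, C) with N flattened
    spatial positions; theta/phi/g are learned projection matrices."""
    q = x @ theta                              # queries, (N, d)
    k = x @ phi                                # keys, (N, d)
    v = x @ g                                  # values, (N, d)
    logits = q @ k.T                           # pairwise similarity, (N, N)
    w = np.exp(logits - logits.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)          # softmax over all positions
    return w @ v                               # each output mixes every position

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8))               # 16 positions (e.g. a flattened 4x4 map)
theta, phi, g = (rng.standard_normal((8, 4)) for _ in range(3))
print(non_local_attention(x, theta, phi, g).shape)  # (16, 4)
```

Compare this with the `nlm_pixel` sketch earlier: the softmax over pairwise similarities plays the same role as the normalized exp(-d2/h^2) patch weights, but the similarity function is learned instead of fixed.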

One more point: attention can be understood as a way of computing interaction factors between elements, and non-local is one particular way of computing it. For example, the paper "Camouflaged Object Segmentation with Distraction Mining" states:

It consists of a channel attention block and a spatial attention block. Both of them are implemented in a non-local way
Addendum: channel attention and spatial attention
Channel attention: computes the interaction weights between channels; the final weights reflect which channels are more important.
Spatial attention: computes which parts of the spatial map are more important; the final weights reflect which pixel locations matter most.
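A CBAM-style spatial attention map can be sketched similarly to the channel case: pool across the channel dimension with mean and max, combine the two maps, and squash with a sigmoid to get an (H, W) importance map. Summing the pooled maps instead of convolving their concatenation is a simplification I am assuming for brevity:

```python
import numpy as np

def spatial_attention(feat):
    """Spatial attention (sketch) on a (C, H, W) feature map: the
    resulting (H, W) map says which pixel locations matter most."""
    avg = feat.mean(axis=0)                      # channel-wise mean pooling, (H, W)
    mx = feat.max(axis=0)                        # channel-wise max pooling, (H, W)
    attn = 1.0 / (1.0 + np.exp(-(avg + mx)))     # sigmoid -> per-location weight
    return feat * attn[None, :, :]               # reweight every channel at each location

rng = np.random.default_rng(0)
feat = rng.standard_normal((8, 4, 4))
print(spatial_attention(feat).shape)  # (8, 4, 4)
```

Note the symmetry with the SE-style block above: channel attention pools over space to weight channels, while spatial attention pools over channels to weight locations.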

Copyright notice
This article was written by [xx_ xjm]; please include a link to the original when reposting. Thank you.
https://yzsam.com/2022/204/202207231837056604.html