Learning to Generalize Unseen Domains via Memory-based Multi-Source Meta-Learning for Person Re-ID
2022-06-26 09:23:00 【_ Summer tree】
Method overview

Learning to generalize to unseen domains for person re-identification (ReID) via memory-based multi-source meta-learning.

1. The paper proposes a multi-source meta-learning framework that mimics the train-test process of domain generalization (DG). This encourages the model to learn domain-invariant representations and improves its generalization ability.
2. The framework is equipped with a memory-based module that implements the identification loss in a non-parametric way, avoiding the unstable optimization caused by traditional parametric classifiers.
3. MetaBN is proposed to generate diverse meta-test features; these can be injected directly into the meta-learning framework for further improvement.
Paper information
| Paper | Abbrev. | Venue | Year | Baseline | Backbone | Datasets |
|---|---|---|---|---|---|---|
| Learning to Generalize Unseen Domains via Memory-based Multi-Source Meta-Learning for Person Re-Identification | M3L | CVPR | 2021 | QAConv50: Shengcai Liao and Ling Shao. Interpretable and generalizable person re-identification with query-adaptive convolution and temporal lifting. ECCV 2020 | ResNet-50 [11] and IBN-Net50 [28] | Market-1501 [46], DukeMTMC-reID [30, 48], CUHK03 [20, 49], MSMT17 [41] |
Paper: https://openaccess.thecvf.com/content/CVPR2021/html/Zhao_Learning_to_Generalize_Unseen_Domains_via_Memory-based_Multi-Source_Meta-Learning_for_CVPR_2021_paper.html
Code: https://github.com/HeliosZhao/M3L
Work Overview
1, we study the problem of multi-source domain generalization in ReID, which aims to learn a model that can perform well on unseen domains with only several labeled source domains.
2, we propose the Memory-based Multi-Source Meta-Learning (M3L) framework to train a generalizable model for unseen domains. Specifically, a meta-learning strategy is introduced to simulate the train-test process of domain generalization for learning more generalizable models.
3, we propose a memory-based identification loss that is non-parametric and harmonizes with meta-learning.
4, We also present a meta batch normalization layer (MetaBN) to diversify meta-test features, further establishing the advantage of meta-learning.
Summary of results
our M3L can effectively enhance the generalization ability of the model for unseen domains and can outperform the state-of-the-art methods on four large-scale ReID datasets.
Method
Framework
Algorithm description
Implementation details
1. As the framework diagram shows, the core of the method is meta-learning using only source-domain information: at each iteration, the multiple source domains are split into a meta-train part and a meta-test part. The network incorporates the two novel components, MetaBN and the memory-based module.
2. Finally, the network is optimized by combining the meta-train and meta-test losses, as formalized in Eq. 1.
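The split-and-combine procedure in points 1-2 can be sketched as follows. This is a minimal illustration, not the paper's implementation: the rule of holding out one source domain as meta-test per iteration follows the description above, while the equal weighting of the two losses in `combined_objective` is an assumption about the form of Eq. 1.

```python
import numpy as np

def split_domains(domains, rng):
    """Randomly hold out one source domain as meta-test; the rest form meta-train."""
    idx = rng.permutation(len(domains))
    return [domains[i] for i in idx[1:]], [domains[idx[0]]]

def combined_objective(loss_meta_train, loss_meta_test):
    """Eq. 1 combines both losses into a single update; equal weighting is an assumption."""
    return loss_meta_train + loss_meta_test

# Example: three source datasets from the paper, split at one iteration.
meta_train, meta_test = split_domains(
    ["Market-1501", "DukeMTMC-reID", "MSMT17"], np.random.default_rng(0))
```

In the actual framework, the meta-test loss is computed with parameters already updated on the meta-train loss (the inner step of meta-learning), which this sketch omits.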
3. The memory-based module mainly serves to realize a non-parametric objective. The paper analyzes the drawbacks of the two conventional parametric forms and argues that a non-parametric form is needed in meta-learning; it compares features against the memory by similarity to obtain the memory-based identification loss, as shown in Eq. 3.
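A minimal sketch of such a non-parametric identification loss: class scores are the similarities between a feature and the per-class memory slots, scaled by a temperature and fed to a softmax cross-entropy. The temperature value and function names are illustrative assumptions, not the paper's exact Eq. 3.

```python
import numpy as np

def memory_id_loss(feature, memory, label, tau=0.05):
    """Non-parametric identification loss.

    feature: (d,) L2-normalized feature of one sample.
    memory:  (num_classes, d) L2-normalized class memory slots.
    label:   ground-truth class index.
    tau:     softmax temperature (value assumed for illustration).
    """
    logits = memory @ feature / tau        # similarity to every memory slot
    logits = logits - logits.max()         # numerical stability for exp()
    log_prob = logits - np.log(np.exp(logits).sum())
    return -log_prob[label]                # negative log-likelihood of true class
```

Because the memory replaces a learned classifier weight matrix, no classifier parameters need to be re-initialized when the identity label space changes between meta-train and meta-test domains.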
4. The memory is initialized with the mean feature of each class and updated as in Eq. 2.
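A sketch of a momentum-style memory update followed by L2 renormalization; the momentum value and the exact weighting are assumptions, so this only illustrates the general form of Eq. 2, not its precise coefficients.

```python
import numpy as np

def update_memory(memory, feature, label, momentum=0.2):
    """Momentum update of one class slot (sketch of Eq. 2; momentum value assumed).

    The slot drifts toward the new feature, then is renormalized so that
    dot products with L2-normalized features remain cosine similarities.
    """
    memory[label] = momentum * memory[label] + (1.0 - momentum) * feature
    memory[label] /= np.linalg.norm(memory[label])
    return memory
```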
5. Besides the memory-based identification loss, a triplet loss is also used (Eq. 4). This is a common component in many ReID methods, so the article does not elaborate on it.
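For completeness, the standard margin-based triplet loss widely used in ReID looks as follows; the margin value here is an assumption for illustration.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.3):
    """Margin-based triplet loss on Euclidean distances (common ReID form;
    margin value assumed). Pulls same-identity pairs together and pushes
    different-identity pairs apart by at least `margin`."""
    d_ap = np.linalg.norm(anchor - positive)
    d_an = np.linalg.norm(anchor - negative)
    return max(0.0, d_ap - d_an + margin)
```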
6. As for MetaBN, its design is shown in Fig. 4. My understanding is that it combines the meta-test samples with the meta-train domains' statistics and again computes the similarity loss with the memory. A feature is sampled from each meta-train domain as in Eq. 5, then mixed with the meta-test sample features to obtain new mixed features (Eq. 6), where Z is the combination of the z_i; finally, the mixed features are normalized (Eq. 7).
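The sample-mix-normalize steps above can be sketched as follows, assuming the sampled features come from a meta-train domain's Gaussian (BN-style) statistics; the mixing coefficient `lam`, the fixed seed, and all names are illustrative, not the paper's exact formulation.

```python
import numpy as np

def metabn_mix(test_feats, train_mean, train_var, lam=0.5, eps=1e-5, seed=0):
    """Sketch of MetaBN's Eqs. 5-7 (coefficients and names assumed).

    test_feats: (batch, d) meta-test features.
    train_mean, train_var: (d,) statistics of one meta-train domain.
    """
    rng = np.random.default_rng(seed)
    # Eq. 5: sample features z from the meta-train domain's statistics.
    z = train_mean + np.sqrt(train_var) * rng.standard_normal(test_feats.shape)
    # Eq. 6: convexly mix meta-test features with the sampled features.
    mixed = lam * test_feats + (1.0 - lam) * z
    # Eq. 7: re-normalize the mixed features (BN-style, per channel).
    mu, var = mixed.mean(axis=0), mixed.var(axis=0)
    return (mixed - mu) / np.sqrt(var + eps)
```

Injecting meta-train statistics this way diversifies the meta-test features, which is what the paper credits for the additional gain on top of plain meta-learning.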
7. The loss for a single domain in meta-training is given in Eq. 8, and the overall meta-train objective in Eq. 9.
8. The meta-test loss is given in Eq. 10.
Experimental results
Overall evaluation
1. The absolute performance is modest, yet the paper made it into CVPR; I think it is well written, and judging from the contributions, the results are packaged well.
2. The innovations sound impressive, and the contributions are presented prominently.
3. Judging from the writing, the work is quite solid and information-dense. After reading, it also becomes clear why the performance is modest: no target-domain data is used at all, which makes this essentially an open-set problem, so the work is quite valuable.
Citation format
@inproceedings{DBLP:conf/cvpr/ZhaoZYLLLS21,
  author    = {Yuyang Zhao and Zhun Zhong and Fengxiang Yang and
               Zhiming Luo and Yaojin Lin and Shaozi Li and Nicu Sebe},
  title     = {Learning to Generalize Unseen Domains via Memory-based
               Multi-Source Meta-Learning for Person Re-Identification},
  booktitle = {{CVPR}},
  pages     = {6277--6286},
  publisher = {Computer Vision Foundation / {IEEE}},
  year      = {2021}
}