Learning optimal inter-class margin adaptively for few-shot class-incremental learning via neural collapse-based meta-learning
-
- Ran, Hang (author)
- Chinese Academy Of Sciences, Beijing, China; University Of Chinese Academy Of Sciences, Beijing, China
-
- Li, Weijun (author)
- Chinese Academy Of Sciences, Beijing, China; University Of Chinese Academy Of Sciences, Beijing, China
-
- Li, Lusi (author)
- Old Dominion University, Norfolk, United States
-
- Tian, Songsong (author)
- Chinese Academy Of Sciences, Beijing, China; University Of Chinese Academy Of Sciences, Beijing, China
-
- Ning, Xin (author)
- Chinese Academy Of Sciences, Beijing, China; University Of Chinese Academy Of Sciences, Beijing, China; Cognitive Computing Technology Joint Laboratory, Beijing, China
-
- Tiwari, Prayag, 1991- (author)
- Högskolan i Halmstad, Akademin för informationsteknologi
-
- London : Elsevier, 2024
- 2024
- English.
-
In: Information Processing & Management. - London : Elsevier. - 0306-4573 .- 1873-5371. ; 61:3
- Related link:
-
https://doi.org/10.1...
-
https://urn.kb.se/re...
-
https://doi.org/10.1...
-
Abstract
- Few-Shot Class-Incremental Learning (FSCIL) aims to learn new classes incrementally with a limited number of samples per class. It faces the issues of forgetting previously learned classes and overfitting on few-shot classes. An efficient strategy is to learn features that are discriminative in both base and incremental sessions. Current methods improve discriminability by manually designing inter-class margins based on empirical observations, which can be suboptimal. The emerging Neural Collapse (NC) theory provides a theoretically optimal inter-class margin for classification, serving as a basis for computing the margin adaptively. Yet, it is designed for closed, balanced data, not for sequential or few-shot imbalanced data. To address this gap, we propose a Meta-learning- and NC-based FSCIL method, MetaNC-FSCIL, to compute the optimal margin adaptively and maintain it at each incremental session. Specifically, we first compute the theoretically optimal margin based on the NC theory. Then we introduce a novel loss function to ensure that the loss value is minimized precisely when the inter-class margin reaches its theoretical optimum. Motivated by the intuition that "learn how to preserve the margin" matches meta-learning's goal of "learn how to learn", we embed the loss function in base-session meta-training to preserve the margin for future meta-testing sessions. Experimental results demonstrate the effectiveness of MetaNC-FSCIL, achieving superior performance on multiple datasets. The code is available at https://github.com/qihangran/metaNC-FSCIL. © 2024 The Author(s)
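The "theoretically optimal inter-class margin" the abstract refers to comes from the Neural Collapse result that, at convergence, class prototypes form a simplex equiangular tight frame (ETF): K unit-norm vectors whose pairwise cosine similarity is exactly -1/(K-1). A minimal NumPy sketch of constructing such prototypes (function name and dimensions are illustrative, not taken from the paper's code):

```python
import numpy as np

def simplex_etf(K, d, seed=0):
    """Return a d x K matrix whose columns are unit-norm class prototypes
    forming a simplex ETF (requires d >= K)."""
    rng = np.random.default_rng(seed)
    # Random orthonormal basis U (d x K) via reduced QR decomposition.
    U, _ = np.linalg.qr(rng.standard_normal((d, K)))
    # Center and rescale: columns become unit vectors with pairwise
    # inner product -1/(K-1), the NC-optimal inter-class margin.
    return np.sqrt(K / (K - 1)) * U @ (np.eye(K) - np.ones((K, K)) / K)

K, d = 10, 64
M = simplex_etf(K, d)
G = M.T @ M  # Gram matrix: 1 on the diagonal, -1/(K-1) off-diagonal
```

Under this construction the margin needs no manual tuning; it is fully determined by the number of classes K, which is what allows the paper's loss to target it adaptively.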
Subject headings
- NATURVETENSKAP -- Data- och informationsvetenskap (hsv//swe)
- NATURAL SCIENCES -- Computer and Information Sciences (hsv//eng)
Keywords
- Few-shot class-incremental learning
- Meta-learning
- Neural collapse
Publication and content type
- ref (subject category)
- art (subject category)
Find via library
To the institution's database