Ran, Hang (Chinese Academy of Sciences, Beijing, China; University of Chinese Academy of Sciences, Beijing, China)
(author)
Learning optimal inter-class margin adaptively for few-shot class-incremental learning via neural collapse-based meta-learning
- Article/chapter, English, 2024
Publisher, year of publication, extent ...
-
London : Elsevier, 2024
-
print (rdacarrier)
Identifiers
-
LIBRIS-ID:oai:DiVA.org:hh-52738
-
URI: https://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-52738
-
DOI: https://doi.org/10.1016/j.ipm.2024.103664
Supplementary language information
-
Language: English
-
Abstract in: English
Part of subdatabase
Classification
-
Subject category: ref (swepub-contenttype)
-
Subject category: art (swepub-publicationtype)
Notes
-
This work is supported by the National Natural Science Foundation of China (No. 62373343) and the Beijing Natural Science Foundation, China (No. L233036).
-
Few-Shot Class-Incremental Learning (FSCIL) aims to learn new classes incrementally with a limited number of samples per class. It faces issues of forgetting previously learned classes and overfitting on few-shot classes. An efficient strategy is to learn features that are discriminative in both base and incremental sessions. Current methods improve discriminability by manually designing inter-class margins based on empirical observations, which can be suboptimal. The emerging Neural Collapse (NC) theory provides a theoretically optimal inter-class margin for classification, serving as a basis for adaptively computing the margin. Yet, it is designed for closed, balanced data, not for sequential or few-shot imbalanced data. To address this gap, we propose a Meta-learning- and NC-based FSCIL method, MetaNC-FSCIL, to compute the optimal margin adaptively and maintain it at each incremental session. Specifically, we first compute the theoretically optimal margin based on the NC theory. Then we introduce a novel loss function to ensure that the loss value is minimized precisely when the inter-class margin reaches its theoretically best. Motivated by the intuition that “learn how to preserve the margin” matches the meta-learning's goal of “learn how to learn”, we embed the loss function in base-session meta-training to preserve the margin for future meta-testing sessions. Experimental results demonstrate the effectiveness of MetaNC-FSCIL, achieving superior performance on multiple datasets. The code is available at https://github.com/qihangran/metaNC-FSCIL. © 2024 The Author(s)
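The abstract's starting point, that Neural Collapse prescribes a theoretically optimal inter-class margin, refers to the simplex equiangular tight frame (ETF) geometry: for K classes, the class means collapse to unit vectors whose pairwise cosine is -1/(K-1), the widest possible equiangular separation. A minimal sketch of that geometry (the sizes K and d and the random-basis construction are illustrative, not taken from the paper):

```python
import numpy as np

# Illustrative sizes (not from the paper): K classes, feature dimension d >= K
K, d = 10, 64
rng = np.random.default_rng(0)

# Random orthonormal basis P (d x K) via reduced QR
P, _ = np.linalg.qr(rng.standard_normal((d, K)))

# Simplex ETF: the class-mean geometry predicted by Neural Collapse theory
M = np.sqrt(K / (K - 1)) * P @ (np.eye(K) - np.ones((K, K)) / K)

# Each class vector is unit-norm, and every pair meets at the
# maximal-margin cosine of -1/(K-1)
G = M.T @ M
assert np.allclose(np.diag(G), 1.0)
assert np.allclose(G[~np.eye(K, dtype=bool)], -1.0 / (K - 1))
```

No set of K unit vectors can be equiangular with a pairwise cosine below -1/(K-1), which is why this frame serves as an adaptively computable margin target rather than a hand-tuned one.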
Subject headings and genre terms
Added entries (persons, institutions, conferences, titles ...)
-
Li, Weijun (Chinese Academy of Sciences, Beijing, China; University of Chinese Academy of Sciences, Beijing, China)
(author)
-
Li, Lusi (Old Dominion University, Norfolk, United States)
(author)
-
Tian, Songsong (Chinese Academy of Sciences, Beijing, China; University of Chinese Academy of Sciences, Beijing, China)
(author)
-
Ning, Xin (Chinese Academy of Sciences, Beijing, China; University of Chinese Academy of Sciences, Beijing, China; Cognitive Computing Technology Joint Laboratory, Beijing, China)
(author)
-
Tiwari, Prayag, 1991- (Högskolan i Halmstad, Akademin för informationsteknologi) (Swepub:hh) pratiw
(author)
-
Chinese Academy of Sciences, Beijing, China; University of Chinese Academy of Sciences, Beijing, China; Old Dominion University, Norfolk, United States
(creator_code:org_t)
Related titles
-
In: Information Processing & Management. - London : Elsevier. - ISSN 0306-4573, E-ISSN 1873-5371 ; 61:3