SwePub
Record id: swepub:oai:DiVA.org:liu-179906
 


Orthogonal Projection Loss

Ranasinghe, Kanchana (author)
Mohamed bin Zayed University of AI, UAE; Stony Brook University, NY, USA
Naseer, Muzammal (author)
Australian National University, Australia; Mohamed bin Zayed University of AI, UAE
Hayat, Munawar (author)
Monash University, Australia
Khan, Salman (author)
Mohamed bin Zayed University of AI, UAE; Australian National University, Australia
Khan, Fahad Shahbaz, 1983- (author)
Linköping University, Computer Vision, Faculty of Science and Engineering; Mohamed bin Zayed University of AI, UAE
IEEE, 2021
English.
Series: arXiv.org ; 2103.14021
In: 2021 IEEE/CVF International Conference on Computer Vision (ICCV 2021). IEEE. ISBN 9781665428125. pp. 12313-12323
  • Other publication (other academic/artistic)
Abstract
Deep neural networks have achieved remarkable performance on a range of classification tasks, with softmax cross-entropy (CE) loss emerging as the de facto objective function. The CE loss encourages features of a class to have a higher projection score on the true class vector compared to the negative classes. However, this is a relative constraint and does not explicitly force different class features to be well separated. Motivated by the observation that ground-truth class representations in CE loss are orthogonal (one-hot encoded vectors), we develop a novel loss function termed 'Orthogonal Projection Loss' (OPL) which imposes orthogonality in the feature space. OPL augments the properties of CE loss and directly enforces inter-class separation alongside intra-class clustering in the feature space through orthogonality constraints at the mini-batch level. Compared to other alternatives to CE, OPL offers unique advantages: it introduces no additional learnable parameters, does not require careful negative mining, and is not sensitive to the batch size. Given the plug-and-play nature of OPL, we evaluate it on a diverse range of tasks including image recognition (CIFAR-100), large-scale classification (ImageNet), domain generalization (PACS) and few-shot learning (miniImageNet, CIFAR-FS, tiered-ImageNet and Meta-dataset) and demonstrate its effectiveness across the board. Furthermore, OPL offers better robustness against practical nuisances such as adversarial attacks and label noise.
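The abstract describes OPL as pulling same-class features together while pushing different-class features toward orthogonality, computed over each mini-batch. A minimal NumPy sketch of a loss with those properties, based only on the abstract's description (the function name and the exact weighting of the two terms are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def orthogonal_projection_loss(features, labels):
    """Mini-batch orthogonality loss in the spirit of OPL (sketch).

    features: (B, D) float array of feature vectors
    labels:   (B,)   integer class labels
    Rewards same-class cosine similarity near 1 and penalizes
    different-class cosine similarity away from 0 (orthogonality).
    """
    # L2-normalize so pairwise dot products are cosine similarities
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T                                  # (B, B) cosine similarities

    same = labels[:, None] == labels[None, :]      # same-class pair mask
    np.fill_diagonal(same, False)                  # drop self-pairs
    diff = ~same
    np.fill_diagonal(diff, False)

    s = sim[same].mean() if same.any() else 1.0    # intra-class alignment
    d = sim[diff].mean() if diff.any() else 0.0    # inter-class similarity
    return (1.0 - s) + abs(d)                      # 0 when classes align & are orthogonal
```

With two perfectly clustered, mutually orthogonal classes the loss is 0; it grows as same-class features drift apart or different-class features align. Note that, as the abstract states, this adds no learnable parameters: it is computed purely from the batch's features and labels.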

Subject headings

NATURVETENSKAP  -- Data- och informationsvetenskap -- Datorseende och robotik (hsv//swe)
NATURAL SCIENCES  -- Computer and Information Sciences -- Computer Vision and Robotics (hsv//eng)

Publication and Content Type

vet (content type)
ovr (publication type)


