SwePub

The Strong Screening Rule For SLOPE

Larsson, Johan (author)
Lund University, Department of Statistics, Lund University School of Economics and Management (LUSEM)
Bogdan, Malgorzata (author)
Lund University, Department of Statistics, Lund University School of Economics and Management (LUSEM); Wroclaw University
Wallin, Jonas (author)
Lund University, Department of Statistics, Lund University School of Economics and Management (LUSEM)
2020
English, 12 pages
In: Advances in Neural Information Processing Systems. ISSN 1049-5258, pp. 1-12
  • Journal article (peer-reviewed)
Abstract
Extracting relevant features from data sets where the number of observations n is much smaller than the number of predictors p is a major challenge in modern statistics. Sorted L-One Penalized Estimation (SLOPE), a generalization of the lasso, is a promising method within this setting. Current numerical procedures for SLOPE, however, lack the efficiency that respective tools for the lasso enjoy, particularly in the context of estimating a complete regularization path. A key component in the efficiency of the lasso is predictor screening rules: rules that allow predictors to be discarded before estimating the model. This is the first paper to establish such a rule for SLOPE. We develop a screening rule for SLOPE by examining its subdifferential and show that this rule is a generalization of the strong rule for the lasso. Our rule is heuristic, which means that it may discard predictors erroneously. In our paper, however, we show that such situations are rare and easily safeguarded against by a simple check of the optimality conditions. Our numerical experiments show that the rule performs well in practice, leading to improvements by orders of magnitude for data in the p >> n domain, as well as incurring no additional computational overhead when n > p.
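The abstract describes the paper's rule as a generalization of the strong rule for the lasso (Tibshirani et al., 2012): at a new penalty level, a predictor is kept only if its absolute correlation with the current residual exceeds 2*lambda_new - lambda_old. As a minimal sketch of that lasso-level idea (not the SLOPE rule derived in the paper; the function name and interface are illustrative assumptions):

```python
import numpy as np

def strong_rule_lasso(X, residual, lam_prev, lam_next):
    """Strong screening rule for the lasso (illustrative sketch).

    Given the residual r = y - X @ beta_hat(lam_prev) from the previous
    solution on the path, keep predictor j only if
        |x_j^T r| >= 2 * lam_next - lam_prev.
    Returns the indices of the predictors that survive the screen.
    """
    c = np.abs(X.T @ residual)            # absolute correlations with residual
    keep = c >= 2 * lam_next - lam_prev   # strong-rule threshold
    return np.where(keep)[0]

# Toy usage: orthogonal design, threshold = 2*1.5 - 2.0 = 1.0,
# so only predictors with |correlation| >= 1.0 are retained.
idx = strong_rule_lasso(np.eye(3), np.array([3.0, 1.0, 0.1]), 2.0, 1.5)
```

Because the rule is heuristic, discarded predictors must be re-checked against the KKT optimality conditions after fitting, exactly the safeguard the abstract mentions; the SLOPE version replaces the single threshold with a comparison against the whole sorted penalty sequence.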

Subject headings

NATURAL SCIENCES -- Mathematics -- Probability Theory and Statistics (hsv//eng)
NATURAL SCIENCES -- Mathematics -- Computational Mathematics (hsv//eng)

Keyword

screening rules
lasso
regression
regularization

Publication and Content Type

art (publication type: journal article)
ref (content type: peer-reviewed)

