SwePub
Generalization Bounds via Information Density and Conditional Information Density

Hellström, Fredrik, 1993 (author)
Chalmers tekniska högskola, Chalmers University of Technology
Durisi, Giuseppe, 1977 (author)
Chalmers tekniska högskola, Chalmers University of Technology
2020
English.
In: IEEE Journal on Selected Areas in Information Theory. ISSN 2641-8770; 1(3), pp. 824-839
  • Journal article (peer-reviewed)
Abstract
We present a general approach, based on an exponential inequality, to derive bounds on the generalization error of randomized learning algorithms. Using this approach, we provide bounds on the average generalization error as well as bounds on its tail probability, for both the PAC-Bayesian and single-draw scenarios. Specifically, for the case of sub-Gaussian loss functions, we obtain novel bounds that depend on the information density between the training data and the output hypothesis. When suitably weakened, these bounds recover many of the information-theoretic bounds available in the literature. We also extend the proposed exponential-inequality approach to the setting recently introduced by Steinke and Zakynthinou (2020), where the learning algorithm depends on a randomly selected subset of the available training data. For this setup, we present bounds for bounded loss functions in terms of the conditional information density between the output hypothesis and the random variable determining the subset choice, given all training data. Through our approach, we recover the average generalization bound presented by Steinke and Zakynthinou (2020) and extend it to the PAC-Bayesian and single-draw scenarios. For the single-draw scenario, we also obtain novel bounds in terms of the conditional α-mutual information and the conditional maximal leakage.
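As an illustration of the kind of result the abstract refers to: one classical information-theoretic bound that can be recovered by suitably weakening information-density bounds of this type is the average-generalization bound of Xu and Raginsky (2017). A hedged sketch of that bound (the notation W for the output hypothesis, S for the n-sample training set, and σ for the sub-Gaussianity parameter is assumed here, not taken from this record):

```latex
% Sketch: average-generalization bound for a sigma-sub-Gaussian loss,
% expressed via the mutual information I(W;S) between the training data S
% (n i.i.d. samples) and the output hypothesis W.
\[
  \bigl\lvert \mathbb{E}\!\left[\operatorname{gen}(W,S)\right] \bigr\rvert
  \;\le\;
  \sqrt{\frac{2\sigma^2}{n}\, I(W;S)}
\]
% Valid when the loss \ell(w,Z) is sigma-sub-Gaussian for every fixed
% hypothesis w; I(W;S) is the expectation of the information density
% between the training data and the output hypothesis.
```

The abstract's information-density bounds are stated in terms of the (random) information density itself, which is why they also yield tail bounds in the PAC-Bayesian and single-draw scenarios rather than only this average-case statement.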

Subject headings

NATURAL SCIENCES -- Mathematics -- Computational Mathematics (hsv)
NATURAL SCIENCES -- Mathematics -- Probability Theory and Statistics (hsv)
NATURAL SCIENCES -- Mathematics -- Discrete Mathematics (hsv)
NATURAL SCIENCES -- Mathematics -- Mathematical Analysis (hsv)

Keyword

PAC-Bayes
Information Theory
Statistical Learning Theory

Publication and Content Type

art (subject category)
ref (subject category)


