SwePub

Hamiltonian Monte Carlo with Energy Conserving Subsampling

Dang, Khue-Dung (author)
University of New South Wales, Australia; ARC Centre of Excellence for Mathematical and Statistical Frontiers (ACEMS), Australia
Quiroz, Matias (author)
ARC Centre of Excellence for Mathematical and Statistical Frontiers (ACEMS), Australia; University of Technology Sydney, Australia
Kohn, Robert (author)
University of New South Wales, Australia; ARC Centre of Excellence for Mathematical and Statistical Frontiers (ACEMS), Australia
Tran, Minh-Ngoc (author)
ARC Centre of Excellence for Mathematical and Statistical Frontiers (ACEMS), Australia; University of Sydney, Australia
Villani, Mattias (author)
Linköping University (Division of Statistics and Machine Learning, Faculty of Arts and Sciences), Sweden; Stockholm University (Department of Statistics), Sweden; ARC Centre of Excellence for Mathematical and Statistical Frontiers (ACEMS), Australia
MIT Press, 2019
English.
In: Journal of Machine Learning Research. MIT Press. ISSN 1532-4435, E-ISSN 1533-7928. Vol. 20, pp. 1-31
  • Journal article (peer-reviewed)
Abstract
Hamiltonian Monte Carlo (HMC) samples efficiently from high-dimensional posterior distributions, with proposed parameter draws obtained by iterating a discretized version of the Hamiltonian dynamics. These iterations make HMC computationally costly, especially for problems with large data sets, since each one requires computing the posterior density and its derivatives with respect to the parameters. Naively computing the Hamiltonian dynamics on a subset of the data causes HMC to lose its key ability to generate distant parameter proposals with high acceptance probability. The key insight of our article is that efficient subsampling HMC for the parameters is possible if both the dynamics and the acceptance probability are computed from the same data subsample in each complete HMC iteration. We show that this can be done in a principled way within an HMC-within-Gibbs framework, where the subsample is updated using a pseudo-marginal Metropolis-Hastings (MH) step and the parameters are then updated using an HMC step based on the current subsample. We show that our subsampling methods are fast and compare favorably to two popular sampling algorithms that use gradient estimates from data subsampling. We also explore the current limitations of subsampling HMC algorithms by varying the quality of the variance-reducing control variates used in the estimators of the posterior density and its gradients.
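
To make the two-block scheme in the abstract concrete, here is a minimal, hypothetical Python sketch of an HMC-within-Gibbs sampler of this kind for a toy model y_i ~ N(theta, 1) with a flat prior. It is not the authors' code: the names (hat_log_post, hat_grad, leapfrog), the tuning constants, and the plain n/m-scaled subsample estimator are all illustrative assumptions, and the sketch omits the variance-reducing control variates the paper relies on.

# Schematic sketch (not the paper's implementation) of subsampling HMC
# where the dynamics and the acceptance test share one subsample.
import numpy as np

rng = np.random.default_rng(0)
n, m = 10_000, 100                      # full data size, subsample size
y = rng.normal(1.5, 1.0, n)            # toy data: y_i ~ N(theta, 1)

def hat_log_post(theta, idx):
    # Subsample estimator of the log posterior: scale the subsample
    # log-likelihood sum by n/m (flat prior, additive constants dropped).
    # The paper's estimators add control variates to reduce variance.
    return (n / len(idx)) * np.sum(-0.5 * (y[idx] - theta) ** 2)

def hat_grad(theta, idx):
    # Gradient of the estimated log posterior with respect to theta.
    return (n / len(idx)) * np.sum(y[idx] - theta)

def leapfrog(theta, p, idx, eps, n_steps):
    # Discretized Hamiltonian dynamics driven by the SAME subsample idx
    # that will later enter the acceptance probability.
    p = p + 0.5 * eps * hat_grad(theta, idx)
    for _ in range(n_steps - 1):
        theta = theta + eps * p
        p = p + eps * hat_grad(theta, idx)
    theta = theta + eps * p
    p = p + 0.5 * eps * hat_grad(theta, idx)
    return theta, p

theta = 0.0
idx = rng.choice(n, size=m, replace=False)
draws = []
for _ in range(2_000):
    # Gibbs step 1: pseudo-marginal MH update of the subsample given theta,
    # accepting with the ratio of estimated posterior densities.
    idx_prop = rng.choice(n, size=m, replace=False)
    if np.log(rng.uniform()) < hat_log_post(theta, idx_prop) - hat_log_post(theta, idx):
        idx = idx_prop
    # Gibbs step 2: HMC update of theta given the current subsample. The
    # dynamics and the MH test use the same estimated Hamiltonian, so its
    # near-conservation keeps acceptance high even for distant proposals.
    p0 = rng.normal()
    theta_prop, p_prop = leapfrog(theta, p0, idx, eps=0.01, n_steps=10)
    log_accept = (hat_log_post(theta_prop, idx) - 0.5 * p_prop ** 2) \
               - (hat_log_post(theta, idx) - 0.5 * p0 ** 2)
    if np.log(rng.uniform()) < log_accept:
        theta = theta_prop
    draws.append(theta)

print("posterior mean estimate:", np.mean(draws[500:]))   # close to 1.5

The point of the separate pseudo-marginal step is that the subsample is held fixed for the entire trajectory: refreshing it inside the trajectory (as naive subsampling does) would break the energy conservation that the acceptance probability depends on.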

Subject headings

ENGINEERING AND TECHNOLOGY -- Electrical Engineering, Electronic Engineering, Information Engineering (hsv//eng)
NATURAL SCIENCES -- Computer and Information Sciences (hsv//eng)
NATURAL SCIENCES -- Mathematics -- Probability Theory and Statistics (hsv//eng)

Keywords

Bayesian inference
Big Data
Markov chain Monte Carlo
Estimated likelihood
Stochastic gradient Hamiltonian Monte Carlo
Stochastic Gradient Langevin Dynamics

Publication and content type

ref (peer-reviewed)
art (journal article)
