SwePub
Search the SwePub database


Hit list for search "WFRF:(Kviman Oskar)"


  • Results 1-6 of 6
1.
  • Koptagel, Hazal, et al. (author)
  • VaiPhy: a Variational Inference Based Algorithm for Phylogeny
  • Other publication (other scholarly/artistic). Abstract:
    • Phylogenetics is a classical methodology in computational biology that today has become highly relevant for medical investigation of single-cell data, e.g., in the context of development of cancer. The exponential size of the tree space is unfortunately a formidable obstacle for current Bayesian phylogenetic inference using Markov chain Monte Carlo based methods since these rely on local operations. And although more recent variational inference (VI) based methods offer speed improvements, they rely on expensive auto-differentiation operations for learning the variational parameters. We propose VaiPhy, a remarkably fast VI based algorithm for approximate posterior inference in an augmented tree space. VaiPhy produces marginal log-likelihood estimates on par with the state-of-the-art methods on real data, and is considerably faster since it does not require auto-differentiation. Instead, VaiPhy combines coordinate ascent update equations with two novel sampling schemes: (i) SLANTIS, a proposal distribution for tree topologies in the augmented tree space, and (ii) the JC sampler, to the best of our knowledge the first-ever scheme for sampling branch lengths directly from the popular Jukes-Cantor model. We compare VaiPhy in terms of density estimation and runtime. Additionally, we evaluate the reproducibility of the baselines. We provide our code on GitHub: https://github.com/Lagergren-Lab/VaiPhy.
2.
  • Koptagel, Hazal, 1991-, et al. (author)
  • VaiPhy: a Variational Inference Based Algorithm for Phylogeny
  • 2022
  • In: Proceedings Advances in Neural Information Processing Systems 35 - 36th Conference on Neural Information Processing Systems, NeurIPS 2022. Neural Information Processing Systems Foundation.
  • Conference paper (peer-reviewed). Abstract:
    • Phylogenetics is a classical methodology in computational biology that today has become highly relevant for medical investigation of single-cell data, e.g., in the context of cancer development. The exponential size of the tree space is, unfortunately, a substantial obstacle for Bayesian phylogenetic inference using Markov chain Monte Carlo based methods since these rely on local operations. And although more recent variational inference (VI) based methods offer speed improvements, they rely on expensive auto-differentiation operations for learning the variational parameters. We propose VaiPhy, a remarkably fast VI based algorithm for approximate posterior inference in an augmented tree space. VaiPhy produces marginal log-likelihood estimates on par with the state-of-the-art methods on real data and is considerably faster since it does not require auto-differentiation. Instead, VaiPhy combines coordinate ascent update equations with two novel sampling schemes: (i) SLANTIS, a proposal distribution for tree topologies in the augmented tree space, and (ii) the JC sampler, to the best of our knowledge, the first-ever scheme for sampling branch lengths directly from the popular Jukes-Cantor model. We compare VaiPhy in terms of density estimation and runtime. Additionally, we evaluate the reproducibility of the baselines. We provide our code on GitHub: https://github.com/Lagergren-Lab/VaiPhy.
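The Jukes-Cantor model from which the abstract's JC sampler draws branch lengths has closed-form per-site transition probabilities. A minimal sketch of that standard model (JC69) follows; this is the textbook model the paper cites, not the paper's sampler, and the `evolve` helper is purely illustrative:

```python
import math
import random

def jc_transition(t):
    """Jukes-Cantor (JC69) per-site probabilities after branch length t
    (expected substitutions per site): probability of keeping the same
    nucleotide, and of changing to one *specific* other nucleotide."""
    e = math.exp(-4.0 * t / 3.0)
    return 0.25 + 0.75 * e, 0.25 - 0.25 * e

def evolve(seq, t, seed=0):
    """Evolve a nucleotide sequence along a branch of length t under JC69.
    Illustrative helper, not part of the paper's code."""
    rng = random.Random(seed)
    p_same, _ = jc_transition(t)
    out = []
    for base in seq:
        if rng.random() < p_same:
            out.append(base)
        else:
            out.append(rng.choice([b for b in "ACGT" if b != base]))
    return "".join(out)
```

At t = 0 a site stays put with probability 1; as t grows, all four nucleotides become equally likely (probability 1/4 each), and p_same + 3 * p_change = 1 at every t.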
3.
  • Kviman, Oskar, et al. (author)
  • Cooperation in the Latent Space: The Benefits of Adding Mixture Components in Variational Autoencoders
  • 2023
  • In: Proceedings of the 40th International Conference on Machine Learning, ICML 2023. ML Research Press, pp. 18008-18022.
  • Conference paper (peer-reviewed). Abstract:
    • In this paper, we show how the mixture components cooperate when they jointly adapt to maximize the ELBO. We build upon recent advances in the multiple and adaptive importance sampling literature. We then model the mixture components using separate encoder networks and show empirically that the ELBO is monotonically non-decreasing as a function of the number of mixture components. These results hold for a range of different VAE architectures on the MNIST, FashionMNIST, and CIFAR-10 datasets. In this work, we also demonstrate that increasing the number of mixture components improves the latent-representation capabilities of the VAE on both image and single-cell datasets. This cooperative behavior suggests that Mixture VAEs should be considered a standard approach for obtaining more flexible variational approximations. Finally, Mixture VAEs are here, for the first time, compared and combined with normalizing flows, hierarchical models and/or the VampPrior in an extensive ablation study. Several of our Mixture VAEs achieve state-of-the-art log-likelihood results for VAE architectures on the MNIST and FashionMNIST datasets. The experiments are reproducible using our code, provided here.
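The quantity at the center of this abstract, an ELBO with a mixture variational approximation, can be estimated by Monte Carlo even in a toy one-dimensional model. A hedged sketch follows; the Gaussian model, the hand-picked component parameters, and the sample counts are all made up for illustration (the paper learns the components with separate encoder networks):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D model (made up for illustration): prior p(z) = N(0, 1),
# likelihood p(x | z) = N(x; z, 0.5^2), observation x = 1.0.
x = 1.0

def log_normal(v, mean, std):
    return -0.5 * np.log(2 * np.pi * std**2) - (v - mean) ** 2 / (2 * std**2)

def mixture_elbo(means, stds, n_per_comp=50_000):
    """Monte Carlo ELBO for the mixture q(z) = (1/K) sum_k N(mean_k, std_k^2),
    drawing an equal number of samples from each component."""
    K = len(means)
    total = 0.0
    for k in range(K):
        z = rng.normal(means[k], stds[k], size=n_per_comp)
        # log of the full mixture density, evaluated at component k's samples
        log_q = np.logaddexp.reduce(
            [log_normal(z, means[j], stds[j]) for j in range(K)], axis=0
        ) - np.log(K)
        log_p = log_normal(z, 0.0, 1.0) + log_normal(x, z, 0.5)
        total += np.mean(log_p - log_q) / K
    return total

# Both are lower bounds on log p(x) = log N(1; 0, 1.25) ~ -1.430;
# the component parameters here are hand-picked, not adapted.
elbo_one = mixture_elbo([0.5], [0.5])
elbo_two = mixture_elbo([0.5, 1.1], [0.5, 0.4])
```

The paper's monotonicity result concerns components that jointly adapt; with fixed hand-picked parameters, as here, the two-component bound is merely another valid lower bound on log p(x).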
4.
  • Kviman, Oskar, et al. (author)
  • Multiple Importance Sampling ELBO and Deep Ensembles of Variational Approximations
  • 2022
  • In: International Conference on Artificial Intelligence and Statistics, Vol. 151. ML Research Press.
  • Conference paper (peer-reviewed). Abstract:
    • In variational inference (VI), the marginal log-likelihood is estimated using the standard evidence lower bound (ELBO), or improved versions such as the importance weighted ELBO (IWELBO). We propose the multiple importance sampling ELBO (MISELBO), a versatile yet simple framework. MISELBO is applicable in both amortized and classical VI, and it uses ensembles, e.g., deep ensembles, of independently inferred variational approximations. As far as we are aware, the concept of deep ensembles in amortized VI has not previously been established. We prove that MISELBO provides a tighter bound than the average of standard ELBOs, and demonstrate empirically that it gives tighter bounds than the average of IWELBOs. MISELBO is evaluated in density-estimation experiments that include MNIST and several real-data phylogenetic tree inference problems. First, on the MNIST dataset, MISELBO boosts the density-estimation performance of a state-of-the-art model, nouveau VAE. Second, in the phylogenetic tree inference setting, our framework enhances a state-of-the-art VI algorithm that uses normalizing flows. Beyond its technical benefits, MISELBO unveils connections between VI and recent advances in the importance sampling literature, paving the way for further methodological advances.
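The abstract's central claim, that the ensemble bound is tighter than the average of the individual ELBOs, can be checked numerically on a toy model: the multiple importance sampling bound keeps each component's samples but replaces the per-component density in the denominator with the ensemble mixture. A hedged sketch, with a made-up Gaussian model and hand-picked (not actually inferred) ensemble members:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D model (made up): prior N(0, 1), likelihood N(x; z, 0.5^2), x = 1.0.
x = 1.0

def log_normal(v, mean, std):
    return -0.5 * np.log(2 * np.pi * std**2) - (v - mean) ** 2 / (2 * std**2)

def log_joint(z):
    return log_normal(z, 0.0, 1.0) + log_normal(x, z, 0.5)

# An "ensemble" of S = 2 Gaussian variational approximations
# (parameters hand-picked for the illustration).
means, stds = [0.4, 1.2], [0.5, 0.5]
S, n = len(means), 200_000

avg_elbo = miselbo = 0.0
for s in range(S):
    z = rng.normal(means[s], stds[s], size=n)
    log_qs = log_normal(z, means[s], stds[s])
    # The ensemble bound puts the mixture (1/S) sum_j q_j in the denominator.
    log_qmix = np.logaddexp.reduce(
        [log_normal(z, means[j], stds[j]) for j in range(S)], axis=0
    ) - np.log(S)
    avg_elbo += np.mean(log_joint(z) - log_qs) / S
    miselbo += np.mean(log_joint(z) - log_qmix) / S
```

The gap miselbo - avg_elbo equals (1/S) sum_s KL(q_s || q_mix), which is nonnegative, matching the tightness result the abstract states.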
5.
  • Kviman, Oskar, et al. (author)
  • Variational Resampling
  • 2024
  • In: Proceedings of the 27th International Conference on Artificial Intelligence and Statistics, AISTATS 2024. ML Research Press, pp. 3286-3294.
  • Conference paper (peer-reviewed). Abstract:
    • We cast the resampling step in particle filters (PFs) as a variational inference problem, resulting in a new class of resampling schemes: variational resampling. Variational resampling is flexible as it allows for choices of 1) divergence to minimize, 2) target distribution to input to the divergence, and 3) divergence minimization algorithm. With this novel application of VI to particle filters, variational resampling further unifies these two powerful and popular methodologies. We construct two variational resamplers that replicate particles in order to maximize lower bounds with respect to two different target measures. We benchmark our variational resamplers on challenging smoothing tasks, outperforming PFs that implement the state-of-the-art resampling schemes.
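The abstract's recipe, choose 1) a divergence, 2) a target, and 3) a minimization algorithm, can be shown on the smallest possible instance: deterministically picking particle replication counts that minimize the KL divergence from the replication distribution to the particle weights, by brute force. This toy uses one arbitrary choice for each of the three ingredients and is not the paper's resamplers (which maximize lower bounds with respect to particular target measures):

```python
import itertools
import math

def kl(counts, weights):
    """KL( counts/N || weights ), skipping zero-count particles."""
    N = sum(counts)
    total = 0.0
    for n, w in zip(counts, weights):
        if n > 0:
            p = n / N
            total += p * math.log(p / w)
    return total

def variational_resample(weights, N):
    """Pick replication counts (n_1, ..., n_K) summing to N that minimize
    the KL divergence to the particle weights -- brute-force enumeration,
    feasible only for small K and N."""
    best_counts, best_div = None, math.inf
    for counts in itertools.product(range(N + 1), repeat=len(weights)):
        if sum(counts) == N:
            d = kl(counts, weights)
            if d < best_div:
                best_counts, best_div = counts, d
    return best_counts, best_div

weights = [0.5, 0.3, 0.15, 0.05]
counts, div = variational_resample(weights, 10)
```

Unlike multinomial resampling, the output is deterministic: each particle's replication count is the one the chosen divergence deems closest to its weight.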
6.
  • Martinez Mayorquin, Ramon Heberto, et al. (author)
  • Sequence Disambiguation with Synaptic Traces in Associative Neural Networks
  • 2019
  • In: 28th International Conference on Artificial Neural Networks, ICANN 2019. Cham: Springer Nature, pp. 793-805.
  • Conference paper (peer-reviewed). Abstract:
    • Among the abilities that a sequence processing network should possess, sequence disambiguation, that is, the ability to let temporal context information influence the evolution of the network dynamics, is one of the most important. In this work, we propose an instance of the Bayesian Confidence Propagation Neural Network (BCPNN) that learns sequences with probabilistic associative learning and is able to disambiguate sequences with the use of synaptic traces (low-pass filtered versions of the activity). We first describe how the BCPNN achieves both sequence recall and sequence learning from temporal input. Our main result is that the BCPNN network, equipped with dynamical memory in the form of synaptic traces, is capable of solving the sequence disambiguation problem in a reliable way. We characterize the relationship between the sequence disambiguation capabilities of the network and its dynamical parameters. Furthermore, we show that the inclusion of an additional fast synaptic trace greatly increases the network's disambiguation capabilities.
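The synaptic traces the abstract refers to, low-pass filtered versions of the activity, are simple exponential filters. A minimal sketch with made-up time constants, showing how a fast and a slow trace carry temporal context at two timescales (as in the abstract's "additional fast synaptic trace"):

```python
def trace(activity, tau, dt=1.0):
    """Exponential low-pass filter of an activity sequence:
    dz/dt = (s(t) - z) / tau, integrated with forward-Euler steps."""
    z, out = 0.0, []
    for s in activity:
        z += dt * (s - z) / tau
        out.append(z)
    return out

# A single spike followed by silence; the time constants are made up.
pulse = [1, 0, 0, 0, 0]
slow = trace(pulse, tau=10.0)  # weak response, long memory of the spike
fast = trace(pulse, tau=2.0)   # strong response, forgets quickly
```

Reading both traces together tells the network not only that a spike occurred but roughly how long ago, which is the kind of temporal context that disambiguates otherwise identical sequence states.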