SwePub
Search the SwePub database


Result list for search "WFRF:(Wiqvist Samuel)"

Search: WFRF:(Wiqvist Samuel)

  • Result 1-8 of 8
1.
  • Persson, Sebastian, et al. (author)
  • PEPSDI: Scalable and flexible inference framework for stochastic dynamic single-cell models
  • 2021
  • Other publication (other academic/artistic), abstract:
    • Mathematical modelling is an invaluable tool to describe dynamic cellular processes and to rationalise cell-to-cell variability within the population. This requires statistical methods to infer unknown model parameters from dynamic, multi-individual data accounting for heterogeneity caused by both intrinsic and extrinsic noise. Here we present PEPSDI, a scalable and flexible framework for Bayesian inference in state-space mixed-effects stochastic dynamic single-cell models. Unlike previous frameworks, PEPSDI imposes few modelling assumptions when inferring unknown model parameters from time-lapse data. Specifically, it can infer model parameters when intrinsic noise is modelled by either exact or approximate stochastic simulators, and when extrinsic noise is modelled by either time-varying or time-constant parameters that vary between cells. This allowed us to identify hexokinase activity as a source of extrinsic noise, and to deduce that sugar availability dictates cell-to-cell variability in the budding yeast Saccharomyces cerevisiae SNF1 nutrient sensing pathway.
  •  
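A minimal illustration of the two simulator families mentioned in the abstract above (intrinsic noise modelled by an exact versus an approximate stochastic simulator), using a toy birth-death process with invented rates; this is a sketch of the general idea, not code from PEPSDI.

```python
import numpy as np

def gillespie_birth_death(x0, birth, death, t_end, rng):
    """Exact stochastic simulation (Gillespie/SSA) of a toy birth-death process."""
    t, x = 0.0, x0
    times, states = [t], [x]
    while t < t_end:
        rates = np.array([birth, death * x])          # reaction propensities
        total = rates.sum()
        if total == 0.0:
            break
        t += rng.exponential(1.0 / total)             # waiting time to the next reaction
        x += 1 if rng.random() < rates[0] / total else -1
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

def tau_leap_birth_death(x0, birth, death, t_end, tau, rng):
    """Approximate simulation (tau-leaping) of the same process."""
    x = x0
    times = np.arange(0.0, t_end, tau)
    states = np.empty(times.shape)
    for i in range(times.size):
        states[i] = x
        x = max(x + rng.poisson(birth * tau) - rng.poisson(death * max(x, 0) * tau), 0)
    return times, states

rng = np.random.default_rng(1)
t_exact, x_exact = gillespie_birth_death(10, birth=2.0, death=0.1, t_end=50.0, rng=rng)
t_approx, x_approx = tau_leap_birth_death(10, birth=2.0, death=0.1, t_end=50.0, tau=0.5, rng=rng)
```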
2.
  • Persson, Sebastian, 1996, et al. (author)
  • Scalable and flexible inference framework for stochastic dynamic single-cell models
  • 2022
  • In: PLoS Computational Biology. - : Public Library of Science (PLoS). - 1553-734X .- 1553-7358. ; 18
  • Journal article (peer-reviewed), abstract:
    • Understanding the inherited nature of how biological processes dynamically change over time and exhibit intra- and inter-individual variability, due to the different responses to environmental stimuli and when interacting with other processes, has been a major focus of systems biology. The rise of single-cell fluorescent microscopy has enabled the study of those phenomena. The analysis of single-cell data with mechanistic models offers an invaluable tool to describe dynamic cellular processes and to rationalise cell-to-cell variability within the population. However, extracting mechanistic information from single-cell data has proven difficult. This requires statistical methods to infer unknown model parameters from dynamic, multi-individual data accounting for heterogeneity caused by both intrinsic (e.g. variations in chemical reactions) and extrinsic (e.g. variability in protein concentrations) noise. Although several inference methods exist, the availability of efficient, general and accessible methods that facilitate modelling of single-cell data remains lacking. Here we present a scalable and flexible framework for Bayesian inference in state-space mixed-effects single-cell models with stochastic dynamics. Our approach infers model parameters when intrinsic noise is modelled by either exact or approximate stochastic simulators, and when extrinsic noise is modelled by either time-varying or time-constant parameters that vary between cells. We demonstrate the relevance of our approach by studying how cell-to-cell variation in carbon source utilisation affects heterogeneity in the budding yeast Saccharomyces cerevisiae SNF1 nutrient sensing pathway. We identify hexokinase activity as a source of extrinsic noise and deduce that sugar availability dictates cell-to-cell variability.
  •  
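The "extrinsic noise as parameters that vary between cells" idea in the abstract above can be pictured as a two-level sampling scheme: population-level parameters define a distribution from which each cell draws its own rate, and the cell is then simulated with that rate. The sketch below is a generic illustration of that hierarchy; the log-normal population law, the crude Euler-type single-cell simulator and all names are assumptions for this example, not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Population-level parameters: log-mean and log-sd of a cell-specific rate.
pop_log_mean, pop_log_sd = np.log(2.0), 0.3

def simulate_cell(rate, t_end=50.0, dt=0.5):
    """Cheap stand-in for a stochastic single-cell simulator (intrinsic noise)."""
    x, xs = 10.0, []
    for _ in np.arange(0.0, t_end, dt):
        x += rate * dt - 0.1 * x * dt + rng.normal(0.0, np.sqrt(dt))
        xs.append(x)
    return np.array(xs)

# Extrinsic noise: every cell gets its own rate drawn from the population law.
cell_rates = rng.lognormal(mean=pop_log_mean, sigma=pop_log_sd, size=5)
trajectories = [simulate_cell(r) for r in cell_rates]
```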
3.
  • Picchini, Umberto, 1977, et al. (author)
  • Accelerating delayed-acceptance Markov chain Monte Carlo algorithms
  • 2019
  • Journal article (other academic/artistic), abstract:
    • Delayed-acceptance Markov chain Monte Carlo (DA-MCMC) samples from a probability distribution via a two-stage version of the Metropolis-Hastings algorithm, by combining the target distribution with a "surrogate" (i.e. an approximate and computationally cheaper version) of said distribution. DA-MCMC accelerates MCMC sampling in complex applications, while still targeting the exact distribution. We design a computationally faster, albeit approximate, DA-MCMC algorithm. We consider parameter inference in a Bayesian setting where a surrogate likelihood function is introduced in the delayed-acceptance scheme. When the evaluation of the likelihood function is computationally intensive, our scheme produces a 2-4 times speed-up, compared to standard DA-MCMC. However, the acceleration is highly problem dependent. Inference results for the standard delayed-acceptance algorithm and our approximated version are similar, indicating that our algorithm can return reliable Bayesian inference. As a computationally intensive case study, we introduce a novel stochastic differential equation model for protein folding data.
  •  
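For readers unfamiliar with delayed acceptance, the two-stage Metropolis-Hastings step described in the abstract above can be sketched as follows: a proposal is first screened with the cheap surrogate, and only proposals that survive that stage pay for an evaluation of the expensive target, in a way that preserves the exact distribution. The target and surrogate below are toy placeholders (not the protein-folding SDE model), and the sketch shows plain DA-MCMC rather than the accelerated variant the paper develops.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(theta):
    # Placeholder for the expensive log-posterior (log-likelihood + log-prior).
    return -0.5 * theta ** 2

def log_surrogate(theta):
    # Placeholder for the cheap approximate ("surrogate") log-posterior.
    return -0.55 * theta ** 2

def da_mh_step(theta, step=0.5):
    """One delayed-acceptance Metropolis-Hastings step with a symmetric proposal."""
    prop = theta + step * rng.normal()
    # Stage 1: screen the proposal using only the surrogate.
    if np.log(rng.random()) >= log_surrogate(prop) - log_surrogate(theta):
        return theta           # early rejection; the expensive target is never evaluated
    # Stage 2: correct with the expensive target so the exact distribution is kept.
    log_alpha2 = (log_target(prop) - log_target(theta)) \
               - (log_surrogate(prop) - log_surrogate(theta))
    return prop if np.log(rng.random()) < log_alpha2 else theta

chain = [0.0]
for _ in range(5000):
    chain.append(da_mh_step(chain[-1]))
```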
4.
  •  
5.
  • Wiqvist, Samuel, et al. (author)
  • Efficient inference for stochastic differential equation mixed-effects models using correlated particle pseudo-marginal algorithms
  • 2021
  • In: Computational Statistics and Data Analysis. - : Elsevier BV. - 0167-9473. ; 157
  • Journal article (peer-reviewed), abstract:
    • Stochastic differential equation mixed-effects models (SDEMEMs) are flexible hierarchical models that are able to account for random variability inherent in the underlying time-dynamics, as well as the variability between experimental units and, optionally, account for measurement error. Fully Bayesian inference for state-space SDEMEMs is performed, using data at discrete times that may be incomplete and subject to measurement error. However, the inference problem is complicated by the typical intractability of the observed data likelihood which motivates the use of sampling-based approaches such as Markov chain Monte Carlo. A Gibbs sampler is proposed to target the marginal posterior of all parameter values of interest. The algorithm is made computationally efficient through careful use of blocking strategies and correlated pseudo-marginal Metropolis–Hastings steps within the Gibbs scheme. The resulting methodology is flexible and is able to deal with a large class of SDEMEMs. The methodology is demonstrated on three case studies, including tumor growth dynamics and neuronal data. The gains in terms of increased computational efficiency are model and data dependent, but unless bespoke sampling strategies requiring analytical derivations are possible for a given model, we generally observe an efficiency increase of one order of magnitude when using correlated particle methods together with our blocked-Gibbs strategy.
  •  
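The "correlated pseudo-marginal Metropolis-Hastings steps" mentioned above work by refreshing the auxiliary random numbers that drive the likelihood estimator with a Crank-Nicolson move, so consecutive likelihood estimates are positively correlated and their noise partly cancels in the acceptance ratio. Below is a generic sketch of that single idea with a stand-in noisy likelihood and a flat prior; it is not the authors' particle filters or their blocked Gibbs sampler.

```python
import numpy as np

rng = np.random.default_rng(0)

def loglik_hat(theta, u):
    """Stand-in for a particle-filter log-likelihood estimate driven by
    auxiliary standard-normal numbers u (toy model, illustration only)."""
    return -0.5 * (theta - 1.0) ** 2 + 0.1 * u.mean() * np.sqrt(u.size)

def correlated_pm_step(theta, u, rho=0.99, step=0.3):
    """One correlated pseudo-marginal MH step (flat prior, symmetric proposal)."""
    theta_prop = theta + step * rng.normal()
    # Crank-Nicolson refresh: u_prop stays N(0,1) marginally but is highly
    # correlated with u, so both likelihood estimates share most of their noise.
    u_prop = rho * u + np.sqrt(1.0 - rho ** 2) * rng.normal(size=u.shape)
    log_alpha = loglik_hat(theta_prop, u_prop) - loglik_hat(theta, u)
    if np.log(rng.random()) < log_alpha:
        return theta_prop, u_prop
    return theta, u

theta, u = 0.0, rng.normal(size=500)
for _ in range(2000):
    theta, u = correlated_pm_step(theta, u)
```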
6.
  • Wiqvist, Samuel, et al. (author)
  • Partially Exchangeable Networks and architectures for learning summary statistics in Approximate Bayesian Computation
  • 2019
  • In: Proceedings of the 36th International Conference on Machine Learning. - : PMLR. ; 2019-June, pp. 11795-11804
  • Conference paper (peer-reviewed), abstract:
    • We present a novel family of deep neural architectures, named partially exchangeable networks (PENs), that leverage probabilistic symmetries. By design, PENs are invariant to block-switch transformations, which characterize the partial exchangeability properties of conditionally Markovian processes. Moreover, we show that any block-switch invariant function has a PEN-like representation. The DeepSets architecture is a special case of PEN and we can therefore also target fully exchangeable data. We employ PENs to learn summary statistics in approximate Bayesian computation (ABC). When comparing PENs to previous deep learning methods for learning summary statistics, our results are highly competitive, considering both time series and static models. Indeed, PENs provide more reliable posterior samples even when using less training data.
  •  
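To make the construction above concrete: for data that are Markovian of order d, a PEN-style summary network applies an inner network to every window of d+1 consecutive observations, sums those embeddings (which gives the block-switch invariance), concatenates the first d observations, and maps the result through an outer network. The PyTorch sketch below follows that recipe in spirit; the layer sizes, summary dimension and toy data are illustrative choices, not the architecture used in the paper.

```python
import torch
import torch.nn as nn

class PENSummary(nn.Module):
    """PEN-style summary network for an assumed Markov order d (illustrative sizes)."""
    def __init__(self, d: int, emb: int = 32, n_summaries: int = 4):
        super().__init__()
        self.d = d
        self.phi = nn.Sequential(nn.Linear(d + 1, emb), nn.ReLU(),
                                 nn.Linear(emb, emb), nn.ReLU())
        self.rho = nn.Sequential(nn.Linear(d + emb, emb), nn.ReLU(),
                                 nn.Linear(emb, n_summaries))

    def forward(self, x):                      # x: (batch, series_length)
        windows = x.unfold(1, self.d + 1, 1)   # sliding windows of d+1 observations
        pooled = self.phi(windows).sum(dim=1)  # summing gives the block-switch invariant part
        head = x[:, : self.d]                  # the first d observations enter unpooled
        return self.rho(torch.cat([head, pooled], dim=1))

net = PENSummary(d=1)
x = torch.randn(8, 100)                        # batch of 8 toy time series
summaries = net(x)                             # (8, 4) learned summary statistics
```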
7.
  • Wiqvist, Samuel, et al. (author)
  • Sequential neural posterior and likelihood approximation
  • 2021
  • Other publication (other academic/artistic), abstract:
    • We introduce the sequential neural posterior and likelihood approximation (SNPLA) algorithm. SNPLA is a normalizing flows-based algorithm for inference in implicit models, and therefore is a simulation-based inference method that only requires simulations from a generative model. SNPLA avoids Markov chain Monte Carlo sampling and correction steps of the parameter proposal function that are introduced in similar methods, but that can be numerically unstable or restrictive. By utilizing the reverse KL divergence, SNPLA manages to learn both the likelihood and the posterior in a sequential manner. Over four experiments, we show that SNPLA performs competitively when utilizing the same number of model simulations as used in other methods, even though the inference problem for SNPLA is more complex due to the joint learning of posterior and likelihood function. Due to utilizing normalizing flows, SNPLA generates posterior draws much faster (4 orders of magnitude) than MCMC-based methods.
  •  
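The core trick mentioned above, learning a posterior approximation by minimizing the reverse KL divergence against the prior times a (learned) likelihood, can be illustrated in isolation. In the toy sketch below the normalizing flows are replaced by a diagonal Gaussian posterior and the likelihood is known rather than learned, so this is only a hedged illustration of the reverse-KL training step, not SNPLA itself.

```python
import math
import torch

torch.manual_seed(0)

# Toy conjugate-Gaussian problem: the known log-likelihood and log-prior stand
# in for the flow-based likelihood that SNPLA would learn from simulations.
x_obs = torch.tensor(1.5)

def log_lik(theta):
    return -0.5 * (x_obs - theta) ** 2

def log_prior(theta):
    return -0.5 * theta ** 2

# Variational posterior q(theta) = N(mu, exp(log_sd)^2); a normalizing flow would go here.
mu = torch.zeros(1, requires_grad=True)
log_sd = torch.zeros(1, requires_grad=True)
opt = torch.optim.Adam([mu, log_sd], lr=5e-2)

for _ in range(2000):
    eps = torch.randn(256)
    theta = mu + log_sd.exp() * eps                   # reparameterised draws from q
    log_q = -0.5 * eps ** 2 - log_sd - 0.5 * math.log(2 * math.pi)
    # Reverse KL (up to a constant): E_q[log q(theta) - log p(x_obs | theta) - log p(theta)]
    loss = (log_q - log_lik(theta) - log_prior(theta)).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# The exact posterior here is N(0.75, 0.5); mu and exp(log_sd)^2 should end up close to that.
print(mu.item(), log_sd.exp().item() ** 2)
```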
8.
  • Wiqvist, Samuel (author)
  • Simulation-based Inference: From Approximate Bayesian Computation and Particle Methods to Neural Density Estimation
  • 2021
  • Doctoral thesis (other academic/artistic), abstract:
    • This doctoral thesis in computational statistics utilizes both Monte Carlo methods (approximate Bayesian computation and sequential Monte Carlo) and machine-learning methods (deep learning and normalizing flows) to develop novel algorithms for inference in implicit Bayesian models. Implicit models are those for which calculating the likelihood function is very challenging (and often impossible), but model simulation is feasible. The inference methods developed in the thesis are simulation-based inference methods since they leverage the possibility to simulate data from the implicit models. Several approaches are considered in the thesis: papers II and IV focus on classical methods (sequential Monte Carlo-based methods), while papers I and III focus on more recent machine learning methods (deep learning and normalizing flows, respectively). Paper I constructs novel deep learning methods for learning summary statistics for approximate Bayesian computation (ABC). To achieve this, paper I introduces the partially exchangeable network (PEN), a deep learning architecture specifically designed for Markovian data (i.e., partially exchangeable data). Paper II considers Bayesian inference in stochastic differential equation mixed-effects models (SDEMEM). Bayesian inference for SDEMEMs is challenging due to the intractable likelihood function of SDEMEMs. Paper II addresses this problem by designing a novel Gibbs-blocking strategy in combination with correlated pseudo-marginal methods. The paper also discusses how custom particle filters can be adapted to the inference procedure. Paper III introduces the novel inference method sequential neural posterior and likelihood approximation (SNPLA). SNPLA is a simulation-based inference algorithm that utilizes normalizing flows for learning both the posterior distribution and the likelihood function of an implicit model via a sequential scheme. By learning both the likelihood and the posterior, and by leveraging the reverse Kullback-Leibler (KL) divergence, SNPLA avoids ad-hoc correction steps and Markov chain Monte Carlo (MCMC) sampling. Paper IV introduces the accelerated delayed-acceptance (ADA) algorithm. ADA can be viewed as an extension of the delayed-acceptance (DA) MCMC algorithm that leverages connections between the two likelihood ratios of DA to further accelerate MCMC sampling from the posterior distribution of interest, although our approach introduces an approximation. The main case study of paper IV is a double-well potential stochastic differential equation (DWP-SDE) model for protein-folding data (reaction coordinate data).
  •  
