SwePub
Search the SwePub database

  Advanced search

Result list for search "WFRF:(Frouin Vincent) ;mspu:(conferencepaper)"

Search: WFRF:(Frouin Vincent) > Conference papers

  • Results 1-2 of 2
1.
  • Löfstedt, Tommy, et al. (author)
  • Structured Variable Selection for Regularized Generalized Canonical Correlation Analysis
  • 2016
  • In: MULTIPLE FACETS OF PARTIAL LEAST SQUARES AND RELATED METHODS. - Cham : SPRINGER INT PUBLISHING AG. - 9783319406435 - 9783319406411 ; pp. 129-139
  • Conference paper (peer-reviewed), abstract:
    • Regularized Generalized Canonical Correlation Analysis (RGCCA) extends regularized canonical correlation analysis to more than two sets of variables. Sparse GCCA (SGCCA) was recently proposed to address the issue of variable selection. However, the variable selection scheme offered by SGCCA is limited to the covariance (tau = 1) link between blocks. In this paper we go beyond the covariance link by proposing an extension of SGCCA for the full RGCCA model (tau ∈ [0, 1]). In addition, we also propose an extension of SGCCA that exploits pre-given structural relationships between variables within blocks. Specifically, we propose an algorithm that allows structured and sparsity-inducing penalties to be included in the RGCCA optimization problem.
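The variable-selection mechanism described in this abstract can be illustrated with a minimal sketch. The code below is not the authors' algorithm; it is an assumed power-method-style block update for the covariance (tau = 1) link only, with an l1 proximal step (soft-thresholding) inducing sparsity in each block's weight vector. The function names and parameters are hypothetical.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the l1 norm: shrink each coefficient toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_gcca_sketch(blocks, l1=0.05, n_iter=100, seed=0):
    # Illustrative sparse multi-block update (covariance link, tau = 1):
    # each block's weight vector is updated against the sum of the other
    # blocks' scores, soft-thresholded, then renormalized.
    rng = np.random.default_rng(seed)
    W = [rng.standard_normal(X.shape[1]) for X in blocks]
    W = [w / np.linalg.norm(w) for w in W]
    for _ in range(n_iter):
        scores = [X @ w for X, w in zip(blocks, W)]
        for j, Xj in enumerate(blocks):
            # Gradient of the sum of covariances with the other block scores.
            inner = sum(s for k, s in enumerate(scores) if k != j)
            w = soft_threshold(Xj.T @ inner, l1)
            norm = np.linalg.norm(w)
            if norm > 0:
                w /= norm
            W[j] = w
            scores[j] = Xj @ w
    return W
```

The full RGCCA model of the paper additionally interpolates between covariance and correlation links (tau ∈ [0, 1]) and supports structured penalties, neither of which this sketch attempts.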
2.
  • Dubois, Mathieu, et al. (author)
  • Predictive support recovery with TV-Elastic Net penalty and logistic regression: An application to structural MRI
  • 2014
  • Conference paper (peer-reviewed), abstract:
    • The use of machine learning in neuroimaging offers new perspectives in early diagnosis and prognosis of brain diseases. Although such multivariate methods can capture complex relationships in the data, traditional approaches provide irregular (l2 penalty) or scattered (l1 penalty) predictive patterns with very limited relevance. A penalty like Total Variation (TV) that exploits the natural 3D structure of the images can increase the spatial coherence of the weight map. However, TV penalization leads to non-smooth optimization problems that are hard to minimize. We propose an optimization framework that minimizes any combination of l1, l2, and TV penalties while preserving the exact l1 penalty. This algorithm uses Nesterov's smoothing technique to approximate the TV penalty with a smooth function such that the loss and the penalties are minimized with an exact accelerated proximal gradient algorithm. We propose an original continuation algorithm that uses successively smaller values of the smoothing parameter to reach a prescribed precision while achieving the best possible convergence rate. This algorithm can be used with other losses or penalties. The algorithm is applied to a classification problem on the ADNI dataset. We observe that the TV penalty does not necessarily improve the prediction but provides a major breakthrough in terms of support recovery of the predictive brain regions.
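The optimization scheme sketched in this abstract (smooth the TV term à la Nesterov, keep the l1 term exact via its proximal operator) can be illustrated in one dimension. This is not the authors' implementation: it is an assumed plain proximal-gradient sketch (no acceleration, no continuation, no 3D images), with hypothetical function names and parameters.

```python
import numpy as np

def logistic_loss_grad(w, X, y):
    # Gradient of the mean logistic loss, labels y in {0, 1}.
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    return X.T @ (p - y) / len(y)

def smoothed_tv_grad(w, mu):
    # Nesterov smoothing of 1D total variation TV(w) = sum_i |w[i+1] - w[i]|:
    # the smoothed gradient is D^T a* with a* = clip(Dw / mu, -1, 1).
    d = np.diff(w)
    a = np.clip(d / mu, -1.0, 1.0)
    grad = np.zeros_like(w)
    grad[:-1] -= a
    grad[1:] += a
    return grad

def fit_tv_enet_logreg(X, y, l1=0.0, l2=0.1, tv=0.1, mu=0.1, step=0.1, n_iter=500):
    # Proximal gradient sketch: the smooth part is logistic loss + l2 + smoothed TV;
    # the l1 term is kept exact through its proximal operator (soft-thresholding).
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        g = logistic_loss_grad(w, X, y) + l2 * w + tv * smoothed_tv_grad(w, mu)
        w = w - step * g
        w = np.sign(w) * np.maximum(np.abs(w) - step * l1, 0.0)
    return w
```

The paper's accelerated variant replaces the plain gradient step with an exact accelerated proximal gradient method and shrinks the smoothing parameter mu via continuation; the sketch keeps mu fixed for brevity.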
