SwePub
Search the SwePub database


Hit list for the search "WFRF:(Ekdahl Magnus 1979 )"


  • Results 1-5 of 5
1.
  • Ekdahl, Magnus, 1979- (author)
  • Approximations of Bayes Classifiers for Statistical Learning of Clusters
  • 2006
  • Licentiate thesis (other academic/artistic), abstract:
    • It is rarely possible to use an optimal classifier. Often the classifier used for a specific problem is an approximation of the optimal classifier. Methods are presented for evaluating the performance of an approximation in the model class of Bayesian networks. Specifically, for the approximation of class conditional independence, a bound on the performance is sharpened. The class conditional independence approximation is connected to the minimum description length principle (MDL), which is connected to Jeffreys’ prior through commonly used assumptions. One algorithm for unsupervised classification is presented and compared against other unsupervised classifiers on three data sets.
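The abstract above concerns approximating the optimal Bayes classifier under class conditional independence, commonly known as the Naïve Bayes assumption. Below is a minimal illustrative sketch in Python of that approximation, not code from the thesis; the data, the add-one smoothing, and the function names are assumptions made for the example.

    import numpy as np

    # Minimal sketch (not from the thesis): a discrete Naive Bayes decision rule.
    # The class-conditional joint P(x | c) is approximated by the product of the
    # per-feature marginals P(x_j | c), estimated from relative counts.

    def fit_naive_bayes(X, y, n_values):
        """X: (n_samples, n_features) integer array, y: class labels."""
        classes = np.unique(y)
        priors = np.array([np.mean(y == c) for c in classes])
        # cond[ci][j][v] = estimated P(x_j = v | class classes[ci]), add-one smoothed
        cond = []
        for c in classes:
            Xc = X[y == c]
            cond.append([
                (np.bincount(Xc[:, j], minlength=n_values) + 1.0) / (len(Xc) + n_values)
                for j in range(X.shape[1])
            ])
        return classes, priors, cond

    def predict_naive_bayes(x, classes, priors, cond):
        """Pick the class maximizing prior[c] * prod_j P(x_j | c)."""
        log_scores = [
            np.log(priors[ci]) + sum(np.log(cond[ci][j][x[j]]) for j in range(len(x)))
            for ci in range(len(classes))
        ]
        return classes[int(np.argmax(log_scores))]

    # Tiny usage example: the label depends on an interaction (XOR) that the
    # independence approximation cannot represent exactly.
    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(200, 3))
    y = (X[:, 0] ^ X[:, 1]).astype(int)
    model = fit_naive_bayes(X, y, n_values=2)
    print(predict_naive_bayes(np.array([1, 0, 1]), *model))

The XOR labels are chosen deliberately: they are the kind of dependence that a product-of-marginals approximation cannot capture, which is the sort of approximation error the thesis analyses.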
2.
  • Ekdahl, Magnus, 1979- (author)
  • On approximations and computations in probabilistic classification and in learning of graphical models
  • 2007
  • Doctoral thesis (other academic/artistic), abstract:
    • Model-based probabilistic classification is heavily used in data mining and machine learning. However, for computational learning these models may need approximation steps. One popular approximation in classification is to model the class conditional densities by factorization, which in the independence case is usually called the ’Naïve Bayes’ classifier. In general, probabilistic independence cannot model all distributions exactly, and not much has been published on how much a discrete distribution can differ from the independence assumption. In this dissertation the approximation quality of factorizations is analyzed in two articles. A specific class of factorizations is the factorizations represented by graphical models. Several challenges arise from the use of statistical methods for learning graphical models from data. Examples of problems include the increase in the number of graphical model structures as a function of the number of nodes, and the equivalence of statistical models determined by different graphical models. In one article an algorithm for learning graphical models is presented. In the final article an algorithm for clustering parts of DNA strings is developed, and a graphical representation for the remaining DNA part is learned.
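The dissertation abstract above refers to factorizations represented by graphical models. As a small illustrative sketch, assuming a three-variable chain structure X1 -> X2 -> X3 over binary data (neither taken from the dissertation), the corresponding factorization P(x1, x2, x3) = P(x1) P(x2 | x1) P(x3 | x2) can be estimated from counts as follows:

    import numpy as np

    # Illustrative sketch only: estimate the chain factorization
    # P(x1, x2, x3) ~ P(x1) * P(x2 | x1) * P(x3 | x2) from binary data,
    # then compare it with the empirical joint for one state.

    rng = np.random.default_rng(1)

    def flip(a, p):
        """Flip each bit of a independently with probability p."""
        return (a + (rng.random(len(a)) < p)) % 2

    x1 = rng.integers(0, 2, 1000)
    x2 = flip(x1, 0.2)   # noisy copy of x1
    x3 = flip(x2, 0.2)   # noisy copy of x2

    def marginal(a):
        return np.bincount(a, minlength=2) / len(a)

    def conditional(child, parent):
        """table[p, c] = P(child = c | parent = p), add-one smoothed."""
        table = np.ones((2, 2))
        for p, c in zip(parent, child):
            table[p, c] += 1
        return table / table.sum(axis=1, keepdims=True)

    p1 = marginal(x1)
    p2_given_1 = conditional(x2, x1)
    p3_given_2 = conditional(x3, x2)

    def chain_prob(a, b, c):
        return p1[a] * p2_given_1[a, b] * p3_given_2[b, c]

    state = (1, 0, 1)
    empirical = np.mean((x1 == 1) & (x2 == 0) & (x3 == 1))
    print(chain_prob(*state), empirical)

Because the simulated data really is a Markov chain, the factorized estimate and the empirical joint agree closely; for data whose dependence structure does not match the assumed graph, the two would diverge.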
3.
  • Ekdahl, Magnus, 1979-, et al. (authors)
  • On Concentration of Discrete Distributions with Applications to Supervised Learning of Classifiers
  • 2007
  • Part of: Machine Learning and Data Mining in Pattern Recognition. - Berlin, Heidelberg : Springer Berlin/Heidelberg. - 9783540734987 - 9783540734994 - 3540734988 ; pp. 2-16
  • Book chapter (peer-reviewed), abstract:
    • Computational procedures using independence assumptions in various forms are popular in machine learning, although checks on empirical data have given inconclusive results about their impact. Some theoretical understanding of when they work is available, but a definite answer seems to be lacking. This paper derives distributions that maximize the statewise difference to the respective product of marginals. These distributions are, in a sense, the worst distributions for predicting an outcome of the data-generating mechanism by independence. We also restrict the scope of new theoretical results by showing explicitly that, depending on context, independent ('Naïve') classifiers can be as bad as tossing coins. Regardless of this, independence may beat the generating model in learning supervised classification, and we explicitly provide one such scenario.
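The chapter above concerns how far a discrete distribution can lie, state by state, from the product of its marginals. A minimal numerical sketch of that quantity for one hand-picked two-variable binary joint (the distribution and names are assumptions for the example, not values from the paper):

    import itertools
    import numpy as np

    # Illustrative sketch: largest statewise gap between a discrete joint P(x1, x2)
    # and the product of its marginals P(x1) * P(x2).
    # This joint puts all mass on the diagonal, a strongly dependent case.

    joint = np.array([[0.5, 0.0],
                      [0.0, 0.5]])      # P(x1, x2)
    p1 = joint.sum(axis=1)              # marginal of x1
    p2 = joint.sum(axis=0)              # marginal of x2

    gap = max(
        abs(joint[a, b] - p1[a] * p2[b])
        for a, b in itertools.product(range(2), range(2))
    )
    print(gap)                          # 0.25 for this joint

Here the product of marginals assigns probability 0.25 to every state, while the true joint puts 0.5 on two states and 0 on the other two, a simple instance of the statewise discrepancy the paper bounds.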
4.
  • Ekdahl, Magnus, 1979-, et al. (authors)
  • On the Performance of Approximations of Bayesian Networks in Model-
  • 2006
  • Part of: The Annual Workshop of the Swedish Artificial Intelligence Society, 2006. - Umeå : SAIS. ; pp. 73-
  • Conference paper (peer-reviewed), abstract:
    • When the true class conditional model and class probabilities are approximated in a pattern recognition/classification problem, the performance of the optimal classifier is expected to deteriorate. But calculating this reduction is far from trivial in the general case. We present a generalization and easily computable formulas for estimating the degradation in performance with respect to the optimal classifier. An example of such an approximation is the Naive Bayes classifier. We generalize and sharpen results for evaluating this classifier.
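The abstract above is about estimating how much classification performance degrades when the optimal classifier's class conditional model is replaced by an approximation such as Naive Bayes. The following is a small synthetic sketch of that comparison, not the formulas of the paper; the two class-conditional joints and the uniform prior are assumptions chosen for illustration:

    import itertools
    import numpy as np

    # Sketch only: two classes over two binary features with known class-conditional
    # joints. Compare the error of the optimal rule (true joints) with the rule that
    # approximates each joint by the product of its marginals.

    prior = np.array([0.5, 0.5])
    # P(x1, x2 | c): class 0 favours equal bits, class 1 favours unequal bits
    joint = np.array([
        [[0.4, 0.1],
         [0.1, 0.4]],
        [[0.1, 0.4],
         [0.4, 0.1]],
    ])

    def marginal_product(j):
        """Independence approximation of a 2x2 joint."""
        return np.outer(j.sum(axis=1), j.sum(axis=0))

    approx = np.array([marginal_product(joint[c]) for c in range(2)])

    def error(class_cond):
        """Error probability of the rule argmax_c prior[c] * class_cond[c][x]."""
        err = 0.0
        for a, b in itertools.product(range(2), range(2)):
            decided = int(np.argmax(prior * class_cond[:, a, b]))
            err += sum(prior[c] * joint[c, a, b] for c in range(2) if c != decided)
        return err

    print("optimal error:", error(joint))    # uses the true class-conditional joints
    print("approx  error:", error(approx))   # after the independence approximation

In this deliberately adversarial example the marginals of the two classes coincide, so the approximated rule is no better than guessing (error 0.5) while the optimal rule achieves error 0.2, the kind of degradation the paper's formulas are meant to estimate.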
5.
  •  
