SwePub
Search the SwePub database


Result list for the search "LAR1:hb ;lar1:(his);srt2:(2008);pers:(Johansson Ulf)"

Search: LAR1:hb > Högskolan i Skövde > (2008) > Johansson Ulf

  • Results 1-6 of 6
1.
  • Johansson, Ulf, et al. (author)
  • Chipper : A Novel Algorithm for Concept Description
  • 2008
  • In: Frontiers in Artificial Intelligence and Applications. IOS Press. ISBN 9781586038670, pp. 133-140.
  • Conference paper (peer-reviewed). Abstract:
    • In this paper, several demands placed on concept description algorithms are identified and discussed. The most important criterion is the ability to produce compact rule sets that, in a natural and accurate way, describe the most important relationships in the underlying domain. An algorithm based on the identified criteria is presented and evaluated. The algorithm, named Chipper, produces decision lists, where each rule covers a maximum number of remaining instances while meeting requested accuracy requirements. In the experiments, Chipper is evaluated on nine UCI data sets. The main result is that Chipper produces compact and understandable rule sets, clearly fulfilling the overall goal of concept description. In the experiments, Chipper's accuracy is similar to that of standard decision tree and rule induction algorithms, while its rule sets have superior comprehensibility. (A minimal sketch of the covering idea follows this entry.)
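The covering loop the abstract describes can be illustrated in a few lines: repeatedly pick the single rule that covers the most remaining training instances while meeting a requested accuracy level, remove what it covers, and continue. The Python sketch below is only an illustration of that idea; the single-threshold rule form, the `min_acc` parameter and the trailing default rule are assumptions, not the published Chipper algorithm.

```python
import numpy as np

def chipper_like(X, y, min_acc=0.8):
    """Greedy decision-list induction (illustrative sketch, not Chipper itself):
    each rule is a single threshold test chosen to cover as many remaining
    instances as possible while its majority class reaches `min_acc` accuracy
    on the instances it covers."""
    X, y = np.asarray(X, float), np.asarray(y)
    remaining = np.ones(len(y), dtype=bool)
    rules = []
    while remaining.any():
        best = None  # (coverage, attribute, operator, threshold, label)
        for attr in range(X.shape[1]):
            for thr in np.unique(X[remaining, attr]):
                for op in ("<=", ">"):
                    mask = (X[:, attr] <= thr) if op == "<=" else (X[:, attr] > thr)
                    mask &= remaining
                    n = int(mask.sum())
                    if n == 0:
                        continue
                    labels, counts = np.unique(y[mask], return_counts=True)
                    label, correct = labels[counts.argmax()], counts.max()
                    if correct / n >= min_acc and (best is None or n > best[0]):
                        best = (n, attr, op, thr, label)
        if best is None:  # no admissible rule left: close the list with a default rule
            labels, counts = np.unique(y[remaining], return_counts=True)
            rules.append(("default", labels[counts.argmax()]))
            break
        n, attr, op, thr, label = best
        rules.append((attr, op, thr, label))
        covered = (X[:, attr] <= thr) if op == "<=" else (X[:, attr] > thr)
        remaining &= ~covered
    return rules
```

Raising `min_acc` towards 1.0 yields purer but more numerous rules; lowering it trades purity for coverage, which corresponds to the requested accuracy requirement the abstract mentions.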
2.
  • Johansson, Ulf, et al. (author)
  • Extending Nearest Neighbor Classification with Spheres of Confidence
  • 2008
  • In: Proceedings of the Twenty-First International FLAIRS Conference (FLAIRS 2008). AAAI Press. ISBN 9781577353652, pp. 282-287.
  • Conference paper (peer-reviewed). Abstract:
    • The standard kNN algorithm suffers from two major drawbacks: sensitivity to the parameter value k, i.e., the number of neighbors, and the use of k as a global constant that is independent of the particular region in which the example to be classified falls. Methods using weighted voting schemes only partly alleviate these problems, since they still involve choosing a fixed k. In this paper, a novel instance-based learner is introduced that does not require k as a parameter, but instead employs a flexible strategy for determining the number of neighbors to consider for the specific example to be classified, hence using a local instead of a global k. A number of variants of the algorithm are evaluated on 18 datasets from the UCI repository. The novel algorithm in its basic form is shown to significantly outperform standard kNN with respect to accuracy, and an adapted version of the algorithm is shown to be clearly ahead with respect to the area under the ROC curve. Similar to standard kNN, the novel algorithm still allows for various extensions, such as weighted voting and axes scaling. (An illustrative local-k sketch follows this entry.)
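The abstract does not spell out how the spheres of confidence are constructed, so the sketch below only illustrates the general point of a local k: the neighborhood size is chosen per test instance instead of being a global constant. The class-homogeneity stopping rule and the `max_k` cap are assumptions made purely for illustration, not the published method.

```python
import numpy as np
from collections import Counter

def local_knn_predict(X_train, y_train, x, max_k=25):
    """Classify x with a locally chosen neighborhood size: keep adding the
    next-nearest neighbor while the neighborhood stays class-homogeneous,
    then take a majority vote. The stopping rule is an illustrative
    assumption, not the published spheres-of-confidence criterion."""
    d = np.linalg.norm(np.asarray(X_train, float) - np.asarray(x, float), axis=1)
    order = np.argsort(d)
    k = 1
    while k < min(max_k, len(order)) and len({y_train[i] for i in order[:k + 1]}) == 1:
        k += 1
    votes = Counter(y_train[i] for i in order[:k])
    return votes.most_common(1)[0][0]
```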
3.
  • Johansson, Ulf, et al. (author)
  • Increasing Rule Extraction Accuracy by Post-processing GP Trees
  • 2008
  • In: Proceedings of the Congress on Evolutionary Computation. IEEE. ISBN 9781424418237, 9781424418220, pp. 3010-3015.
  • Conference paper (peer-reviewed). Abstract:
    • Genetic programming (GP) is a very general and efficient technique, often capable of outperforming more specialized techniques on a variety of tasks. In this paper, we suggest a straightforward novel algorithm for post-processing of GP classification trees. The algorithm iteratively, one node at a time, searches for possible modifications that would result in higher accuracy. More specifically, for each split the algorithm evaluates every possible constant value and chooses the best. With this design, the post-processing algorithm can only increase training accuracy, never decrease it. In this study, we apply the suggested algorithm to GP trees extracted from neural network ensembles. Experimentation, using 22 UCI datasets, shows that the post-processing results in higher test set accuracies on a large majority of the datasets. In fact, for two of the three evaluated setups, the increase in accuracy is statistically significant. (A sketch of the node-by-node hill climb follows this entry.)
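The post-processing step described here is a node-by-node hill climb over split constants: for every internal node, each observed value of the split attribute is tried as the threshold and the most accurate one is kept, so training accuracy can never decrease. The Python sketch below assumes a simple binary tree with `x[attr] <= thr` splits; the Node class and the depth-first visiting order are assumptions, not the paper's implementation.

```python
import numpy as np

class Node:
    """Binary split node of an extracted tree: go left if x[attr] <= thr;
    leaves carry a class label instead."""
    def __init__(self, attr=None, thr=None, left=None, right=None, label=None):
        self.attr, self.thr, self.left, self.right, self.label = attr, thr, left, right, label

def predict(node, x):
    if node.label is not None:
        return node.label
    return predict(node.left if x[node.attr] <= node.thr else node.right, x)

def accuracy(tree, X, y):
    return float(np.mean([predict(tree, x) == t for x, t in zip(X, y)]))

def post_process(tree, X, y):
    """One node at a time, try every observed value of the node's attribute as
    its threshold and keep the best; the incumbent value is always a candidate,
    so training accuracy can only go up or stay the same."""
    X, y = np.asarray(X, float), np.asarray(y)
    def visit(node):
        if node.label is not None:
            return
        best_thr, best_acc = node.thr, accuracy(tree, X, y)
        for cand in np.unique(X[:, node.attr]):
            node.thr = cand
            acc = accuracy(tree, X, y)
            if acc > best_acc:
                best_thr, best_acc = cand, acc
        node.thr = best_thr
        visit(node.left)
        visit(node.right)
    visit(tree)
    return tree
```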
4.
  • König, Rikard, et al. (author)
  • Using Genetic Programming to Increase Rule Quality
  • 2008
  • In: Proceedings of the Twenty-First International FLAIRS Conference (FLAIRS 2008). AAAI Press. ISBN 9781577353652, pp. 288-293.
  • Conference paper (peer-reviewed). Abstract:
    • Rule extraction is a technique aimed at transforming highly accurate opaque models, like neural networks, into comprehensible models without losing accuracy. G-REX is a rule extraction technique based on Genetic Programming that has previously performed well in several studies. This study has two objectives: to evaluate two new fitness functions for G-REX and to show how G-REX can be used as a rule inducer. The fitness functions are designed to optimize two alternative quality measures, the area under the ROC curve and a new comprehensibility measure called brevity. Rules with good brevity classify typical instances with few and simple tests and use complex conditions only for atypical examples. Experiments using thirteen publicly available data sets show that the two novel fitness functions succeeded in increasing brevity and the area under the ROC curve without sacrificing accuracy. When compared to a standard decision tree algorithm, G-REX achieved slightly higher accuracy, but also added quality to the rules by significantly increasing their AUC or brevity. (A hypothetical brevity-aware fitness sketch follows this entry.)
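Brevity is only characterized informally in the abstract (typical instances should be settled by few, simple tests), so the sketch below is a guess at how such a measure could enter a fitness function: count the tests evaluated before an ordered rule list settles each instance, and penalize that count against accuracy. The decision-list representation, the brevity definition and the weight `w` are assumptions, not the published G-REX measures.

```python
def rule_fires(conditions, x):
    """A rule fires when all of its (attribute, op, threshold) tests hold."""
    return all((x[a] <= t) if op == "<=" else (x[a] > t) for a, op, t in conditions)

def brevity(decision_list, X):
    """Average number of single tests evaluated per instance when the ordered
    rule list is applied; lists that settle typical instances early with short
    rules score low (better)."""
    total = 0
    for x in X:
        for conditions, _label in decision_list:
            total += len(conditions)
            if rule_fires(conditions, x):
                break
    return total / len(X)

def fitness(decision_list, X, y, w=0.05):
    """Hypothetical combined fitness: reward accuracy, penalize lack of
    brevity. The weighting is an illustrative assumption."""
    correct = 0
    for x, t in zip(X, y):
        pred = next((lbl for conds, lbl in decision_list if rule_fires(conds, x)), None)
        correct += (pred == t)
    return correct / len(y) - w * brevity(decision_list, X)
```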
5.
  • Löfström, Tuve, et al. (author)
  • On the Use of Accuracy and Diversity Measures for Evaluating and Selecting Ensembles of Classifiers
  • 2008
  • In: 2008 Seventh International Conference on Machine Learning and Applications. IEEE. ISBN 9780769534954, pp. 127-132.
  • Conference paper (peer-reviewed). Abstract:
    • The test set accuracy of ensembles of classifiers selected based on single measures of accuracy and diversity, as well as combinations of such measures, is investigated. It is found that by combining measures, a higher test set accuracy may be obtained than by using any single accuracy or diversity measure. It is further investigated whether a multi-criteria search for an ensemble that maximizes both accuracy and diversity leads to more accurate ensembles than optimizing a single criterion. The results indicate that it might be more beneficial to search for ensembles that are both accurate and diverse. Furthermore, the results show that diversity measures could compete with accuracy measures as a selection criterion. (A small combined-criterion selection sketch follows this entry.)
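Combining accuracy and diversity measures for ensemble selection can be pictured as scoring every candidate subset of a classifier pool on a held-out selection set and keeping the best. In the Python sketch below, pairwise disagreement as the diversity measure, mean base-classifier accuracy as the accuracy measure, and the equal weighting are assumptions chosen for illustration; the paper evaluates several measures and combinations.

```python
import numpy as np
from itertools import combinations

def disagreement(preds):
    """Mean pairwise disagreement between base classifiers
    (rows = classifiers, columns = selection-set instances)."""
    pairs = list(combinations(range(preds.shape[0]), 2))
    return float(np.mean([(preds[i] != preds[j]).mean() for i, j in pairs]))

def select_ensemble(preds, y, size=3, w=0.5):
    """Score every subset of `size` base classifiers (size >= 2) with a
    weighted sum of mean base accuracy and pairwise disagreement on the
    selection set, and return the best subset. The measures and the 50/50
    weighting are illustrative assumptions."""
    preds, y = np.asarray(preds), np.asarray(y)
    best, best_score = None, -np.inf
    for subset in combinations(range(preds.shape[0]), size):
        sub = preds[list(subset)]
        acc = float((sub == y).mean())          # mean base-classifier accuracy
        score = (1 - w) * acc + w * disagreement(sub)
        if score > best_score:
            best, best_score = subset, score
    return best
```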
6.
  • Löfström, Tuve, et al. (author)
  • The Problem with Ranking Ensembles Based on Training or Validation Performance
  • 2008
  • In: Proceedings of the International Joint Conference on Neural Networks. IEEE. ISBN 9781424418213, 9781424418206.
  • Conference paper (peer-reviewed). Abstract:
    • The main purpose of this study was to determine whether it is possible to somehow use results on training or validation data to estimate ensemble performance on novel data. With the specific setup evaluated, i.e., using ensembles built from a pool of independently trained neural networks and targeting diversity only implicitly, the answer is a resounding no. Experimentation, using 13 UCI datasets, shows that there is in general nothing to gain in performance on novel data by choosing an ensemble based on any of the training measures evaluated here. This is despite the fact that the measures evaluated include all the most frequently used ones, i.e., ensemble training and validation accuracy, base classifier training and validation accuracy, ensemble training and validation AUC, and two diversity measures. The main reason is that all ensembles tend to have quite similar performance, unless we deliberately lower the accuracy of the base classifiers. The key consequence is, of course, that a data miner can do no better than picking an ensemble at random. In addition, the results indicate that it is futile to look for an algorithm aimed at optimizing ensemble performance by somehow selecting a subset of available base classifiers.
Type of publication
conference papers (6)
Type of content
peer-reviewed (6)
Author/editor
Boström, Henrik (4)
Löfström, Tuve (4)
König, Rikard (3)
Niklasson, Lars (2)
Sönströd, Cecilia (1)
Higher education institution
Högskolan i Borås (6)
Kungliga Tekniska Högskolan (4)
Jönköping University (4)
Stockholms universitet (2)
Language
English (6)
Research subject (UKÄ/SCB)
Natural sciences (6)
Year
2008 (6)