SwePub

Search results for "WFRF:(Martin NG) ;conttype:(scientificother)"

Search: WFRF:(Martin NG) > Other academic/artistic

  • Results 1-10 of 39
  • Andersson, Martin, 1981- (author)
  • A bilevel approach to parameter tuning of optimization algorithms using evolutionary computing : Understanding optimization algorithms through optimization
  • 2018
  • Doctoral thesis (other academic/artistic), abstract:
    • Most optimization problems found in the real world cannot be solved using analytical methods. For these types of difficult optimization problems, an alternative approach is needed. Metaheuristics are a category of optimization algorithms that do not guarantee that an optimal solution will be found, but instead search for the best solutions using some general heuristics. Metaheuristics have been shown to be effective at finding “good-enough” solutions to a wide variety of difficult problems. Most metaheuristics involve control parameters that can be used to modify how the heuristic performs its search. This is necessary because different problems may require different search strategies to be solved effectively. The control parameters allow the optimization algorithm to be adapted to the problem at hand. It is, however, difficult to predict what the optimal control parameters are for any given problem. The problem of finding these optimal control parameter values is known as parameter tuning and is the main topic of this thesis. This thesis uses a bilevel optimization approach to solve parameter tuning problems. In this approach, the parameter tuning problem itself is formulated as an optimization problem and solved with an optimization algorithm. The parameter tuning problem formulated as a bilevel optimization problem is challenging because of nonlinear objective functions, interacting variables, multiple local optima, and noise. However, it is in precisely this kind of difficult optimization problem that evolutionary algorithms, which are a subclass of metaheuristics, have been shown to be effective. That is the motivation for using evolutionary algorithms for the upper-level optimization (i.e. the tuning algorithm) of the bilevel optimization approach. Solving the parameter tuning problem using a bilevel optimization approach is also computationally expensive, since a complete optimization run has to be completed for every evaluation of a set of control parameter values. It is therefore important that the tuning algorithm be as efficient as possible, so that the parameter tuning problem can be solved to a satisfactory level with relatively few evaluations. Even so, bilevel optimization experiments can take a long time to run on a single computer. There is, however, considerable parallelization potential in the bilevel optimization approach, since many of the optimizations are independent of one another. This thesis has three primary aims: first, to present a bilevel optimization framework and software architecture for parallel parameter tuning; second, to use this framework and software architecture to evaluate and configure evolutionary algorithms as tuners and compare them with other parameter tuning methods; and, finally, to use parameter tuning experiments to gain new insights into and understanding of how optimization algorithms work and how they can be used to their maximum potential. The proposed framework and software architecture have been implemented and deployed in more than one hundred computers running many thousands of parameter tuning experiments for many millions of optimizations. This illustrates that this design and implementation approach can handle large parameter tuning experiments. Two types of evolutionary algorithms, i.e. differential evolution (DE) and a genetic algorithm (GA), have been evaluated as tuners against the parameter tuning algorithm irace. The aspects of algorithm configuration and noise handling for DE and the GA as related to the parameter tuning problem were also investigated. The results indicate that dynamic resampling strategies outperform static resampling strategies. It was also shown that the GA needs an explicit exploration and exploitation strategy in order not to become stuck in local optima. The comparison with irace shows that both DE and the GA can significantly outperform it in a variety of different tuning problems.
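The bilevel structure described in the abstract above can be illustrated with a minimal sketch: an upper-level evolutionary tuner proposes control-parameter vectors, and each candidate is scored by running the lower-level optimizer to completion several times and averaging the results to dampen noise. This is not the thesis's actual framework; the tuned algorithm (a toy differential evolution on the sphere function), the parameter ranges, and the helper names run_lower_level and tune are all illustrative assumptions.

import random
import statistics

def sphere(x):
    # Lower-level objective to be minimized (illustrative benchmark only).
    return sum(v * v for v in x)

def run_lower_level(F, CR, pop_size, dim=10, budget=1000, seed=None):
    # One complete differential-evolution run with the given control
    # parameters; returns the best objective value found. This stands in
    # for the "complete optimization run" needed per parameter evaluation.
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    fit = [sphere(ind) for ind in pop]
    evals = pop_size
    while evals < budget:
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)
            trial = [pop[a][k] + F * (pop[b][k] - pop[c][k])
                     if (rng.random() < CR or k == j_rand) else pop[i][k]
                     for k in range(dim)]
            f = sphere(trial)
            evals += 1
            if f <= fit[i]:
                pop[i], fit[i] = trial, f
            if evals >= budget:
                break
    return min(fit)

def tune(generations=10, children=6, resamples=3, rng=random.Random(1)):
    # Upper-level (1+lambda)-style evolutionary tuner over (F, CR, pop_size).
    # Each candidate is scored by the mean of several independent runs
    # (static resampling) to reduce the influence of noise.
    def score(p):
        F, CR, ps = p
        return statistics.mean(run_lower_level(F, CR, ps, seed=rng.randrange(10**9))
                               for _ in range(resamples))
    best = (rng.uniform(0.1, 1.0), rng.uniform(0.0, 1.0), rng.randint(10, 60))
    best_score = score(best)
    for _ in range(generations):
        for _ in range(children):
            cand = (min(1.0, max(0.1, best[0] + rng.gauss(0, 0.1))),
                    min(1.0, max(0.0, best[1] + rng.gauss(0, 0.1))),
                    min(60, max(10, best[2] + rng.randint(-5, 5))))
            s = score(cand)
            if s < best_score:
                best, best_score = cand, s
    return best, best_score

if __name__ == "__main__":
    params, value = tune()
    print("tuned (F, CR, pop_size):", params, "mean best value:", value)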
7.
  • Andersson, Martin, 1981-, et al. (author)
  • A Parallel Computing Software Architecture for the Bilevel Parameter Tuning of Optimization Algorithms
  • Other publication (other academic/artistic), abstract:
    • Most optimization algorithms extract important algorithmic design decisions as control parameters. This is necessary because different problems can require different search strategies to be solved effectively. The control parameters allow the optimization algorithm to be adapted to the problem at hand. It is, however, difficult to predict what the optimal control parameters are for any given problem. Finding these optimal control parameter values is referred to as the parameter tuning problem. One approach to solving the parameter tuning problem is to use bilevel optimization, where the parameter tuning problem itself is formulated as an optimization problem involving algorithmic performance as the objective(s). In this paper, we present a framework and architecture that can be used to solve large-scale parameter tuning problems using a bilevel optimization approach. The proposed framework is used to show that evolutionary algorithms are competitive as tuners against irace, which is a state-of-the-art tuning method. Two evolutionary algorithms, differential evolution (DE) and a genetic algorithm (GA), are evaluated as tuner algorithms using the proposed framework and software architecture. The importance of replicating optimizations and avoiding local optima is also investigated. The architecture is deployed and tested by running millions of optimizations using a computing cluster. The results indicate that the evolutionary algorithms can consistently find better control parameter values than irace. The GA, however, needs to be configured for an explicit exploration and exploitation strategy in order to avoid local optima.
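Entry 7 stresses that the lower-level optimizations are independent of one another, which is what makes the bilevel approach parallelizable. Below is a minimal sketch of that point using Python's standard ProcessPoolExecutor: every (candidate, replication) pair in one tuner generation is dispatched as an independent job. The run_lower_level function here is a stand-in that returns a noisy synthetic score; it is not the paper's architecture or worker protocol.

from concurrent.futures import ProcessPoolExecutor
import random

def run_lower_level(params, seed):
    # Placeholder for a complete optimization run with the given control
    # parameters; returns the best objective value found (lower is better).
    # Here it is just a noisy synthetic function of the parameters.
    rng = random.Random(seed)
    F, CR = params
    return (F - 0.6) ** 2 + (CR - 0.9) ** 2 + rng.gauss(0, 0.05)

def evaluate_generation(candidates, replications=5, workers=4):
    # Score one tuner generation: every (candidate, replication) pair is an
    # independent lower-level run, so all of them can be dispatched at once.
    jobs = [(i, params, seed)
            for i, params in enumerate(candidates)
            for seed in range(replications)]
    scores = [0.0] * len(candidates)
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = pool.map(run_lower_level,
                           [j[1] for j in jobs],
                           [j[2] for j in jobs])
        for (i, _, _), value in zip(jobs, results):
            scores[i] += value / replications
    return scores

if __name__ == "__main__":
    cands = [(random.uniform(0.1, 1.0), random.uniform(0.0, 1.0)) for _ in range(8)]
    for params, score in sorted(zip(cands, evaluate_generation(cands)), key=lambda t: t[1]):
        print(f"F={params[0]:.2f} CR={params[1]:.2f} mean score={score:.4f}")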
8.
  • Andersson, Martin, 1981-, et al. (author)
  • On the Trade-off Between Runtime and Evaluation Efficiency In Evolutionary Algorithms
  • Other publication (other academic/artistic), abstract:
    • Evolutionary optimization algorithms typically use one or more parameters that control their behavior. These parameters, which are often kept constant, can be tuned to improve the performance of the algorithm on specific problems. However, past studies have indicated that the performance can be further improved by adapting the parameters during runtime. A limitation of these studies is that they only control, at most, a few parameters, thereby missing potentially beneficial interactions between them. Instead of finding a direct control mechanism, the novel approach in this paper is to use different parameter sets in different stages of an optimization. These multiple parameter sets, which remain static within each stage, are tuned through extensive bilevel optimization experiments that approximate the optimal adaptation of the parameters. The algorithmic performance obtained with tuned multiple parameter sets is compared against that obtained with a single parameter set. For the experiments in this paper, the parameters of NSGAII are tuned when applied to the ZDT, DTLZ and WFG test problems. The results show that using multiple parameter sets can significantly increase the performance over a single parameter set.
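The multiple-parameter-set idea in entry 8, where parameters stay static within a stage but switch at stage boundaries, can be sketched as a simple schedule wrapped around an otherwise unchanged optimization loop. The sketch below does not reimplement NSGAII and does not use the tuned values from the paper; the class name, function names, and the example numbers are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ParameterSet:
    crossover_prob: float
    mutation_prob: float

def stage_for(generation, boundaries):
    # Map a generation index to a stage index given the stage end points.
    for stage, end in enumerate(boundaries):
        if generation < end:
            return stage
    return len(boundaries)

def run_staged_ea(schedule, boundaries, total_generations):
    # Skeleton of an evolutionary run whose control parameters are static
    # within each stage and switch only at the stage boundaries.
    for gen in range(total_generations):
        params = schedule[stage_for(gen, boundaries)]
        # ... one generation of variation and selection using params ...
        yield gen, params

if __name__ == "__main__":
    # Three illustrative parameter sets (made-up values, not tuned ones):
    schedule = [ParameterSet(0.9, 0.10),   # early stage: explore
                ParameterSet(0.8, 0.05),   # middle stage
                ParameterSet(0.6, 0.01)]   # late stage: exploit
    for gen, p in run_staged_ea(schedule, boundaries=[50, 150], total_generations=200):
        if gen in (0, 50, 150):
            print(f"generation {gen}: now using {p}")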