SwePub
Search the SwePub database



Search: WFRF:(Syren Andreas)

  • Result 1-3 of 3
1.
  • Andersson, Viktor, 1995, et al. (author)
  • Controlled Descent Training
  • 2023
  • Journal article (other academic/artistic)
    • In this work, a novel, model-based artificial neural network (ANN) training method is developed, supported by optimal control theory. The method augments training labels in order to robustly guarantee training loss convergence and improve the training convergence rate. Dynamic label augmentation is proposed within the framework of gradient descent training, where the convergence of the training loss is controlled. First, we capture the training behavior with the help of empirical Neural Tangent Kernels (NTK) and borrow tools from systems and control theory to analyze both the local and global training dynamics (e.g., stability, reachability). Second, we propose to dynamically alter the gradient descent training mechanism via fictitious labels as control inputs and an optimal state feedback policy. In this way, we enforce locally H2-optimal and convergent training behavior. The novel algorithm, Controlled Descent Training (CDT), guarantees local convergence. CDT unleashes new potential in the analysis, interpretation, and design of ANN architectures. The applicability of the method is demonstrated on standard regression and classification problems.
  •  
2.
  • Andersson, Viktor, 1995, et al. (author)
  • Controlled Descent Training
  • 2024
  • In: International Journal of Robust and Nonlinear Control. - 1099-1239 .- 1049-8923.
  • Journal article (peer-reviewed)
    • In this work, a novel, model-based artificial neural network (ANN) training method is developed, supported by optimal control theory. The method augments training labels in order to robustly guarantee training loss convergence and improve the training convergence rate. Dynamic label augmentation is proposed within the framework of gradient descent training, where the convergence of the training loss is controlled. First, we capture the training behavior with the help of empirical Neural Tangent Kernels (NTK) and borrow tools from systems and control theory to analyze both the local and global training dynamics (e.g., stability, reachability). Second, we propose to dynamically alter the gradient descent training mechanism via fictitious labels as control inputs and an optimal state feedback policy. In this way, we enforce locally H2-optimal and convergent training behavior. The novel algorithm, Controlled Descent Training (CDT), guarantees local convergence. CDT unleashes new potential in the analysis, interpretation, and design of ANN architectures. The applicability of the method is demonstrated on standard regression and classification problems.
  •  
3.
  • Nebel, Bernd A., et al. (author)
  • A Career in Catalysis : Bernhard Hauer
  • 2023
  • In: ACS Catalysis. - : American Chemical Society (ACS). - 2155-5435. ; 13:13, s. 8861-8889
  • Research review (peer-reviewed)
    • On the occasion of Professor Bernhard Hauer's (partial) retirement, we reflect on and highlight his distinguished career in biocatalysis. Bernhard, a biologist by training, has greatly influenced biocatalysis with his vision and ideas throughout his four-decade career. The development of his career went hand in hand with the evolution of biocatalysis and the application and development of enzymes for chemical processes. In this Account, we present selected examples of his early work on the development of enzymes and their application in an industrial setting, with a focus on his specific contributions to harnessing the catalytic power of enzymes for novel reactions and to the understanding and engineering of flexible loops and channels in catalysis.
  •  
