SwePub
Search the SwePub database


Result list for search "WFRF:(Ljung Lennart 1946 ) ;pers:(Sjöberg Jonas)"

Search: WFRF:(Ljung Lennart 1946 ) > Sjöberg Jonas

  • Results 1-10 of 15
1.
  • Hjalmarsson, Håkan, 1962-, et al. (author)
  • On Neural Network Model Structure in System Identification
  • 1996
  • In: Identification, Adaptation, Learning. The Science of Learning Models from Data. - Linköping : Linköping University Electronic Press, pp. 366-399
  • Report (other academic/artistic)
2.
  • Juditsky, A., et al. (author)
  • Nonlinear black-box models in system identification: Mathematical foundations
  • 1995
  • In: Automatica. - Elsevier BV. - ISSN 0005-1098, E-ISSN 1873-2836 ; 31:12, pp. 1725-1750
  • Journal article (peer-reviewed) abstract
    • We discuss several aspects of the mathematical foundations of the nonlinear black-box identification problem. We shall see that the quality of the identification procedure is always a result of a certain trade-off between the expressive power of the model we try to identify (the larger the number of parameters used to describe the model, the more flexible is the approximation), and the stochastic error (which is proportional to the number of parameters). A consequence of this trade-off is the simple fact that a good approximation technique can be the basis of a good identification algorithm. From this point of view, we consider different approximation methods, and pay special attention to spatially adaptive approximants. We introduce wavelet and 'neuron' approximations, and show that they are spatially adaptive. Then we apply the acquired approximation experience to estimation problems. Finally, we consider some implications of these theoretical developments for the practically implemented versions of the 'spatially adaptive' algorithms. Copyright © 1995 Elsevier Science Ltd All rights reserved.
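The trade-off described in the abstract above (expressive power versus stochastic error) can be illustrated with a minimal sketch. This is not code from the paper; the target function, grid size, and the choice of a piecewise-constant approximant are all invented for illustration:

```python
import math

N = 400
xs = [i / N for i in range(N)]

def f(x):
    return math.sin(2 * math.pi * x)   # hypothetical target function

def approx_error(k):
    """Mean squared approximation (bias) error of the best
    k-bin piecewise-constant model of f on [0, 1)."""
    err = 0.0
    for j in range(k):
        seg = [f(x) for x in xs if j / k <= x < (j + 1) / k]
        m = sum(seg) / len(seg)        # best constant on bin j
        err += sum((s - m) ** 2 for s in seg)
    return err / N

# Expressive power: more parameters -> smaller approximation error.
assert approx_error(4) > approx_error(16) > approx_error(64)
# The flip side: each of the k bin means would be estimated from ~N/k
# noisy samples, so its variance grows in proportion to k -- the
# "stochastic error ... proportional to the number of parameters".
```

Increasing k always improves the approximation side, which is why the estimation error, growing with k, is what limits the useful model size.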
3.
  • Ljung, Lennart, 1946-, et al. (author)
  • A Comment on Leakage in Adaptive Algorithms
  • 1992
  • In: Proceedings of the 4th IFAC International Symposium on Adaptive Systems in Control and Signal Processing. - Linköping : Linköping University. - ISBN 9780080425962, pp. 377-382
  • Conference paper (peer-reviewed) abstract
    • In adaptive control and adaptive signal processing algorithms, "leakage" means that a pull term towards a given parameter value is introduced. Leakage has been introduced both as a trick to be able to prove certain convergence results and as an ad hoc means of obtaining less drifting parameters. Leakage is the same as regularization, and we explain what benefits - from an estimation point of view - this gives.
4.
  • Ljung, Lennart, 1946-, et al. (author)
  • A Comment on Leakage in Adaptive Algorithms
  • 1991
  • Report (other academic/artistic) abstract
    • In adaptive control and adaptive signal processing algorithms, "leakage" means that a pull term towards a given parameter value is introduced. Leakage has been introduced both as a trick to be able to prove certain convergence results and as an ad hoc means of obtaining less drifting parameters. Leakage is the same as regularization, and we explain what benefits - from an estimation point of view - this gives.
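The equivalence asserted in the two abstracts above (leakage = regularization) can be sketched in a few lines. This is an illustrative scalar example, not code from the papers; the function names, step size `mu`, and leakage factor `gamma` are all invented:

```python
def leaky_lms_step(theta, x, y, mu, gamma, theta0):
    """One LMS step with leakage: a pull term towards theta0."""
    e = y - theta * x                          # prediction error for y ~ theta*x
    return theta + mu * e * x - mu * gamma * (theta - theta0)

def regularized_step(theta, x, y, mu, gamma, theta0):
    """One gradient step on 0.5*e**2 + 0.5*gamma*(theta - theta0)**2."""
    e = y - theta * x
    grad = -e * x + gamma * (theta - theta0)   # gradient of the penalized criterion
    return theta - mu * grad

a = leaky_lms_step(0.3, 1.5, 2.0, 0.1, 0.05, 0.0)
b = regularized_step(0.3, 1.5, 2.0, 0.1, 0.05, 0.0)
assert abs(a - b) < 1e-12   # identical updates: leakage == L2 regularization
```

Since the two update rules coincide term by term, choosing the leakage factor is exactly choosing the weight of an L2 penalty towards theta0.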
5.
  • Ljung, Lennart, 1946-, et al. (author)
  • A System Identification Perspective on Neural Nets
  • 1992
  • In: Proceedings of the 1992 IEEE Workshop on Neural Networks for Signal Processing. - Linköping : Linköping University. - ISBN 0780305574, pp. 423-435
  • Report (other academic/artistic) abstract
    • The authors review some of the basic system identification machinery to reveal connections with neural networks. In particular, they point to the role of regularization in dealing with model structures with many parameters, and show the links to overtraining in neural nets. Some provisional explanations for the success of neural nets are also offered.
6.
  •  
7.
  • Ljung, Lennart, 1946-, et al. (author)
  • On the Use of Regularization in System Identification
  • 1993
  • In: Proceedings of the 12th IFAC World Congress. - ISBN 9780080422121, pp. 381-386
  • Conference paper (peer-reviewed) abstract
    • Regularization is a standard statistical technique for dealing with ill-conditioned parameter estimation problems. In this contribution we discuss what possibilities and advantages regularization offers in system identification. In the first place regularization reduces the variance error of a model, but at the same time it introduces a bias. The familiar trade-off between bias and variance error in the choice of model order/structure can therefore be discussed in terms of the regularization parameter. We also show how the well-known problem of parametrizing multivariable systems can be dealt with using overparametrization plus regularization. A characteristic feature of letting regularization settle the parametrization/model structure/model order in this way is that it gives an easy and "automatic" way of finding the important parameters and a good parametrization. No statistical penalty is paid for the overparametrization, but there is a penalty of higher computational burden.
8.
  • Ljung, Lennart, 1946-, et al. (author)
  • On the Use of Regularization in System Identification
  • 1992
  • Report (other academic/artistic) abstract
    • Regularization is a standard statistical technique for dealing with ill-conditioned parameter estimation problems. In this contribution we discuss what possibilities and advantages regularization offers in system identification. In the first place regularization reduces the variance error of a model, but at the same time it introduces a bias. The familiar trade-off between bias and variance error in the choice of model order/structure can therefore be discussed in terms of the regularization parameter. We also show how the well-known problem of parametrizing multivariable systems can be dealt with using overparametrization plus regularization. A characteristic feature of letting regularization settle the parametrization/model structure/model order in this way is that it gives an easy and "automatic" way of finding the important parameters and a good parametrization. No statistical penalty is paid for the overparametrization, but there is a penalty of higher computational burden.
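The central claim of the two abstracts above, that regularization trades variance error for bias, can be checked numerically for a scalar model y = theta*x + noise, whose ridge estimate has the closed form sum(x*y)/(sum(x*x) + lam). A Monte Carlo sketch with entirely invented data and names:

```python
import random

random.seed(0)
theta_true, sigma, lam = 2.0, 1.0, 5.0      # invented scalar system and penalty
xs = [0.5, 1.0, 1.5, 2.0]                   # fixed regressors
S = sum(x * x for x in xs)

def estimate(penalty):
    """Ridge estimate of theta from one fresh noisy data record:
    argmin_theta sum (y - theta*x)**2 + penalty*theta**2."""
    ys = [theta_true * x + random.gauss(0, sigma) for x in xs]
    return sum(x * y for x, y in zip(xs, ys)) / (S + penalty)

ls = [estimate(0.0) for _ in range(2000)]   # penalty = 0: plain least squares
rg = [estimate(lam) for _ in range(2000)]   # ridge-regularized

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((u - m) ** 2 for u in v) / len(v)

assert var(rg) < var(ls)                    # regularization reduces variance ...
assert abs(mean(rg) - theta_true) > abs(mean(ls) - theta_true)   # ... but introduces bias
```

In this scalar case the effect is explicit: the ridge estimate has mean theta_true*S/(S+lam), shrunken away from theta_true, while its variance sigma**2*S/(S+lam)**2 is smaller than the least-squares variance sigma**2/S.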
9.
  • Ljung, Lennart, 1946-, et al. (author)
  • Overtraining, Regularization and Searching for Minimum in Neural Networks
  • 1992
  • In: 4th IFAC Symposium on Adaptive Systems in Control and Signal Processing. - ISBN 9780080425962, pp. 669-674
  • Conference paper (peer-reviewed) abstract
    • Neural network models for dynamical systems have lately been the subject of considerable interest. They are often characterized by the fact that they use a fairly large number of parameters. Here we address the question of why this can be done without the usual penalty in terms of a large variance error. We show that regularization is a key explanation, and that terminating a gradient search ("backpropagation") before the true criterion minimum is found is a way of achieving regularization. This, among other things, also explains the concept of "overtraining" in neural nets.
10.
  • Ljung, Lennart, 1946-, et al. (author)
  • Overtraining, Regularization and Searching for Minimum in Neural Networks
  • 1991
  • Report (other academic/artistic) abstract
    • Neural network models for dynamical systems have lately been the subject of considerable interest. They are often characterized by the fact that they use a fairly large number of parameters. Here we address the question of why this can be done without the usual penalty in terms of a large variance error. We show that regularization is a key explanation, and that terminating a gradient search ("backpropagation") before the true criterion minimum is found is a way of achieving regularization. This, among other things, also explains the concept of "overtraining" in neural nets.
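The mechanism named in the last two abstracts, terminating a gradient search before the criterion minimum as a form of regularization, is easiest to see on a quadratic criterion. Starting from theta = 0, each gradient step moves the estimate a fixed fraction of the remaining distance to the least-squares minimum, so stopping early leaves a shrunken (regularized) estimate. A minimal sketch with invented data:

```python
# Gradient descent on the quadratic criterion 0.5 * sum (y - theta*x)**2,
# started at theta = 0. Data and step size are invented for illustration.
xs = [0.5, 1.0, 1.5]
ys = [1.1, 1.9, 3.2]
S = sum(x * x for x in xs)
b = sum(x * y for x, y in zip(xs, ys))
theta_ls = b / S                        # unregularized (least-squares) minimum

def gd(iters, mu=0.1):
    theta = 0.0
    for _ in range(iters):
        theta -= mu * (S * theta - b)   # one gradient step
    return theta

early, late = gd(3), gd(200)
# Fewer iterations leave the estimate closer to the starting point 0,
# i.e. more strongly shrunken -- the same effect as an explicit penalty.
assert 0.0 < early < theta_ls
assert abs(late - theta_ls) < 1e-6      # run long enough, shrinkage disappears
```

Here theta after t steps equals theta_ls * (1 - (1 - mu*S)**t), so the iteration count plays exactly the role of an (inverse) regularization parameter; "overtraining" corresponds to letting t grow until the shrinkage, and its variance-reducing benefit, is gone.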
