SwePub

Results list for the search "WFRF:(Lambert J. C.) srt2:(2000-2004)"


  • Results 1-4 of 4
1.
2.
  • Spaanenburg, Lambert, et al. (author)
  • Natural learning of neural networks by reconfiguration
  • 2003
  • In: SPIE Proceedings on Bioengineered and Bioinspired Systems. - : SPIE. - 1996-756X .- 0277-786X. ; 5119, pp. 273-284
  • Conference paper (peer-reviewed) abstract
    • The communication and computational demands of neural networks are hard to satisfy in digital technology. Temporal computing solves this problem by iteration, but leaves a slow network. Spatial computing was not an option until the arrival of modern FPGA devices. The paper shows how a small feed-forward neural module can be configured on the limited logic blocks between the RAM and multiplier macros. It then describes how, by spatial unrolling or by reconfiguration, a large modular ANN can be built from such modules.
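The abstract above describes building a large modular ANN out of small fixed-size feed-forward modules, each sized to fit one hardware block. A minimal sketch of that composition idea in NumPy (our own illustration, not the paper's FPGA implementation; the module size and sigmoid activation are assumptions):

```python
import numpy as np

def module_forward(x, W, b):
    """One small feed-forward module: a dense layer plus sigmoid,
    sized so it could fit a fixed logic block."""
    return 1.0 / (1.0 + np.exp(-(W @ x + b)))

def modular_forward(x, modules):
    """Spatial unrolling: chain identical small modules into a larger
    network; each (W, b) pair plays the role of one configured block."""
    for W, b in modules:
        x = module_forward(x, W, b)
    return x

rng = np.random.default_rng(0)
# Three 4-in/4-out modules chained, mimicking repeated reconfiguration
# of a single block (or three blocks laid out spatially).
modules = [(rng.standard_normal((4, 4)) * 0.5, np.zeros(4)) for _ in range(3)]
y = modular_forward(np.ones(4), modules)
print(y.shape)  # (4,)
```

In the temporal-computing variant the same `(W, b)` block would be reloaded per step instead of replicated; the loop body is identical either way.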
3.
  • van der Zwaag, B.J., et al. (author)
  • Extracting Knowledge from Neural Networks in Image Processing
  • 2003
  • In: Innovations in Knowledge Engineering. ; , pp. 107-127
  • Book chapter (other academic/artistic) abstract
    • Despite their success story, artificial neural networks have one major disadvantage compared to other techniques: the inability to explain comprehensively how a trained neural network reaches its output; neural networks are not only (incorrectly) seen as a “magic tool” but possibly even more as a mysterious “black box.” Although much research has already been done to “open the box,” there is a notable hiatus in known publications on the analysis of neural networks. So far, mainly sensitivity analysis and rule extraction methods have been used to analyze neural networks. However, these can only be applied in a limited subset of the problem domains where neural network solutions are encountered. In this chapter we propose a more widely applicable method which, for a given problem domain, involves identifying basic functions with which users in that domain are already familiar, and describing trained neural networks, or parts thereof, in terms of those basic functions. This provides a comprehensible description of the neural network’s function and, depending on the chosen basic functions, it may also provide insight into the neural network’s inner “reasoning.” To illustrate our method, the elements of a feedforward-backpropagation neural network that has been trained to detect edges in images are described in terms of differential operators of various orders and with various angles of operation. The results are then compared with image filters known from the literature, which we analyzed in the same way.
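The chapter's idea of describing trained weights in terms of familiar basic functions can be sketched very simply: measure how well a learned kernel correlates with known differential operators. A toy illustration (our own code, not the chapter's method; the Sobel basis and the noisy "learned" kernel are assumptions for demonstration):

```python
import numpy as np

# Known first-order differential operators as a reference basis.
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
SOBEL_Y = SOBEL_X.T

def similarity(kernel, basis):
    """Normalized correlation between a learned kernel and a basis
    operator; 1.0 means the unit acts exactly like that operator."""
    k = kernel.ravel() - kernel.mean()
    b = basis.ravel() - basis.mean()
    return float(k @ b / (np.linalg.norm(k) * np.linalg.norm(b)))

# Suppose a hidden unit learned roughly a horizontal-gradient filter:
learned = SOBEL_X + 0.1 * np.random.default_rng(1).standard_normal((3, 3))

sim_x = similarity(learned, SOBEL_X)  # close to 1: acts like d/dx
sim_y = similarity(learned, SOBEL_Y)  # near 0: little d/dy component
print(sim_x, sim_y)
```

Repeating this against a family of operators of various orders and orientations yields the kind of interpretable description the abstract refers to.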
4.
  • van der Zwaag, B.J., et al. (author)
  • Translating feed-forward nets to SOM-like maps
  • 2003
  • In: Proceedings ProRisc'03. - 9073461391 ; , pp. 447-452
  • Conference paper (peer-reviewed) abstract
    • A major disadvantage of feedforward neural networks is still the difficulty of gaining insight into their internal functionality. This is much less the case for, e.g., nets that are trained unsupervised, such as Kohonen’s self-organizing feature maps (SOMs). These offer a direct view into the stored knowledge, as their internal knowledge is stored in the same format as the input data that was used for training or is used for evaluation. This paper discusses a mathematical transformation of a feed-forward network into a SOM-like structure such that its internal knowledge can be visually interpreted. This is particularly applicable to networks trained in the general classification problem domain.
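The key property of the SOM-like representation the abstract describes is that knowledge is stored in the same format as the input data. A toy sketch of that end result (our own illustration, not the paper's actual transformation: here class prototypes in input space stand in for the derived codebook, and the two-Gaussian data set is invented):

```python
import numpy as np

rng = np.random.default_rng(2)
# Two Gaussian classes in a 2-D input space.
X = np.vstack([rng.normal([0, 0], 0.3, (50, 2)),
               rng.normal([2, 2], 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# "SOM-like" codebook: one prototype per class, living in input space,
# so it can be inspected exactly like the training data itself.
codebook = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def classify(x):
    """Nearest-prototype decision, interpretable like a SOM
    best-matching-unit lookup."""
    return int(np.argmin(np.linalg.norm(codebook - x, axis=1)))

print(codebook.round(1))  # prototypes readable in the input format
```

The paper's contribution is the mathematical transformation that derives such a codebook from an already-trained feed-forward network's weights; the sketch only shows why the resulting format is easy to interpret.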
