SwePub
Search the SwePub database


Hit list for search "WFRF:(Lansner Anders Professor 1949 ) srt2:(2023)"


  • Results 1-4 of 4
1.
  • Lansner, Anders, Professor, 1949-, et al. (author)
  • Fast Hebbian plasticity and working memory
  • 2023
  • In: Current Opinion in Neurobiology. - : Elsevier Ltd. - 0959-4388 .- 1873-6882. ; 83
  • Journal article (peer-reviewed), abstract:
    • Theories and models of working memory (WM) have, since at least the mid-1990s, been dominated by the persistent activity hypothesis. The past decade has seen rising concerns about the shortcomings of sustained activity as the mechanism for short-term maintenance of WM information, in light of accumulating experimental evidence for so-called activity-silent WM and the fundamental difficulty of explaining robust multi-item WM. In consequence, alternative theories are now being explored, mostly in the direction of fast synaptic plasticity as the underlying mechanism. The question of non-Hebbian vs Hebbian synaptic plasticity emerges naturally in this context. In this review, we focus on fast Hebbian plasticity and trace the origins of WM theories and models building on this form of associative learning. (A minimal illustrative sketch of a fast Hebbian update follows this entry.)
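The review above centres on fast Hebbian plasticity as a working-memory mechanism. As a rough illustration only, the following Python sketch encodes one item with a one-shot Hebbian outer-product update, lets the weights decay silently during a delay (no persistent activity), and then recalls the item from a degraded cue. The network size, learning rate, and decay constant are arbitrary choices for illustration, not parameters from any of the reviewed models.

import numpy as np

rng = np.random.default_rng(0)
N = 100                      # number of units (arbitrary)
eta, decay = 0.5, 0.05       # fast learning rate, slow weight decay (arbitrary)

x = rng.choice([-1.0, 1.0], size=N)   # item to remember, a +/-1 pattern
W = np.zeros((N, N))

# One-shot fast Hebbian encoding: strengthen weights between co-active units.
W += eta * np.outer(x, x) / N
np.fill_diagonal(W, 0.0)

# Delay period with no persistent activity: the synaptic trace only decays.
for _ in range(10):
    W *= 1.0 - decay

# Retrieval from a degraded cue: flip 20% of the entries, then let it settle.
cue = x.copy()
cue[rng.choice(N, size=N // 5, replace=False)] *= -1.0
for _ in range(5):
    cue = np.where(W @ cue >= 0.0, 1.0, -1.0)

print("overlap with stored item:", float(cue @ x) / N)   # ~1.0 means recalled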
2.
3.
  • Ravichandran, Naresh Balaji, et al. (author)
  • Brain-like Combination of Feedforward and Recurrent Network Components Achieves Prototype Extraction and Robust Pattern Recognition
  • 2023
  • In: Lecture Notes in Computer Science. - Cham : Springer Nature. ; pp. 488-501
  • Conference paper (peer-reviewed), abstract:
    • Associative memory has been a prominent candidate for the computation performed by the massively recurrent neocortical networks. Attractor networks implementing associative memory have offered mechanistic explanations for many cognitive phenomena. However, attractor memory models are typically trained using orthogonal or random patterns to avoid interference between memories, which makes them infeasible for naturally occurring complex correlated stimuli like images. We approach this problem by combining a recurrent attractor network with a feedforward network that learns distributed representations using an unsupervised Hebbian-Bayesian learning rule. The resulting network model incorporates many known biological properties: unsupervised learning, Hebbian plasticity, sparse distributed activations, sparse connectivity, columnar and laminar cortical architecture, etc. We evaluate the synergistic effects of the feedforward and recurrent network components in complex pattern recognition tasks on the MNIST handwritten digits dataset. We demonstrate that the recurrent attractor component implements associative memory when trained on the feedforward-driven internal (hidden) representations. The associative memory is also shown to perform prototype extraction from the training data and to make the representations robust to severely distorted input. We argue that several aspects of the proposed integration of feedforward and recurrent computations are particularly attractive from a machine learning perspective. (A minimal illustrative sketch of this feedforward-plus-attractor combination follows this entry.)
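As a rough sketch only of the feedforward-plus-recurrent division of labour described above: the Python below substitutes a fixed random projection with a sign nonlinearity for the paper's learned Hebbian-Bayesian (BCPNN) representations, and a plain Hopfield-style Hebbian rule for the recurrent attractor. All sizes, the pattern count, and the noise level are arbitrary assumptions, not values from the paper.

import numpy as np

rng = np.random.default_rng(1)
n_in, n_hid, n_items = 64, 200, 5     # arbitrary sizes

# "Feedforward" stage: fixed random projection + sign nonlinearity, standing
# in for the learned distributed representations of the paper.
F = rng.normal(size=(n_hid, n_in))
def encode(v):
    return np.where(F @ v >= 0.0, 1.0, -1.0)

inputs = rng.normal(size=(n_items, n_in))
codes = np.array([encode(v) for v in inputs])

# "Recurrent" stage: Hebbian outer-product weights over the hidden codes,
# so the attractor memory is trained on the feedforward-driven representations.
W = codes.T @ codes / n_hid
np.fill_diagonal(W, 0.0)

# Recognize a distorted input by letting the attractor settle on the
# hidden representation produced by the feedforward stage.
noisy = inputs[2] + rng.normal(size=n_in)
state = encode(noisy)
for _ in range(10):
    state = np.where(W @ state >= 0.0, 1.0, -1.0)

print("best-matching stored item:", int(np.argmax(codes @ state)))  # expect 2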
4.
  • Wang, Deyu, et al. (author)
  • A Memristor-Based Learning Engine for Synaptic Trace-Based Online Learning
  • 2023
  • In: IEEE Transactions on Biomedical Circuits and Systems. - : Institute of Electrical and Electronics Engineers (IEEE). - 1932-4545 .- 1940-9990. ; 17:5, pp. 1153-1165
  • Journal article (peer-reviewed), abstract:
    • The memristor has been extensively used to facilitate synaptic online learning in brain-inspired spiking neural networks (SNNs). However, current memristor-based work cannot support the widely used yet sophisticated trace-based learning rules, including the trace-based Spike-Timing-Dependent Plasticity (STDP) and Bayesian Confidence Propagation Neural Network (BCPNN) learning rules. This paper proposes a learning engine for trace-based online learning, consisting of memristor-based blocks and analog computing blocks. The memristor mimics the synaptic trace dynamics by exploiting the nonlinear physical properties of the device; the analog computing blocks perform the addition, multiplication, logarithmic, and integral operations. By organizing these building blocks, a reconfigurable learning engine is architected and realized to simulate the STDP and BCPNN online learning rules, using memristors and 180 nm analog CMOS technology. The results show that the proposed learning engine achieves energy consumption of 10.61 pJ and 51.49 pJ per synaptic update for the STDP and BCPNN learning rules, respectively: a 147.03× and 93.61× reduction compared to the 180 nm ASIC counterparts, and a 9.39× and 5.63× reduction compared to the 40 nm ASIC counterparts. Compared with the state-of-the-art Loihi and eBrainII, the learning engine reduces the energy per synaptic update by 11.31× and 13.13× for the trace-based STDP and BCPNN learning rules, respectively. (A minimal illustrative sketch of trace-based STDP follows this entry.)
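As a rough illustration only, the following Python expresses the pair-based, trace-based STDP rule in plain discrete time: each spike bumps an exponentially decaying trace, and weight updates read the opposite trace at spike times (on the proposed engine, this trace decay is what the memristor's nonlinear dynamics emulate in hardware). The time constants, amplitudes, and spike rates are arbitrary textbook-style values, not parameters from the paper; the BCPNN rule is not sketched here.

import numpy as np

dt = 1.0                        # ms per step
tau_pre, tau_post = 20.0, 20.0  # trace time constants (ms), arbitrary
A_plus, A_minus = 0.01, 0.012   # LTP / LTD amplitudes, arbitrary

T = 200
rng = np.random.default_rng(2)
pre_spikes = rng.random(T) < 0.05    # Poisson-like pre/post spike trains
post_spikes = rng.random(T) < 0.05

w, x_pre, x_post = 0.5, 0.0, 0.0
for t in range(T):
    # Trace dynamics: exponential decay plus a unit bump on each spike.
    x_pre += -dt / tau_pre * x_pre + (1.0 if pre_spikes[t] else 0.0)
    x_post += -dt / tau_post * x_post + (1.0 if post_spikes[t] else 0.0)

    if post_spikes[t]:            # pre-before-post pairing => potentiation
        w += A_plus * x_pre
    if pre_spikes[t]:             # post-before-pre pairing => depression
        w -= A_minus * x_post
    w = min(max(w, 0.0), 1.0)     # clip to the allowed conductance range

print(f"final weight: {w:.4f}")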
