SwePub

Result list for the search "WFRF:(Martinez Mayorquin Ramon Heberto)"

Search: WFRF:(Martinez Mayorquin Ramon Heberto)

  • Results 1-4 of 4
2.
  • Martinez Mayorquin, Ramon Heberto, et al. (author)
  • Probabilistic associative learning suffices for learning the temporal structure of multiple sequences
  • 2019
  • In: PLOS ONE. - Public Library Science. - 1932-6203. ; 14:8
  • Journal article (peer-reviewed), abstract:
    • From memorizing a musical tune to navigating a well-known route, many of our underlying behaviors have a strong temporal component. While the mechanisms behind the sequential nature of the underlying brain activity are likely multifarious and multi-scale, in this work we attempt to characterize to what degree some of these properties can be explained as a consequence of simple associative learning. To this end, we employ a parsimonious firing-rate attractor network equipped with the Hebbian-like Bayesian Confidence Propagation Neural Network (BCPNN) learning rule relying on synaptic traces with asymmetric temporal characteristics. The proposed network model is able to encode and reproduce temporal aspects of the input, and offers internal control of the recall dynamics by gain modulation. We provide an analytical characterisation of the relationship between the structure of the weight matrix, the dynamical network parameters and the temporal aspects of sequence recall. We also present a computational study of the performance of the system under the effects of noise for an extensive region of the parameter space. Finally, we show how the inclusion of modularity in our network structure facilitates the learning and recall of multiple overlapping sequences even in a noisy regime.
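The abstract above rests on the BCPNN weight structure: connections are derived from unit activation and co-activation probabilities, and recall speed is controlled by gain-modulating the resulting weights. Below is a minimal sketch of that Hebbian-Bayesian weight computation in Python; the variable names, the clipping floor, and the assumption that the probabilities have already been estimated from (asymmetrically filtered) synaptic traces are illustrative, not taken from the paper.

```python
import numpy as np

def bcpnn_weights(p, p_joint, eps=1e-6):
    """Hebbian-Bayesian (BCPNN-style) weights and biases.

    p       : vector of unit activation probabilities p_i
    p_joint : matrix of co-activation probabilities p_ij
    eps     : illustrative floor that avoids log(0)."""
    p = np.clip(p, eps, 1.0)
    p_joint = np.clip(p_joint, eps ** 2, 1.0)
    w = np.log(p_joint / np.outer(p, p))   # w_ij = log(p_ij / (p_i * p_j))
    bias = np.log(p)                       # b_j  = log(p_j)
    return w, bias
```

Because the pre- and postsynaptic traces are filtered with different time constants, the estimated co-activation matrix, and hence w, becomes asymmetric, which is what lets the attractor dynamics drift from one pattern to the next during recall; scaling w and bias by a common gain is one way to read the "internal control of the recall dynamics by gain modulation" mentioned in the abstract.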
3.
  • Martinez Mayorquin, Ramon Heberto, et al. (author)
  • Sequence Disambiguation with Synaptic Traces in Associative Neural Networks
  • 2019
  • In: 28th International Conference on Artificial Neural Networks, ICANN 2019. - Cham : Springer Nature, pp. 793-805
  • Conference paper (peer-reviewed), abstract:
    • Among the abilities that a sequence-processing network should possess, sequence disambiguation, that is, the ability to let temporal context information influence the evolution of the network dynamics, is one of the most important. In this work we propose an instance of the Bayesian Confidence Propagation Neural Network (BCPNN) that learns sequences with probabilistic associative learning and is able to disambiguate sequences with the use of synaptic traces (low-pass filtered versions of the activity). We first describe how the BCPNN achieves both sequence recall and sequence learning from temporal input. Our main result is that the BCPNN network equipped with dynamical memory in the form of synaptic traces is capable of solving the sequence disambiguation problem in a reliable way. We characterize the relationship between the sequence disambiguation capabilities of the network and its dynamical parameters. Furthermore, we show that the inclusion of an additional fast synaptic trace greatly increases the network's disambiguation capabilities.
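The synaptic traces in this abstract are described as low-pass filtered versions of the unit activity; keeping one slow and one fast trace gives the network a short memory of preceding sequence elements, which is what resolves overlapping (ambiguous) sequences. A minimal sketch of such trace updates, with illustrative time constants that are not the paper's values:

```python
import numpy as np

def update_traces(z_fast, z_slow, activity, dt=0.001,
                  tau_fast=0.005, tau_slow=0.050):
    """One Euler step of two exponential low-pass filters
    ("synaptic traces") of the unit activities.

    The fast trace follows the current sequence element, while the
    slower trace retains context from earlier elements; that retained
    context is what allows overlapping sequences to be told apart."""
    z_fast = z_fast + dt * (activity - z_fast) / tau_fast
    z_slow = z_slow + dt * (activity - z_slow) / tau_slow
    return z_fast, z_slow
```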
4.
  • Martinez Mayorquin, Ramon Heberto (author)
  • Sequence learning in the Bayesian Confidence Propagation Neural Network
  • 2022
  • Doctoral thesis (other academic/artistic), abstract:
    • This thesis examines sequence learning in the Bayesian Confidence Propagation Neural Network (BCPNN). The methodology utilized throughout this work is computational and analytical in nature, and the contributions presented here can be understood along the following four major themes: 1) This work starts by revisiting the properties of the BCPNN as an attractor neural network and then provides a novel formalization of some of those properties. First, a Bayesian theoretical framework for the lower bounds in the BCPNN. Second, a differential formulation of the BCPNN plasticity rule that highlights its relationship to similar rules in the learning literature. Third, closed-form analytical results for the BCPNN training process. 2) After that, this work describes how the addition of an adaptation process to the BCPNN enables its sequence recall capabilities. The specific mechanisms of sequence learning are then studied in detail, as well as the properties of sequence recall such as the persistence time (how long the network remains in a specific state during sequence recall) and its robustness to noise. 3) This work also shows how the BCPNN can be enhanced with memory traces of the activity (z-traces) to provide the network with disambiguation capabilities. 4) Finally, this work provides a computational study to quantify the number of sequences that the BCPNN can store successfully. Alongside these central themes, results concerning robustness, stability and the relationship between the learned patterns and the input statistics are presented in either computational or analytical form. The thesis concludes with a discussion of the sequence learning capabilities of the BCPNN in the context of the wider literature and describes both its advantages and disadvantages with respect to other attractor neural networks.
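One of the thesis themes is a differential formulation of the BCPNN plasticity rule, in which probability traces are leaky integrals of the z-traces and the weights are read out as a log ratio. The sketch below shows one Euler step of such an online update, assuming rate-coded activities in [0, 1]; the time constant, the probability floor, and the exact form are illustrative reconstructions, not values or code from the thesis.

```python
import numpy as np

def bcpnn_plasticity_step(p_pre, p_post, p_joint, z_pre, z_post,
                          dt=0.001, tau_p=5.0, eps=1e-6):
    """One Euler step of a differential (online) BCPNN-style update.

    The probability traces relax towards the current pre/post z-trace
    activities and their outer product; the weight matrix is then read
    out as the log ratio, matching the batch rule sketched earlier."""
    p_pre   = p_pre + dt * (z_pre - p_pre) / tau_p
    p_post  = p_post + dt * (z_post - p_post) / tau_p
    p_joint = p_joint + dt * (np.outer(z_pre, z_post) - p_joint) / tau_p
    w = np.log(np.maximum(p_joint, eps) /
               np.outer(np.maximum(p_pre, eps), np.maximum(p_post, eps)))
    return p_pre, p_post, p_joint, w
```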