SwePub
Search the SwePub database

  Advanced search

Result list for the search "WFRF:(Kunkel Susanne)"

Search: WFRF:(Kunkel Susanne)

  • Results 1-6 of 6
Sort/group the result list
1.
  • Hahne, J., et al. (authors)
  • Including gap junctions into distributed neuronal network simulations
  • 2016
  • In: 2nd International Workshop on Brain-Inspired Computing, BrainComp 2015. - Cham : Springer Publishing Company. - 9783319508610 ; pp. 43-57
  • Conference paper (peer-reviewed), abstract:
    • Contemporary simulation technology for neuronal networks enables the simulation of brain-scale networks using neuron models with a single or a few compartments. However, distributed simulations at full cell density still lack the electrical coupling between cells via so-called gap junctions. This is due to the absence of efficient algorithms to simulate gap junctions on large parallel computers. The difficulty is that gap junctions require an instantaneous interaction between the coupled neurons, whereas the efficiency of simulation codes for spiking neurons relies on delayed communication. In a recent paper [15] we describe a technology to overcome this obstacle. Here, we give an overview of the challenges of including gap junctions in a distributed simulation scheme for neuronal networks and present an implementation of the new technology available in the NEural Simulation Tool (NEST 2.10.0). Subsequently, we introduce the usage of gap junctions in model scripts as well as benchmarks assessing the performance and overhead of the technology on the supercomputers JUQUEEN and the K computer.
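The gap-junction usage mentioned at the end of this abstract can be sketched in PyNEST. This is a rough illustration rather than code from the paper: the paper targets NEST 2.10.0, whereas the sketch assumes the newer NEST 3.x syntax; hh_psc_alpha_gap and gap_junction are the gap-junction-capable neuron and connection models that ship with NEST.

```python
import nest

nest.ResetKernel()

# Two Hodgkin-Huxley point neurons with gap-junction support.
a = nest.Create("hh_psc_alpha_gap")
b = nest.Create("hh_psc_alpha_gap")
a.I_e = 400.0  # drive one neuron so the coupling has a visible effect

# Gap junctions are electrical and symmetric; make_symmetric creates
# the connection in both directions in a single call.
nest.Connect(a, b,
             conn_spec={"rule": "one_to_one", "make_symmetric": True},
             syn_spec={"synapse_model": "gap_junction", "weight": 0.5})

nest.Simulate(100.0)  # ms
```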
2.
3.
  • Jordan, Jakob, et al. (authors)
  • Extremely Scalable Spiking Neuronal Network Simulation Code : From Laptops to Exascale Computers
  • 2018
  • In: Frontiers in Neuroinformatics. - Frontiers Media SA. - 1662-5196 ; 12
  • Journal article (peer-reviewed), abstract:
    • State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.
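The core idea of the paper's communication scheme, exchanging spikes only with the compute nodes that actually host targets instead of broadcasting every spike everywhere, can be illustrated with a toy sketch. This is a conceptual illustration in plain Python, not the paper's actual data structures; in a real simulator the buffers would feed a directed MPI exchange.

```python
from collections import defaultdict

# Toy sketch of directed spike communication for sparse networks.
# Source side: per local neuron, remember only WHICH ranks host its targets.
# Target side: each receiving rank resolves spikes to its local synapses.

target_ranks = {      # local source neuron id -> ranks hosting its targets
    0: {1},           # neuron 0 projects only to rank 1
    1: {2, 3},        # neuron 1 projects to ranks 2 and 3
}

def build_send_buffers(spiking_neurons):
    """Pack each spike only into the buffers of ranks that need it."""
    buffers = defaultdict(list)
    for neuron in spiking_neurons:
        for rank in target_ranks.get(neuron, ()):
            buffers[rank].append(neuron)
    return buffers    # in a real run these would feed e.g. MPI_Alltoallv

print(dict(build_send_buffers([0, 1])))  # {1: [0], 2: [1], 3: [1]}
```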
4.
  • Kunkel, Susanne, et al. (authors)
  • The NEST Dry-Run Mode : Efficient Dynamic Analysis of Neuronal Network Simulation Code
  • 2017
  • In: Frontiers in Neuroinformatics. - Frontiers Media SA. - 1662-5196 ; 11
  • Journal article (peer-reviewed), abstract:
    • NEST is a simulator for spiking neuronal networks that commits to a general-purpose approach: it allows for high flexibility in the design of network models, and its applications range from small-scale simulations on laptops to brain-scale simulations on supercomputers. Hence, developers need to test their code for various use cases and ensure that changes to the code do not impair scalability. However, running a full set of benchmarks on a supercomputer takes up precious compute-time resources and can entail long queuing times. Here, we present the NEST dry-run mode, which enables comprehensive dynamic code analysis without requiring access to high-performance computing facilities. A dry-run simulation is carried out by a single process, which performs all simulation steps except communication as if it were part of a parallel environment with many processes. We show that measurements of memory usage and runtime of neuronal network simulations closely match the corresponding dry-run data. Furthermore, we demonstrate the successful application of the dry-run mode in the areas of profiling and performance modeling.
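The mechanism described in this abstract, one process running all simulation steps except communication as if it were one of many, can be caricatured in a few lines. This is a conceptual sketch of the dry-run idea, not NEST's actual interface; all names and numbers below are made up for illustration.

```python
import random
import time

# Conceptual dry-run sketch (NOT NEST's interface): a single process pretends
# to be one rank out of num_procs and replaces the communication step with
# synthetic "remote" spikes, so per-process memory and runtime stay realistic.
num_procs = 1024
neurons_per_proc = 10_000
rate_hz, dt_ms = 8.0, 0.1

def fake_receive():
    """Stand-in for the MPI exchange: plausible spike counts per remote rank."""
    expected = neurons_per_proc * rate_hz * dt_ms / 1000.0
    return [max(0, int(random.gauss(expected, expected ** 0.5)))
            for _ in range(num_procs)]

t0 = time.perf_counter()
for step in range(1000):            # 100 ms of model time at 0.1 ms resolution
    remote_spikes = fake_receive()  # the only step that differs from a real run
    # ...deliver spikes and update the local neurons as in a real simulation...
print(f"dry-run wall time: {time.perf_counter() - t0:.2f} s")
```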
5.
  • Pauli, Robin, et al. (authors)
  • Reproducing Polychronization : A Guide to Maximizing the Reproducibility of Spiking Network Models
  • 2018
  • In: Frontiers in Neuroinformatics. - Frontiers Media S.A. - 1662-5196 ; 12
  • Journal article (peer-reviewed), abstract:
    • Any modeler who has attempted to reproduce a spiking neural network model from its description in a paper has discovered what a painful endeavor this is. Even when all parameters appear to have been specified, which is rare, typically the initial attempt to reproduce the network does not yield results that are recognizably akin to those in the original publication. Causes include inaccurately reported or hidden parameters (e.g., a wrong unit or the existence of an initialization distribution), differences in implementation of model dynamics, and ambiguities in the text description of the network experiment. The very fact that adequate reproduction often cannot be achieved until a series of such causes has been tracked down and resolved is in itself disconcerting, as it reveals unreported model dependencies on specific implementation choices that either were not clear to the original authors, or that they chose not to disclose. In either case, such dependencies diminish the credibility of the model's claims about the behavior of the target system. To demonstrate these issues, we provide a worked example of reproducing a seminal study for which, unusually, source code was provided at the time of publication. Despite this seemingly optimal starting position, reproducing the results was time-consuming and frustrating. Further examination of the correctly reproduced model reveals that it is highly sensitive to implementation choices such as the realization of background noise, the integration timestep, and the thresholding parameter of the analysis algorithm. From this process, we derive a guideline of best practices that would substantially reduce the investment in reproducing neural network studies, whilst simultaneously increasing their scientific quality. We propose that this guideline can be used by authors and reviewers to assess and improve the reproducibility of future network models.
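One practice such a guideline implies, pinning down and reporting every source of nondeterminism instead of relying on simulator defaults, is easy to show in PyNEST. A minimal sketch, assuming the NEST 3.x API; the model and parameter choices below are illustrative and not the paper's actual setup:

```python
import nest

nest.ResetKernel()
nest.resolution = 0.1   # integration timestep in ms: report it, do not assume it
nest.rng_seed = 12345   # seed for all random numbers: report it

# Background noise with an explicitly defined realization (rate + seed above).
noise = nest.Create("poisson_generator", params={"rate": 10.0})
neurons = nest.Create("izhikevich", 100)
nest.Connect(noise, neurons, syn_spec={"weight": 5.0, "delay": 1.0})

nest.Simulate(1000.0)   # rerunning this script reproduces the identical spikes
```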
6.
  • Svensson, Torbjörn, et al. (authors)
  • Sweden.
  • 2009
  • In: The International Handbook on Aging. - 9780313352300 ; pp. 521-537
  • Book chapter (popular science, debate, etc.)