SwePub

Result list for search "WFRF:(Rachkovskij Dmitri A.)"

  • Results 1-9 of 9

1.
  • Grytsenko, Vladimir I., et al. (authors)
  • Neural Distributed Autoassociative Memories: A Survey.
  • 2017
  • In: Cybernetics and Computer Engineering Journal. NASU (National Academy of Sciences of Ukraine). ISSN 0454-9910, 2519-2205. 188:2, pp. 5-35
  • Journal article (peer-reviewed), abstract:
    • Introduction. Neural network models of autoassociative, distributed memory allow storage and retrieval of many items (vectors), where the number of stored items can exceed the vector dimension (the number of neurons in the network). This opens the possibility of a sublinear-time search (in the number of stored items) for approximate nearest neighbors among vectors of high dimension. The purpose of this paper is to review models of autoassociative, distributed memory that can be naturally implemented by neural networks (mainly with local learning rules and iterative dynamics based on information locally available to neurons). Scope. The survey focuses mainly on the networks of Hopfield, Willshaw, and Potts, which have connections between pairs of neurons and operate on sparse binary vectors. We discuss not only autoassociative memory but also the generalization properties of these networks. We also consider neural networks with higher-order connections and networks with a bipartite graph structure for non-binary data with linear constraints. Conclusions. In conclusion, we discuss the relations to similarity search, the advantages and drawbacks of these techniques, and topics for further research. An interesting and still not completely resolved question is whether neural autoassociative memories can search for approximate nearest neighbors faster than other index structures for similarity search, in particular for very high-dimensional vectors.
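A minimal sketch (not from the survey) of the kind of network it covers: a Hopfield-style autoassociative memory with a local Hebbian learning rule and iterative retrieval dynamics. The dimension, pattern count, and noise level are illustrative choices.

```python
# A Hopfield-style autoassociative memory: Hebbian (outer-product) learning
# with no self-connections, and retrieval by iterating the sign dynamics.
import numpy as np

rng = np.random.default_rng(0)
D = 256                                        # neurons / vector dimension
patterns = rng.choice([-1, 1], size=(10, D))   # bipolar patterns to store

W = patterns.T @ patterns / D                  # local Hebbian learning rule
np.fill_diagonal(W, 0.0)                       # no self-connections

def retrieve(cue, steps=20):
    """Iterate the network dynamics until the state (hopefully) settles."""
    s = cue.astype(float)
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1.0                        # break ties deterministically
    return s

cue = patterns[0].copy()                       # corrupt a stored pattern:
flip = rng.choice(D, size=D // 5, replace=False)
cue[flip] *= -1                                # flip 20% of its bits
print(np.array_equal(retrieve(cue), patterns[0]))   # typically True
```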
2.
  • Kleyko, Denis, et al. (authors)
  • A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part I: Models and Data Transformations
  • 2023
  • In: ACM Computing Surveys. Association for Computing Machinery (ACM). ISSN 0360-0300, 1557-7341. 55:6
  • Journal article (peer-reviewed), abstract:
    • This two-part comprehensive survey is devoted to a computing framework most commonly known under the names Hyperdimensional Computing and Vector Symbolic Architectures (HDC/VSA). Both names refer to a family of computational models that use high-dimensional distributed representations and rely on the algebraic properties of their key operations to incorporate the advantages of structured symbolic representations and distributed vector representations. Notable models in the HDC/VSA family are Tensor Product Representations, Holographic Reduced Representations, Multiply-Add-Permute, Binary Spatter Codes, and Sparse Binary Distributed Representations, but there are other models too. HDC/VSA is a highly interdisciplinary field with connections to computer science, electrical engineering, artificial intelligence, mathematics, and cognitive science, which makes it challenging to create a thorough overview of the field. However, due to a surge of new researchers joining the field in recent years, the need for a comprehensive survey has become pressing. Therefore, among other aspects of the field, this Part I surveys the known computational models of HDC/VSA and the transformations of various input data types to high-dimensional distributed representations. Part II of this survey [84] is devoted to applications, cognitive computing and architectures, as well as directions for future work. The survey is written to be useful for both newcomers and practitioners.
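As a concrete illustration of the "key operations" the survey refers to, here is a minimal sketch in the style of the Multiply-Add-Permute model it names; the dimension, helper names, and the record example are illustrative assumptions, not code from the survey.

```python
# Core HDC/VSA operations in the Multiply-Add-Permute style: binding is
# elementwise multiplication, superposition is thresholded addition, and
# permutation is a cyclic shift. Similarity is the normalized dot product.
import numpy as np

rng = np.random.default_rng(1)
D = 10_000
hv = lambda: rng.choice([-1, 1], size=D)      # random bipolar hypervector

def bind(a, b):                               # invertible: bind(bind(a,b),b)=a
    return a * b

def bundle(*vs):                              # superposition of several vectors
    s = np.sign(np.sum(vs, axis=0)).astype(int)
    s[s == 0] = 1
    return s

def permute(a, k=1):                          # used e.g. to mark positions
    return np.roll(a, k)

sim = lambda a, b: a @ b / D                  # cosine for bipolar vectors

# Encode the record {name: x, value: y}, then query "name" by unbinding.
name, value, x, y = hv(), hv(), hv(), hv()
record = bundle(bind(name, x), bind(value, y))
print(sim(bind(record, name), x))             # high (about 0.5 here)
print(sim(bind(record, name), y))             # near 0
```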
3.
  • Kleyko, Denis, et al. (authors)
  • A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part II: Applications, Cognitive Models, and Challenges
  • 2023
  • In: ACM Computing Surveys. Association for Computing Machinery. ISSN 0360-0300, 1557-7341. 55:9
  • Journal article (peer-reviewed), abstract:
    • This is Part II of the two-part comprehensive survey devoted to a computing framework most commonly known under the names Hyperdimensional Computing and Vector Symbolic Architectures (HDC/VSA). Both names refer to a family of computational models that use high-dimensional distributed representations and rely on the algebraic properties of their key operations to incorporate the advantages of structured symbolic representations and distributed vector representations. Holographic Reduced Representations [321, 326] is an influential HDC/VSA model that is well known in the machine learning domain and often used to refer to the whole family; however, for the sake of consistency, we use HDC/VSA to refer to the field. Part I of this survey [222] covered foundational aspects of the field, such as the historical context leading to the development of HDC/VSA, key elements of any HDC/VSA model, known HDC/VSA models, and the transformation of input data of various types into high-dimensional vectors suitable for HDC/VSA. This second part surveys existing applications, the role of HDC/VSA in cognitive computing and architectures, and directions for future work. Most of the applications lie within the Machine Learning/Artificial Intelligence domain; however, we also cover other applications to provide a complete picture. The survey is written to be useful for both newcomers and practitioners.
4.
  • Kleyko, Denis, 1990-, et al. (authors)
  • Classification and Recall With Binary Hyperdimensional Computing: Tradeoffs in Choice of Density and Mapping Characteristics
  • 2018
  • In: IEEE Transactions on Neural Networks and Learning Systems. IEEE. ISSN 2162-237X, 2162-2388. 29:12, pp. 5880-5898
  • Journal article (peer-reviewed), abstract:
    • Hyperdimensional (HD) computing is a promising paradigm for future intelligent electronic appliances operating at low power. This paper discusses tradeoffs in selecting the parameters of binary HD representations when applied to pattern recognition tasks. Particular design choices include the density of representations and the strategy for mapping data from the original representation. It is demonstrated that for the considered pattern recognition tasks (using synthetic and real-world data), sparse and dense representations behave nearly identically. The paper also discusses implementation peculiarities that may favor one type of representation over the other. Finally, the capacity of representations of various densities is discussed.
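A minimal sketch of the design choice the paper studies: dense binary hypervectors (about 50% ones, compared by Hamming similarity) versus sparse ones (a few percent ones, compared by overlap). The dimension, density, and helper names are illustrative assumptions.

```python
# Dense vs. sparse binary hypervectors and their usual similarity measures.
import numpy as np

rng = np.random.default_rng(2)
D = 10_000

dense = rng.integers(0, 2, size=D)              # ~50% ones (dense binary)
p = 0.02
sparse = (rng.random(D) < p).astype(int)        # ~2% ones (sparse binary)

def hamming_sim(a, b):                          # for dense binary vectors
    return 1.0 - np.mean(a != b)

def overlap(a, b):                              # for sparse binary vectors
    return int(np.sum(a & b))

print(dense.mean(), sparse.mean())              # ~0.5 vs ~0.02
print(hamming_sim(dense, rng.integers(0, 2, size=D)))     # ~0.5 for an unrelated pair
print(overlap(sparse, (rng.random(D) < p).astype(int)))   # ~D*p*p = 4 for an unrelated pair
```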
5.
  • Kleyko, Denis, et al. (authors)
  • Modification of Holographic Graph Neuron using Sparse Distributed Representations
  • 2016
  • In: Procedia Computer Science. Elsevier. ISSN 1877-0509. 88, pp. 39-45
  • Journal article (peer-reviewed), abstract:
    • This article presents a modification of the recently proposed Holographic Graph Neuron approach for memorizing patterns of generic sensor stimuli. The original approach represents patterns as dense binary vectors, where zeros and ones are equiprobable. The presented modification employs sparse binary distributed representations, in which ones are much less frequent than zeros. Sparse representations are more biologically plausible because the activities of real neurons are sparse. The performance of the two approaches was compared for different dimensionalities.
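A hedged sketch of the general idea, not the paper's implementation: sparse binary hypervectors with a fixed small number of ones, bound to channel positions by cyclic shifts and superimposed by OR. The (channel, value) scheme and all parameters are illustrative assumptions.

```python
# Sparse binary hypervectors (M ones out of D) for (channel, value) patterns:
# a value vector is shifted by a channel-specific offset (binding by
# permutation), and the pattern is the OR of all channel-value pairs.
import numpy as np

rng = np.random.default_rng(3)
D, M = 10_000, 100                    # dimension and number of ones (1% density)

def sparse_hv():
    v = np.zeros(D, dtype=int)
    v[rng.choice(D, size=M, replace=False)] = 1
    return v

values = {lvl: sparse_hv() for lvl in ("low", "mid", "high")}

def encode(pattern):                  # pattern: value level per channel index
    parts = [np.roll(values[lvl], ch) for ch, lvl in enumerate(pattern)]
    return np.bitwise_or.reduce(parts)

a = encode(["low", "mid", "high", "mid"])
b = encode(["low", "mid", "high", "low"])     # differs in one channel
c = encode(["high", "low", "mid", "high"])    # differs in every channel
print(np.sum(a & b), np.sum(a & c))           # ~3*M shared vs. chance level
```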
6.
  • Kleyko, Denis, et al. (authors)
  • Vector Symbolic Architectures as a Computing Framework for Emerging Hardware
  • 2022
  • In: Proceedings of the IEEE. Institute of Electrical and Electronics Engineers Inc. ISSN 0018-9219, 1558-2256. 110:10, pp. 1538-1571
  • Journal article (peer-reviewed), abstract:
    • This article reviews recent progress in the development of the computing framework vector symbolic architectures (VSA), also known as hyperdimensional computing. This framework is well suited for implementation in stochastic, emerging hardware, and it naturally expresses the types of cognitive operations required for artificial intelligence (AI). We demonstrate in this article that the field-like algebraic structure of VSA offers simple but powerful operations on high-dimensional vectors that can support all data structures and manipulations relevant to modern computing. In addition, we illustrate the distinguishing feature of VSA, 'computing in superposition,' which sets it apart from conventional computing and opens the door to efficient solutions of the difficult combinatorial search problems inherent in AI applications. We sketch ways of demonstrating that VSA are computationally universal. We see them acting as a framework for computing with distributed representations that can play the role of an abstraction layer for emerging computing hardware. This article serves as a reference for computer architects by illustrating the philosophy behind VSA, techniques of distributed computing with them, and their relevance to emerging computing hardware, such as neuromorphic computing.
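A minimal sketch of 'computing in superposition' as described above: a single hypervector holds several key-value bindings, and one unbinding operation queries the whole set at once. Bipolar vectors, the dimension, and the names are illustrative assumptions.

```python
# 'Computing in superposition': one hypervector stores five key-value
# bindings; unbinding a key retrieves its value plus crosstalk noise.
import numpy as np

rng = np.random.default_rng(4)
D = 10_000
hv = lambda: rng.choice([-1, 1], size=D)

keys = [hv() for _ in range(5)]
vals = [hv() for _ in range(5)]
memory = np.sum([k * v for k, v in zip(keys, vals)], axis=0)   # superposition

query = memory * keys[2]              # unbind key 2 from the whole set at once
sims = [query @ v / D for v in vals]  # value 2 stands out above the crosstalk
print(np.argmax(sims), [round(s, 2) for s in sims])   # -> 2, one sim ~1, rest ~0
```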
7.
  • Rachkovskij, Dmitri A., et al. (authors)
  • Recursive Binding for Similarity-Preserving Hypervector Representations of Sequences
  • 2022
  • In: 2022 International Joint Conference on Neural Networks (IJCNN). IEEE.
  • Conference paper (peer-reviewed), abstract:
    • Hyperdimensional computing (HDC), also known as vector symbolic architectures (VSA), is a computing framework used within artificial intelligence and cognitive computing that operates with distributed vector representations of large fixed dimensionality. A critical step in designing HDC/VSA solutions is to obtain such representations from the input data. Here, we focus on the widespread data type of sequences and propose their transformation to distributed representations that both preserve the similarity of identical sequence elements at nearby positions and are equivariant with respect to sequence shift. These properties are enabled by forming the representations of sequence positions using recursive binding and superposition operations. The proposed transformation was experimentally investigated with symbolic strings used for modeling human perception of word similarity. The obtained results are on a par with those of more sophisticated approaches from the literature. The proposed transformation was designed for the HDC/VSA model known as Fourier Holographic Reduced Representations, but it can be adapted to some other HDC/VSA models.
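A minimal sketch in the FHRR setting named in the abstract, where hypervectors are complex phasors and binding is elementwise multiplication. Position codes are formed by recursive binding of a step vector; the small-phase construction that keeps nearby positions similar is an illustrative assumption, not necessarily the paper's exact scheme.

```python
# FHRR-style sketch: hypervectors are complex phasors, binding is elementwise
# multiplication. Position i is the i-fold binding of a 'step' vector whose
# small random phases make nearby positions similar (illustrative assumption).
import numpy as np

rng = np.random.default_rng(5)
D = 10_000
step = np.exp(1j * rng.uniform(-0.1, 0.1, size=D))     # small-phase step vector

def pos(i):                       # recursive binding: pos(i) = pos(i-1) * step
    return step ** i

symbols = {c: np.exp(1j * rng.uniform(-np.pi, np.pi, size=D)) for c in "abcde"}

def encode(s):                    # superpose symbol-(x)-position bindings
    return np.sum([symbols[c] * pos(i) for i, c in enumerate(s)], axis=0)

def sim(x, y):                    # cosine similarity (real part)
    return np.real(np.vdot(x, y)) / (np.linalg.norm(x) * np.linalg.norm(y))

print(round(sim(encode("abcd"), encode("abcd")), 2))   # 1.0
print(round(sim(encode("abcd"), encode("eabc")), 2))   # ~0.75: same symbols,
                                                       # shifted by one position
print(round(sim(encode("abcd"), encode("eeee")), 2))   # ~0: nothing shared
```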
8.
  • Rachkovskij, Dmitri A. (author)
  • Representation of spatial objects by shift-equivariant similarity-preserving hypervectors
  • 2022
  • In: Neural Computing & Applications. Springer Nature. ISSN 0941-0643, 1433-3058. 34:24, pp. 22387-22403
  • Journal article (peer-reviewed), abstract:
    • Hyperdimensional Computing (HDC), also known as Vector-Symbolic Architectures (VSA), is an approach that has been proposed to combine the advantages of distributed vector representations and symbolic structured data representations in Artificial Intelligence, Machine Learning, and Pattern Recognition problems. HDC/VSA operate with hypervectors, i.e., brain-like distributed representations of large fixed dimension. The key problem of HDC/VSA is how to transform data of various types into hypervectors. In this paper, we propose a novel approach for forming hypervectors of spatial objects, such as images, that both provides equivariance with respect to the shift of objects and preserves the similarity of objects described by similar features at nearby positions. In contrast to known hypervector formation methods, we represent the features by compositional hypervectors and exploit permutations of hypervectors to represent the positions of features. We experimentally explored the proposed approach in tasks that exploit various descriptions of two-dimensional (2D) images. In terms of standard accuracy measures such as error rate or mean average precision, our results are on a par with or better than those of other methods and are obtained without feature learning. The proposed techniques were designed for the HDC/VSA model known as Sparse Binary Distributed Representations, but they can be adapted to hypervectors in the formats of other HDC/VSA models, as well as to representing spatial objects other than 2D images.
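A minimal sketch of permutation-based position encoding in the spirit of the abstract: sparse binary feature hypervectors are laid out as 2D arrays so that a feature's (x, y) position is represented by cyclic shifts along both axes, superimposed by OR. The layout, density, and feature set are illustrative assumptions.

```python
# Sparse binary hypervectors laid out as 2D arrays: a feature at (x, y) is
# represented by cyclically shifting its hypervector along both axes, and an
# image is the OR of its shifted feature hypervectors.
import numpy as np

rng = np.random.default_rng(6)
SHAPE = (100, 100)                   # hypervector as a 2D array (10,000 bits)
p = 0.02                             # density of ones

features = {f: (rng.random(SHAPE) < p).astype(int) for f in ("edge", "blob")}

def encode(points):                  # points: list of (feature, x, y)
    parts = [np.roll(features[f], (x, y), axis=(0, 1)) for f, x, y in points]
    return np.bitwise_or.reduce(parts)

img = encode([("edge", 3, 5), ("blob", 40, 7)])
shifted = encode([("edge", 4, 6), ("blob", 41, 8)])   # same scene moved by (1, 1)

# Shift equivariance: shifting every feature shifts the whole hypervector.
print(np.array_equal(np.roll(img, (1, 1), axis=(0, 1)), shifted))   # True
```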
9.
  • Rachkovskij, Dmitri A. (author)
  • Shift-Equivariant Similarity-Preserving Hypervector Representations of Sequences
  • 2024
  • In: Cognitive Computation. Springer. ISSN 1866-9956, 1866-9964. 16, pp. 909-923
  • Journal article (peer-reviewed), abstract:
    • Hyperdimensional Computing (HDC), also known as Vector-Symbolic Architectures (VSA), is a promising framework for the development of cognitive architectures and artificial intelligence systems, as well as for technical applications and emerging neuromorphic and nanoscale hardware. HDC/VSA operate with hypervectors, i.e., neural-like distributed vector representations of large fixed dimension (usually > 1000). A key ingredient of HDC/VSA is the set of methods for encoding various data types (from numeric scalars and vectors to graphs) by hypervectors. In this paper, we propose an approach for forming hypervectors of sequences that both provides equivariance with respect to the shift of sequences and preserves the similarity of sequences with identical elements at nearby positions. Our methods represent the sequence elements by compositional hypervectors and exploit permutations of hypervectors to represent the order of sequence elements. We experimentally explored the proposed representations on a diverse set of tasks with data in the form of symbolic strings. Although no features were used here (the hypervector of a sequence was formed just from the hypervectors of its symbols at their positions), the proposed approach demonstrated performance on a par with methods that exploit various features, such as subsequences. The proposed techniques were designed for the HDC/VSA model known as Sparse Binary Distributed Representations, but they can be adapted to hypervectors in the formats of other HDC/VSA models, as well as to representing sequences of types other than symbolic strings. Directions for further research are discussed.
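A minimal sketch of the permutation idea for sequences: each symbol's sparse binary hypervector is cyclically shifted by its position and the results are superimposed by OR. This illustrates shift equivariance and similarity for shared symbol-position pairs; it is a simplification, not the paper's exact method (which also preserves similarity at nearby, not only identical, positions).

```python
# Each symbol's sparse binary hypervector is cyclically shifted by its
# position; the sequence hypervector is the OR of the shifted vectors.
import numpy as np

rng = np.random.default_rng(7)
D, p = 10_000, 0.02
symbols = {c: (rng.random(D) < p).astype(int) for c in "abcde"}

def encode(s, offset=0):             # offset shifts the whole sequence
    parts = [np.roll(symbols[c], i + offset) for i, c in enumerate(s)]
    return np.bitwise_or.reduce(parts)

overlap = lambda a, b: int(np.sum(a & b))

# Three shared symbol-position pairs give a large overlap; a reversal shares
# no pairs and overlaps only by chance.
print(overlap(encode("abcd"), encode("abed")))    # high (hundreds of ones)
print(overlap(encode("abcd"), encode("dcba")))    # low (chance level)

# Shift equivariance: encoding at offset 1 is a cyclic shift of the original.
print(np.array_equal(np.roll(encode("abcd"), 1), encode("abcd", offset=1)))  # True
```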