• Media type: Electronic conference proceedings; e-article; other publication
  • Title: Random Sketching, Clustering, and Short-Term Memory in Spiking Neural Networks
  • Contributors: Hitron, Yael [author]; Lynch, Nancy [author]; Musco, Cameron [author]; Parter, Merav [author]
  • Published: Schloss Dagstuhl – Leibniz-Zentrum für Informatik, 2020
  • Language: English
  • DOI: https://doi.org/10.4230/LIPIcs.ITCS.2020.23
  • Keywords: biological distributed computing; random projection; dimensionality reduction; spiking neural networks; clustering; compressed sensing; winner-take-all
  • Notes: This data source also contains holdings records that do not lead to a full text.
  • Description: We study input compression in a biologically inspired model of neural computation. We demonstrate that a network consisting of a random projection step (implemented via random synaptic connectivity) followed by a sparsification step (implemented via winner-take-all competition) can reduce well-separated high-dimensional input vectors to well-separated low-dimensional vectors. By augmenting our network with a third module, we can efficiently map each input (along with any small perturbations of the input) to a unique representative neuron, solving a neural clustering problem. Both the size of our network and its processing time, i.e., the time it takes the network to compute the compressed output given a presented input, are independent of the (potentially large) dimension of the input patterns and depend only on the number of distinct inputs that the network must encode and the pairwise relative Hamming distance between these inputs. The first two steps of our construction mirror known biological networks, for example in the fruit fly olfactory system [Caron et al., 2013; Lin et al., 2014; Dasgupta et al., 2017]. Our analysis helps provide a theoretical understanding of these networks and lays a foundation for how random compression and input memorization may be implemented in biological neural networks. Technically, a key contribution of our network design is the implementation of a short-term memory. Our network can be given a desired memory time t_m as an input parameter and satisfies the following with high probability: any pattern presented several times within a time window of t_m rounds will be mapped to a single representative output neuron. However, a pattern not presented for c⋅t_m rounds, for some constant c > 1, will be "forgotten", and its representative output neuron will be released to accommodate newly introduced patterns. (A minimal illustrative sketch of the compression step appears after this record.)
  • Access status: Open access
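
The following is a minimal Python/NumPy sketch of the first two steps described above (a random projection followed by winner-take-all sparsification), added purely for illustration. It is not the authors' spiking-network construction; the dimensions n and m, the projection sparsity, and the number of winners k are assumptions chosen for a toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

def compress(x, proj, k):
    """Compress a binary input vector.

    x    : binary input vector of dimension n
    proj : random 0/1 projection matrix of shape (m, n), m << n
    k    : number of output neurons kept active (winner-take-all)
    """
    potentials = proj @ x                      # random projection step
    winners = np.argsort(potentials)[-k:]      # keep the k largest potentials
    y = np.zeros(proj.shape[0], dtype=int)
    y[winners] = 1                             # sparse compressed code
    return y

# Toy example: two random high-dimensional binary patterns (hypothetical sizes).
n, m, k = 1000, 64, 8
proj = (rng.random((m, n)) < 0.05).astype(int)   # sparse random "synaptic" connectivity
a = (rng.random(n) < 0.5).astype(int)
b = (rng.random(n) < 0.5).astype(int)
print(np.sum(compress(a, proj, k) != compress(b, proj, k)))  # distinct patterns yield distinct codes
```

Keeping only the k largest projected potentials plays the role of the winner-take-all competition: it sparsifies the low-dimensional code while preserving separation between well-separated inputs. The clustering and short-term-memory modules described in the abstract are not modeled in this sketch.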