Tensor decompositions of higher-order correlations by nonlinear Hebbian learning
Date
2021-12-06
Authors
Ocker, Gabriel Koch
Buice, Michael A.
Version
OA Version
Citation
G. K. Ocker and M. A. Buice. 2021. "Tensor decompositions of higher-order correlations by nonlinear Hebbian learning." Neural Information Processing Systems.
Abstract
Biological synaptic plasticity exhibits nonlinearities that are not accounted for by
classic Hebbian learning rules. Here, we introduce a simple family of generalized
nonlinear Hebbian learning rules. We study the computations implemented by their dynamics in the basic setting of a neuron receiving feedforward inputs. These
nonlinear Hebbian rules allow a neuron to learn tensor decompositions of its higher-order
input correlations. The particular input correlation decomposed and the form
of the decomposition depend on the location of nonlinearities in the plasticity
rule. For simple, biologically motivated parameters, the neuron learns eigenvectors
of higher-order input correlation tensors. We prove that tensor eigenvectors are
attractors and determine their basins of attraction. We calculate the volume of those
basins, showing that the dominant eigenvector has the largest basin of attraction.
We then study arbitrary learning rules and find that any rule admitting a finite Taylor expansion in the neural input and output also has stable equilibria at generalized eigenvectors of higher-order input correlation tensors. Nonlinearities
in synaptic plasticity thus allow a neuron to encode higher-order input correlations
in a simple fashion.
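
As a concrete illustration of the dynamics described above, here is a minimal sketch assuming a rule of the form ΔJ_i ∝ x_i n^b for a linear neuron n = Σ_j J_j x_j with multiplicative weight normalization; the input model, the exponent b, and all parameter values below are illustrative assumptions, not the paper's simulation setup. Averaged over inputs, this update contracts the (b+1)-order input correlation tensor b times with the weight vector, so the normalized dynamics act as a higher-order power iteration whose fixed points satisfy the tensor eigenvector condition μ(I, J, …, J) = λJ.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 5            # number of feedforward inputs
b = 2            # output exponent in the assumed rule; b = 1 is the classic linear case
eta = 1e-4       # learning rate (illustrative)
steps = 300_000  # number of input presentations

# Non-Gaussian, nonnegative inputs: mixtures of exponential sources, so that
# third-order input correlations are nonzero (for zero-mean Gaussian inputs
# they vanish and the averaged b = 2 update would be zero).
A = rng.uniform(0.0, 1.0, size=(K, K))
def sample_x():
    return A @ rng.exponential(size=K)

# Online nonlinear Hebbian learning with a norm constraint.
J = rng.normal(size=K)
J /= np.linalg.norm(J)
for _ in range(steps):
    x = sample_x()
    n = J @ x                   # linear neuron output
    J += eta * x * n**b         # assumed nonlinear Hebbian update
    J /= np.linalg.norm(J)      # multiplicative normalization

# Tensor eigenvector check: the empirical (b+1)-order correlation tensor mu,
# contracted b times with J, should be nearly parallel to J:
# mu(I, J, ..., J) ~ lambda * J.
X = np.stack([sample_x() for _ in range(100_000)])
muJ = np.mean(X * (X @ J)[:, None] ** b, axis=0)
lam = muJ @ J
print(f"lambda = {lam:.3f}, relative residual = "
      f"{np.linalg.norm(muJ - lam * J) / abs(lam):.3f}")
```

With b = 1 the same loop reduces to Oja-style normalized Hebbian learning and finds the leading covariance eigenvector, so the exponent b is the single knob selecting which order of input correlation the neuron encodes.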