NeST

NeST [DYJ19] targets sparse networks where connections and neurons are added incrementally. Motivated by Hebbian theory, NeST decides where to add a neuron by checking, for each pair of a neuron \(i\) in layer \(l-2\) and a neuron \(j\) in layer \(l\), how strongly the activation of \(i\) correlates with the back-propagated gradient at \(j\). A new neuron is then added in layer \(l-1\) with a single non-zero fan-in weight (from \(i\)) and a single non-zero fan-out weight (to \(j\)). NeST uses the same mechanism to densify layers, except that it considers the interaction between two successive layers, e.g. \(l-1\) and \(l\).
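A minimal NumPy sketch of this growth rule, assuming access to the batch of activations at layer \(l-2\) and the back-propagated gradients at layer \(l\); the function name and the square-root weight initialisation are illustrative choices, not the paper's exact scheme:

```python
import numpy as np

def grow_neuron(acts_lm2, grads_l):
    """Hebbian-style neuron growth in the spirit of NeST (sketch).

    acts_lm2: (batch, n_in)  activations of layer l-2
    grads_l:  (batch, n_out) back-propagated gradients at layer l
    Returns the bridged pair (i, j) and the new fan-in/fan-out weights.
    """
    batch = acts_lm2.shape[0]
    # Correlation-like score for every (i, j) pair: batch average of
    # activation(i) * gradient(j). A large |score| marks a correlated pair.
    scores = acts_lm2.T @ grads_l / batch            # shape (n_in, n_out)
    i, j = np.unravel_index(np.argmax(np.abs(scores)), scores.shape)
    # The new neuron in layer l-1 gets one non-zero fan-in weight (from i)
    # and one non-zero fan-out weight (to j). Splitting the score with a
    # square root so that fan_in * fan_out == score is an assumption made
    # here for illustration; the paper derives its own initialisation.
    s = scores[i, j]
    fan_in = np.sqrt(np.abs(s))
    fan_out = np.sign(s) * np.sqrt(np.abs(s))
    return (int(i), int(j)), fan_in, fan_out
```

For connection growth between two successive layers, the same score matrix would be computed from layer \(l-1\) activations and layer \(l\) gradients, and the top-scoring entries would become new individual weights rather than a new neuron.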

References

[DYJ19]

Xiaoliang Dai, Hongxu Yin, and Niraj K. Jha. NeST: A Neural Network Synthesis Tool Based on a Grow-and-Prune Paradigm. IEEE Transactions on Computers, 68(10):1487–1497, October 2019. doi:10.1109/TC.2019.2914438.