References:
[1] Skeletonization: A Technique for Trimming the Fat from a Network via Relevance Assessment.
[2] A Simple Procedure for Pruning Back-Propagation Trained Neural Networks.
[3] Learning Sparse Neural Networks through L_0 Regularization.
[4] Generalized Dropout.
[5] Variational Dropout Sparsifies Deep Neural Networks.
[6] On the Importance of Single Directions for Generalization.
[7] Ablation of a Robot's Brain: Neural Networks Under a Knife.
[8] SNIP: Single-Shot Network Pruning Based on Connection Sensitivity.
[9] How Important Is a Neuron?