- Skeletonization: A Technique for Trimming the Fat from a Network via Relevance Assessment
- A Simple Procedure for Pruning Back-Propagation Trained Neural Networks
- Learning Sparse Neural Networks through L_0 Regularization
- Generalized Dropout
- Variational Dropout Sparsifies Deep Neural Networks
- On the Importance of Single Directions for Generalization
- Ablation of a Robot's Brain: Neural Networks Under a Knife
- How Important Is a Neuron?
- SNIP: Single-Shot Network Pruning Based on Connection Sensitivity