Methods for pruning deep neural networks

Vadera, S and Ameen, SA 2022, 'Methods for pruning deep neural networks', IEEE Access, 63280-63300.

PDF - Published Version
Available under License Creative Commons Attribution 4.0.



This paper presents a survey of methods for pruning deep neural networks. It begins by categorising over 150 studies based on the underlying approach used and then focuses on three categories: methods that use magnitude-based pruning, methods that utilise clustering to identify redundancy, and methods that use sensitivity analysis to assess the effect of pruning. Some of the key influential studies within these categories are presented to highlight the underlying approaches and results achieved. Results are distributed across the literature, published as new architectures, algorithms and data sets have developed over time, making comparison across different studies difficult. The paper therefore provides a resource for the community that can be used to quickly compare the results from many different methods on a variety of data sets and a range of architectures, including AlexNet, ResNet, DenseNet and VGG. The resource is illustrated by comparing the results published for pruning AlexNet and ResNet50 on ImageNet, and ResNet56 and VGG16 on the CIFAR10 data, to reveal which pruning methods work well in terms of retaining accuracy whilst achieving good compression rates. The paper concludes by identifying some research gaps and promising directions for future research.
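The first of the three categories, magnitude-based pruning, rests on a simple idea: weights with small absolute values contribute least to the output and can be zeroed. A minimal NumPy sketch of that idea follows; the function name and interface are illustrative and not drawn from the paper or any specific method it surveys:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero the fraction `sparsity` of entries with the smallest
    absolute value -- the core step of magnitude-based pruning.
    (Illustrative sketch, not a method from the survey.)"""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

# Example: prune 50% of a small weight matrix
w = np.array([[0.1, -0.8], [0.05, 1.2]])
pruned = magnitude_prune(w, 0.5)  # zeros the two smallest-magnitude entries
```

In practice, surveyed methods differ in where this thresholding is applied (per layer or globally), whether pruning is one-shot or iterative with retraining, and whether individual weights or whole filters are removed.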

Item Type: Article
Schools: Schools > School of Computing, Science and Engineering > Salford Innovation Research Centre
Journal or Publication Title: IEEE Access
Publisher: IEEE
ISSN: 2169-3536
Depositing User: USIR Admin
Date Deposited: 07 Jun 2022 08:01
Last Modified: 22 Aug 2022 13:00
