Unsupervised pre-trained filter learning approach for efficient convolution neural network

Rehman, S, Tu, S (ORCID: https://orcid.org/0000-0002-4449-1708), Waqas, M, Huang, Y, Rehman, O, Ahmad, B and Ahmad, S 2019, 'Unsupervised pre-trained filter learning approach for efficient convolution neural network', Neurocomputing, 365, pp. 171-190.



The concept of the Convolution Neural Network (ConvNet or CNN) evolved from the animal visual cortex. Just as humans learn through experience, a ConvNet adjusts its weights through backpropagation to accomplish the desired output. In this paper, we provide a comprehensive survey of the relationship between ConvNets and different pre-trained learning methodologies, together with their optimization effects. These hybrid networks advance the state-of-the-art algorithms in the recognition, classification, and detection of images, speech, text, and video. Furthermore, some task-specific applications of ConvNets in computer vision are introduced. To validate the survey, we also perform experiments on a public face and skin detection dataset to provide an authentic solution. The experimental results on the benchmark dataset highlight the merit of efficient pre-trained learning algorithms for an optimized ConvNet. To motivate follow-up research, we identify open problems and present future directions with regard to optimized ConvNet system design parameters and unsupervised learning.
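To illustrate the kind of unsupervised pre-trained filter learning the abstract describes, the following is a minimal sketch, not the paper's actual method: it learns convolution filters as k-means centroids of mean-removed image patches (in the spirit of classic single-layer filter-learning pipelines), using synthetic random images as a stand-in dataset. All names, sizes, and the dataset here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic grayscale "images" stand in for a real dataset (assumption).
images = rng.random((20, 16, 16))

def extract_patches(imgs, k=5, n=500):
    """Sample n random k x k patches across the image set."""
    patches = []
    for _ in range(n):
        img = imgs[rng.integers(len(imgs))]
        y = rng.integers(img.shape[0] - k + 1)
        x = rng.integers(img.shape[1] - k + 1)
        p = img[y:y + k, x:x + k].ravel()
        p = p - p.mean()  # remove each patch's DC component
        patches.append(p)
    return np.stack(patches)

def kmeans_filters(patches, n_filters=8, iters=10):
    """Learn filters as k-means centroids of the patch set."""
    centroids = patches[rng.choice(len(patches), n_filters, replace=False)]
    for _ in range(iters):
        # Assign each patch to its nearest centroid.
        d = ((patches[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(n_filters):
            members = patches[labels == j]
            if len(members):
                centroids[j] = members.mean(0)
    # Unit-normalize so each centroid can act as a convolution filter.
    norms = np.linalg.norm(centroids, axis=1, keepdims=True)
    return centroids / np.maximum(norms, 1e-8)

patches = extract_patches(images)
filters = kmeans_filters(patches).reshape(-1, 5, 5)
print(filters.shape)  # → (8, 5, 5)
```

The learned filter bank would then initialize (or replace) the first convolutional layer of a ConvNet, so that backpropagation starts from data-adapted features rather than random weights.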

Item Type: Article
Schools: Schools > School of Computing, Science and Engineering
Journal or Publication Title: Neurocomputing
Publisher: Elsevier
ISSN: 0925-2312
Funders: National Natural Science Foundation of China
Depositing User: S Rehman
Date Deposited: 31 Aug 2022 10:12
Last Modified: 31 Aug 2022 10:15
URI: https://usir.salford.ac.uk/id/eprint/64585


