12th International Conference on Computer and Knowledge Engineering
GAP: Fault tolerance Improvement of Convolutional Neural Networks through GAN-aided Pruning
Authors:
Pouya Hosseinzadeh (1), Yasser Sedaghat (2), Ahad Harati (3)
1, 2, 3: Department of Computer Engineering, Ferdowsi University of Mashhad, Mashhad, Iran
Keywords:
CNN, fault tolerance, auxiliary pruning task, generalization, GAN, N-version programming, ensemble learning, bit-flip fault
Abstract:
The functionality and accuracy degradation caused by bit-flip (BF) faults in CNN weights makes it vital to evaluate a neural network's behavior in the presence of faults during the design phase, before such models are deployed, especially in safety-critical applications such as autonomous vehicles. In this paper, we propose a novel approach that improves the fault tolerance of convolutional neural networks by pruning their parameters according to their gradients instead of the more commonly used norm values. In particular, the gradient of the network's parameters on an auxiliary classification task based on a Generative Adversarial Network (GAN) is exploited to identify parameters with lower generalization. Because pruning makes a model more efficient, it becomes practical to use an ensemble of pruned models. We therefore evaluate a naive ensemble against an ensemble of three differently pruned models, an idea drawn from the N-version Programming (NVP) concept in the dependability literature. To assess the resiliency of our models, we conduct comprehensive random bit-flip fault injection experiments. Compared with ordinary pruning techniques, our approach improves classification accuracy by 20% in the presence of a single fault in the network. Building an NVP ensemble of three GAN-aided pruned VGG-16 networks yields a further 10% accuracy improvement in the presence of a single random fault in each base learner on the CIFAR-10 dataset.
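The random bit-flip fault model used in the evaluation can be illustrated with a minimal NumPy sketch: a single bit in the IEEE-754 binary32 representation of one randomly chosen weight is inverted. This is an illustrative assumption of how such injection is commonly done, not the authors' actual evaluation harness; the function names `flip_bit` and `inject_random_fault` are hypothetical.

```python
import random
import struct

import numpy as np


def flip_bit(value: np.float32, bit: int) -> np.float32:
    """Flip one bit (0..31) in the IEEE-754 binary32 encoding of `value`."""
    as_int = struct.unpack("<I", struct.pack("<f", value))[0]
    flipped = as_int ^ (1 << bit)
    return np.float32(struct.unpack("<f", struct.pack("<I", flipped))[0])


def inject_random_fault(weights: np.ndarray, rng: random.Random) -> np.ndarray:
    """Return a copy of a float32 weight tensor with one random bit-flip fault."""
    faulty = weights.copy()
    flat = faulty.ravel()            # view into the copy, so the write sticks
    idx = rng.randrange(flat.size)   # pick a random weight ...
    bit = rng.randrange(32)          # ... and a random bit position in it
    flat[idx] = flip_bit(flat[idx], bit)
    return faulty
```

A fault-injection campaign of the kind described would repeat this over many trials (and, for the NVP ensemble, once per base learner) and measure the drop in classification accuracy; flips in the exponent bits typically cause the largest deviations.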
Papers List
List of archived papers
Word-level Persian Lipreading Dataset
Javad Peymanfard - Ali Lashini - Samin Heydarian - Hossein Zeinali - Nasser Mozayani
Optimizing Question-Answering Framework Through Integration of Text Summarization Model and Third-Generation Generative Pre-Trained Transformer
Ervin Gubin Moung - Toh Sin Tong - Maisarah Mohd Sufian - Valentino Liaw - Ali Farzamnia - Farashazillah Yahya
Lempel-Ziv-based Hyper-Heuristic Solution for Longest Common Subsequence Problem
Mahdi Nasrollahi - Reza Shami Tanha - Mohsen Hooshmand
CSI-Based Human Activity Recognition using Convolutional Neural Networks
Parisa Fard Moshiri - Mohammad Nabati - Reza Shahbazian - Seyed Ali Ghorashi
Pruning and Mixed Precision Techniques for Accelerating Neural Network
Mahsa Zahedi - Mohammad Sediq Abazari Bozhgani - Abdorreza Savadi
Standardized ReACT Logits: An Effective Approach for Anomaly Segmentation in Self-driving Cars
Mahdi Farhadi - Seyede Mahya Hazavei - Shahriar Baradaran Shokouhi
An Ensemble CNN for Brain Age Estimation based on Hippocampal Region Applicable to Alzheimer's Diagnosis
Zahra Qodrati - Seyedeh Masoumeh Taji - Habibollah Danyali - Kamran Kazemi
Efficient Prediction of Cardiovascular Disease via Extra Tree Feature Selection
Mina Abroodi - Mohammad Reza Keyvanpour - Ghazaleh Kakavand Teimoory
Intensity-Image Reconstruction Using Event Camera Data by Changing in LSTM Update
Arezoo Rahmati Soltangholi - Ahad Harati - Abedin Vahedian
Analysis of Address Lifespans in Bitcoin and Ethereum
Amir Mohammad Karimi Mamaghan - Amin Setayesh - Behnam Bahrak