13th International Conference on Computer and Knowledge Engineering
Enhancing Lighter Neural Network Performance with Layer-wise Knowledge Distillation and Selective Pixel Attention
Authors:
Siavash Zaravashan (1), Sajjad Torabi (2), Hesam Zaravashan (3)
1- Iran University of Science and Technology
2- Part AI Research Center
3- Iran University of Science and Technology
Keywords:
Knowledge distillation, Image classification, Deep learning, Model compression, Computer vision
Abstract:
Deep neural networks have revolutionized many areas of artificial intelligence, such as image classification and object detection. However, most of these models are large and slow at inference time, so they must be made lighter for real-time applications, which can degrade their performance. Knowledge distillation is one approach to improving smaller models: a smaller student model learns to mimic a larger, more capable teacher model. This technique has proven effective for compressing models while increasing their accuracy. Previous methods mostly propose feature transformations and loss functions between teacher and student features at the same level. In this paper, we investigate connections between different layers of the teacher and student networks and show that they are highly important. We employ three loss functions (attention-driven distillation, hierarchical context loss, and non-local loss) to improve the classifier's overall performance. Extensive experiments on the CIFAR-100 dataset show that the proposed method delivers a substantial performance improvement. For the benefit of the research community, we will make the code for this study available on GitHub.
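
The abstract does not give the exact formulations of the three losses or the layer-pairing scheme, so the sketch below is only a generic illustration of attention-based, layer-wise knowledge distillation in PyTorch, not the authors' implementation. The attention_map definition, the choice of paired layers, and the hyperparameters alpha, beta, and temperature are illustrative assumptions.

import torch
import torch.nn.functional as F

def attention_map(feature):
    # Collapse the channel dimension of a (B, C, H, W) feature map into a
    # spatial attention map (sum of squared activations), then flatten and
    # L2-normalize so maps from differently sized networks are comparable.
    attn = feature.pow(2).sum(dim=1).flatten(1)
    return F.normalize(attn, dim=1)

def attention_distillation_loss(student_feats, teacher_feats):
    # Compare attention maps of paired student/teacher layers. If spatial
    # resolutions differ, resize the student map to match the teacher's.
    loss = 0.0
    for s, t in zip(student_feats, teacher_feats):
        if s.shape[-2:] != t.shape[-2:]:
            s = F.interpolate(s, size=t.shape[-2:], mode='bilinear',
                              align_corners=False)
        loss = loss + F.mse_loss(attention_map(s), attention_map(t))
    return loss

def distillation_step(student_logits, teacher_logits, labels,
                      student_feats, teacher_feats,
                      alpha=0.5, beta=100.0, temperature=4.0):
    # A standard combined distillation objective: cross-entropy on ground
    # truth, KL divergence on temperature-softened logits, plus the
    # feature-level attention term. Weights here are placeholders.
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction='batchmean') * temperature ** 2
    feat = attention_distillation_loss(student_feats, teacher_feats)
    return (1 - alpha) * ce + alpha * kd + beta * feat

# Example usage with random tensors standing in for real network outputs.
if __name__ == "__main__":
    labels = torch.randint(0, 100, (8,))
    s_logits, t_logits = torch.randn(8, 100), torch.randn(8, 100)
    s_feats = [torch.randn(8, 64, 8, 8)]
    t_feats = [torch.randn(8, 256, 8, 8)]
    print(distillation_step(s_logits, t_logits, labels, s_feats, t_feats))
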
Papers List
List of archived papers
TCAR: Thermal and Congestion-Aware Routing Algorithm in a Partially Connected 3D Network on Chip
Majid Nezarat - Masoomeh Momeni
Enhanced Principal-curve based Classifiers for Time-series Label Prediction
Seyed Aref Hakimzadeh - Koorush Ziarati
SingAll: Scalable Control Flow Checking for Multi-Process Embedded Systems
Mehdi Amininasab - Ahmad Patooghy - Mahdi Fazeli
Introducing E4MT and LMBNC: Persian pre-processing utilities
Zakieh Shakeri - Mehran Ziabary - Behrooz Vedadian - Fatemeh Azadi - Saeed Torabzadeh - Arian Atefi
Intracranial Hemorrhage Classification using CBAM Attention Module and Convolutional Neural Networks
Parnian Rahimi - Marjan Naderan - Amir Jamshidnezhad - Shahram Rafie
FAST: FPGA Acceleration of Neural Networks Training
Alireza Borhani - Mohammad Hossein Goharinejad - Hamid Reza Zarandi
Using Deep Learning for Classification of Lung Cancer on CT Images in Ardabil Province
Mohammad Ali Javadzadeh Barzaki - Jafar Abdollahi - Mohammad Negaresh - Maryam Salimi - Hadi Zolfeghari - Mohsen Mohammadi - Asma Salmani - Rona Jannati - Firouz Amani
An interactive user groups recommender system based on reinforcement learning
Hediyeh Naderi Allaf - Mohsen Kahani
A Novel Method For Fake News Detection Based on Propagation Tree
Mansour Davoudi - Mohammad Reza Moosavi - Mohammad Hadi Sadreddini
Enhancing Vehicle Make and Model Recognition with 3D Attention Modules
Narges Semiromizadeh - Omid Nejati Manzari - Shahriar B. Shokouhi - Sattar Mirzakuchaki