13th International Conference on Computer and Knowledge Engineering
Enhancing Lighter Neural Network Performance with Layer-wise Knowledge Distillation and Selective Pixel Attention
Authors:
Siavash Zaravashan (1), Sajjad Torabi (2), Hesam Zaravashan (3)
1- Iran University of Science and Technology
2- Part AI Research Center
3- Iran University of Science and Technology
Keywords:
Knowledge distillation, Image classification, Deep learning, Model compression, Computer vision
Abstract:
Deep neural networks have revolutionized many areas of artificial intelligence, such as image classification and object detection. However, most of these models are heavy and slow at inference time. Real-time applications require lighter models, yet reducing model size can hurt performance. One approach to enhancing the capability of smaller models is knowledge distillation, in which a smaller student model learns from and emulates a larger teacher model to improve its performance. This technique has proven effective for compressing models while increasing their accuracy. Previous methods mostly propose feature transformations and loss functions between features at the same level of the teacher and student. In this paper, we investigate the effect of connections between different layers of the teacher and student networks and show that these connections are highly important. We employ three loss functions, attention-driven distillation, hierarchical context loss, and non-local loss, to enhance the classifier's overall performance. Extensive experiments on the CIFAR-100 dataset show that the proposed method yields a substantial performance improvement. For the benefit of the research community, we will make the code for this study available on GitHub.
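The abstract describes a student network trained to mimic a teacher through cross-layer feature connections on top of the usual soft-label objective. The sketch below is a minimal PyTorch-style illustration of that general setup, written under stated assumptions: the function names (attention_map, layerwise_kd_loss, total_loss), the layer-pairing scheme, and the loss weights T, alpha, and beta are hypothetical choices, not the paper's exact formulation, and the hierarchical context and non-local losses are omitted.

# Minimal layer-wise knowledge distillation sketch (PyTorch assumed).
# Hypothetical illustration only: layer pairings, weights, and the
# attention-map formulation are assumptions, not the paper's method.
import torch
import torch.nn.functional as F

def attention_map(feat):
    # Collapse channels of a (B, C, H, W) feature map into a spatial
    # attention map and L2-normalize it per sample.
    att = feat.pow(2).mean(dim=1)              # (B, H, W)
    return F.normalize(att.flatten(1), dim=1)  # (B, H*W)

def layerwise_kd_loss(student_feats, teacher_feats, pairs):
    # `pairs` lists (student_layer_idx, teacher_layer_idx) connections,
    # which may link layers at different depths of the two networks.
    loss = 0.0
    for s_idx, t_idx in pairs:
        s_feat, t_feat = student_feats[s_idx], teacher_feats[t_idx]
        if s_feat.shape[-2:] != t_feat.shape[-2:]:
            # Match spatial resolution when paired layers differ in size.
            s_feat = F.interpolate(s_feat, size=t_feat.shape[-2:],
                                   mode="bilinear", align_corners=False)
        loss = loss + F.mse_loss(attention_map(s_feat), attention_map(t_feat))
    return loss / len(pairs)

def total_loss(logits_s, logits_t, labels, s_feats, t_feats, pairs,
               T=4.0, alpha=0.5, beta=1.0):
    # Hard-label cross-entropy + temperature-softened KL to the teacher
    # + the layer-wise feature term above.
    ce = F.cross_entropy(logits_s, labels)
    kd = F.kl_div(F.log_softmax(logits_s / T, dim=1),
                  F.softmax(logits_t / T, dim=1),
                  reduction="batchmean") * T * T
    feat = layerwise_kd_loss(s_feats, t_feats, pairs)
    return ce + alpha * kd + beta * feat

In practice, the intermediate feature maps would be collected with forward hooks on both networks, and `pairs` would enumerate the cross-layer connections whose effect the paper studies.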
Papers List
List of archived papers
Islamic Geometric algorithms: A survey
Elham Akbari - Azam Bastanfard
Blind Load-Balancing Algorithm using Double-Q-learning in the Fog Environment
Niloofar Tahmasebi pouya - Mehdi Agha Sarram
An Overview of Regression Methods in Early Prediction of Movie Ratings
Houmaan Chamani - Zhivar Sourati Hassanzadeh - Behnam Bahrak
Distilling Knowledge from CNN-Transformer Models for Enhanced Human Action Recognition
Hamid Ahmadabadi - Omid Nejati Manzari - Ahmad Ayatollahi
An Analysis of Botnet Detection Using Graph Neural Network
Faezeh Alizadeh - Mohammad Khansari
A Systematic Embedded Software Design Flow for Robotic Applications
Navid Mahdian - Seyed-Hosein Attarzadeh-Niaki - Armin Salimi-Badr
T-Rank: Graph Data Analytics for Urban Traffic Modeling
Alireza Safarpour - Iman Gholampour - Amirhossain Aghazadeh Fard - Seyed Mohammad Karbasi
New Design of Efficient Reversible Quantum Saturation Adder
Negin Mashayekhi - Mohammad Reza Reshadinezhad - Shekoofeh Moghimi
UAV-based Firefighting by Multi-agent Reinforcement Learning
Reza Shami Tanha - Mohsen Hooshmand - Mohsen Afsharchi
Underwater Image Super-Resolution using Generative Adversarial Network-based Model
Alireza Aghelan - Modjtaba Rouhani