14th International Conference on Computer and Knowledge Engineering
Towards Efficient Capsule Networks through Approximate Squash Function and Layer-wise Quantization
Authors:
Mohsen Raji, Kimia Soroush, Amir Ghazizadeh (Shiraz University)
Keywords:
Capsule Networks, Accelerated Squash Function, Post-Training Quantization, Quantization Optimization
Abstract:
Capsule networks (CapsNets) have emerged as a promising architecture for various machine learning tasks due to their ability to capture hierarchical relationships within data. However, this architecture relies on computationally intensive operations, particularly in the squash function, which involves square-root calculations. In addition, CapsNets consume a large amount of memory because of their high parameter count, which makes them difficult to deploy on resource-constrained devices. In this paper, we take advantage of approximate computing and quantization to improve the efficiency of CapsNets. An approximate squash function is proposed based on the Fast Inverse Square Root (FISR) algorithm to accelerate square-root operations, offering a remarkable speedup of up to 4 times compared to conventional methods. Additionally, we propose a novel algorithm called Least Sensitive Layer First (LSLF) to reduce the memory consumption of CapsNets. LSLF aggressively quantizes the layers that are most tolerant of quantization error while quantizing the more error-sensitive layers only moderately. Our experimental results demonstrate the effectiveness of LSLF in enhancing the efficiency and performance of CapsNets, paving the way for more scalable and resource-efficient deep learning systems.
Papers List
List of archived papers
Exploring 3D Transfer Learning CNN Models for Alzheimer’s Disease Diagnosis from MRI Images
Fatemehsadat Ghanadi Ladani - Hamidreza Baradaran Kashani
DPRNN-FORMER: AN EFFICIENT WAY TO DEAL WITH BLIND SOURCE SEPARATION
Ramin Ghorbani - Sajad Haghzad Klidbary
Dynamic Knowledge Enhanced Neural Fashion Trend Forecasting with Quantile Loss
Fatemeh Rooholamini - Reza Azmi - Mobina Khademhossein - Maral Zarvani
ExaASC: A General Target-Based Stance Detection Corpus in Arabic Language
Mohammad Mehdi Jaziriyan - Ahmad Akbari - Hamed Karbasi
Adaptive-A-GCRNN: Enhancing Real-time Multi-band Spectrum Prediction through Attention-based Spatial-Temporal Modeling
Seyed Majid Hosseini - Seyedeh Mozhgan Rahmatinia - Seyed Amin Hosseini Seno - Hadi Sadoghi Yazdi
Multi-Task Transformer for Stock Market Trend Prediction
Seyed Morteza Mirjebreili - Ata Solouki - Hamidreza Soltanalizadeh - Mohammad Sabokrou
Impossible differential and zero-correlation linear cryptanalysis of Marx, Marx2, Chaskey and Speck32
Mahshid Saberi - Nasour Bagheri - Sadegh Sadeghi
Traffic Sign Recognition Using Local Vision Transformer
Ali Farzipour - Omid Nejati Manzari - Shahriar B. Shokouhi
PowerLinear Activation Functions with application to the first layer of CNNs
Kamyar Nasiri - Kamaledin Ghiasi-Shirazi
Camouflage Object Segmentation with Attention-Guided Pix2Pix and Boundary Awareness
Erfan Akbarnezhad Sany - Fatemeh Naserizadeh - Parsa Sinichi - Seyyed Abed Hosseini