14th International Conference on Computer and Knowledge Engineering
Non-Negative Matrix Factorization improves Residual Neural Networks
Authors:
Hojjat Moayed (Esfarayen University of Technology)
Keywords:
ResNet, Residual Neural Network, NMF, Deep Learning
Abstract:
Residual neural networks enable the use of very deep architectures. These architectures benefit from passing identity information from one layer directly to subsequent layers. Extensive research has been conducted to improve the performance of residual neural networks. In this paper, we propose a method that improves performance by providing a more informative input using non-negative matrix factorization. The method combines invariant features learned from the training data with the extracted, fine-tuned features at the end of the residual block. Our experimental results confirm that the proposed architecture improves performance on the image classification task.
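The abstract relies on non-negative matrix factorization, which approximates a non-negative data matrix V as the product of two smaller non-negative matrices W and H. The following is a minimal sketch of that factorization step using the classic multiplicative-update rule, not the authors' implementation; the function name `nmf`, the toy matrix `V`, and the chosen `rank` are illustrative assumptions.

```python
import numpy as np

def nmf(V, rank, n_iter=200, eps=1e-9):
    # Multiplicative-update NMF (Lee & Seung): find non-negative W, H
    # with V ≈ W @ H. The updates preserve non-negativity because they
    # only multiply by ratios of non-negative quantities.
    rng = np.random.default_rng(0)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update coefficients
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update basis features
    return W, H

# Toy non-negative data standing in for flattened image features.
V = np.random.default_rng(1).random((20, 16))
W, H = nmf(V, rank=4)

# Relative reconstruction error of the low-rank approximation.
recon_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

In the proposed architecture, rows of a factor like `W` would serve as the invariant features that are combined with the residual block's output; the exact combination is described in the paper itself.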
Papers List
List of archived papers
Spatial-channel attention-based stochastic neighboring embedding pooling and long short term memory for lung nodules classification
Ahmed Saihood - Hossein Karshenas - Ahmadreza Naghsh Nilchi
Spatio-Temporal Graph Neural Networks for Accurate Crime Prediction
Rojan Roshankar - Mohammad Reza Keyvanpour
ExaAEC: A New Multi-label Emotion Classification Corpus in Arabic Tweets
Saeed Sarbazi-Azad - Ahmad Akbari - Mohsen Khazeni
Delay Optimization of a Federated Learning-based UAV-aided IoT network
Hossein Mohammadi Firouzjaei - Javad Zeraatkar Moghaddam - Mehrdad Ardebilipour
Distilled BERT Model In Natural Language Processing
Yazdan Zandiye Vakili - Avisa Fallah - Hedieh Sajedi
Pruning and Mixed Precision Techniques for Accelerating Neural Network
Mahsa Zahedi - Mohammad Sediq Abazari Bozhgani - Abdorreza Savadi
Histopathology Image-Based Cancer Classification Utilizing Transfer Learning Approach
Amir Meydani - Alireza Meidani - Ali Ramezani - Maryam Shabani - Mohammad Mehdi Kazeminasab - Shahriar Shahablavasani
Efficient Object Detection using Deep Reinforcement Learning and Capsule Networks
Sobhan Siamak - Eghbal Mansoori
Towards Efficient Capsule Networks through Approximate Squash Function and Layer-wise Quantization
Mohsen Raji - Kimia Soroush - Amir Ghazizadeh
MIPS-Core Application Specific Instruction-Set Processor for IDEA Cryptography − Comparison between Single-Cycle and Multi-Cycle Architectures
Ahmad Ahmadi - Reza Faghih Mirzaee