A Self-Distillation Assisted ResNet-KL Image Classification Network

EasyChair Preprint 10663, 6 pages. Date: August 3, 2023

Abstract

Traditional ResNet models suffer from large model size and high computational complexity. In this study, we propose a self-distillation assisted ResNet-KL image classification method to address the low accuracy and efficiency issues in image classification tasks. Firstly, we introduce depthwise separable convolutions into the ResNet network and enhance the model's classification performance by improving the design of the activation function, using T-ReLU in place of the traditional ReLU. Secondly, we enhance the model's perception of features at different scales by incorporating multi-scale convolutions to fuse the residual layers and attention mechanism layers. To reduce the model's parameter count, we combine feature distillation with logit distillation and optimize the model layer by layer through self-distillation, while applying pruning techniques multiple times to reduce its size. Finally, to assess the efficacy of our methodology, we conduct experimental evaluations on the public datasets CIFAR-10, CIFAR-100, and STL-10. The results show that the improved ResNet-KL network achieves accuracy improvements of 1.65%, 2.72%, and 0.36% over traditional ResNet models on these datasets, respectively. Our method obtains better classification performance with the same computational resources, making it promising for applications in tasks such as object classification.

Keyphrases: Pruning, ResNet, image classification, self-distillation
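The depthwise separable residual block described in the abstract can be sketched roughly as follows in PyTorch. The exact form of T-ReLU is not specified here, so a plain ReLU placeholder (`TReLU`) stands in for it, and the channel sizes and layer arrangement are illustrative assumptions rather than the authors' precise design.

```python
import torch
import torch.nn as nn

class TReLU(nn.Module):
    """Placeholder for the paper's T-ReLU activation; a standard ReLU is used
    here because the abstract does not specify the exact formulation."""
    def forward(self, x):
        return torch.relu(x)

class DepthwiseSeparableResidualBlock(nn.Module):
    """Residual block whose 3x3 convolution is replaced by a depthwise separable
    convolution (depthwise 3x3 followed by pointwise 1x1)."""
    def __init__(self, channels):
        super().__init__()
        self.depthwise = nn.Conv2d(channels, channels, kernel_size=3, padding=1,
                                   groups=channels, bias=False)
        self.pointwise = nn.Conv2d(channels, channels, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(channels)
        self.act = TReLU()

    def forward(self, x):
        out = self.pointwise(self.depthwise(x))
        out = self.bn(out)
        return self.act(out + x)  # residual (skip) connection
```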
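A minimal sketch of a combined logit-and-feature self-distillation loss of the kind the abstract describes is given below, assuming the common formulation: a KL-divergence term on temperature-softened logits for logit distillation plus an L2 term on intermediate features for feature distillation. The temperature, loss weights, and layer pairing are assumptions, not values taken from the paper.

```python
import torch
import torch.nn.functional as F

def self_distillation_loss(student_logits, teacher_logits,
                           student_feat, teacher_feat,
                           labels, temperature=4.0, alpha=0.5, beta=0.1):
    """Combine cross-entropy on labels, logit distillation (KL divergence on
    softened probabilities), and feature distillation (L2 between intermediate
    features). Hyperparameter values here are illustrative assumptions."""
    ce = F.cross_entropy(student_logits, labels)
    kl = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    feat = F.mse_loss(student_feat, teacher_feat)
    return (1 - alpha) * ce + alpha * kl + beta * feat
```

In a self-distillation setup, the "teacher" outputs typically come from deeper stages (or a previous snapshot) of the same network, so no separate teacher model needs to be trained.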