Abstract

The Deep Residual Network in Network (DrNIN) model [18] is an important extension of the convolutional neural network (CNN) and has proven capable of scaling to dozens of layers. The model replaces the linear filter of the convolution with a nonlinear function, implemented as a multilayer perceptron (MLP) [23]. Increasing the depth of a DrNIN can improve classification and detection accuracy, but a deeper model becomes harder to train, training slows down, and the problem of diminishing feature reuse arises. To address these issues, this paper presents a detailed experimental study of the DrMLPconv block architecture and, based on it, proposes a wider variant of DrNIN in which the network's width is increased and its depth is decreased. We call the resulting model WDrNIN. Through experiments on the CIFAR-10 dataset, we show that WDrNIN models gain accuracy from increased width. Moreover, a single WDrNIN outperforms all MLPconv-based network models in both accuracy and efficiency, with WDrNIN-4-2 reaching 93.553% accuracy.
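To make the widening idea concrete, below is a minimal, hypothetical PyTorch sketch of a widened residual MLPconv block, assuming the "WDrNIN-d-k" naming encodes a depth d and a widening factor k that multiplies the channel count. The channel counts, kernel sizes, and exact block layout are illustrative assumptions, not the authors' published configuration.

```python
# Hypothetical sketch of a widened residual MLPconv block (not the authors' exact design).
import torch
import torch.nn as nn


class WideResidualMLPConv(nn.Module):
    """Residual MLPconv block: a spatial conv followed by two 1x1 convs
    (the per-pixel MLP of Network in Network), with channels scaled by k."""

    def __init__(self, in_channels: int, base_channels: int, k: int = 2):
        super().__init__()
        width = base_channels * k  # widening factor k multiplies the channel count
        self.body = nn.Sequential(
            nn.Conv2d(in_channels, width, kernel_size=3, padding=1),
            nn.BatchNorm2d(width),
            nn.ReLU(inplace=True),
            nn.Conv2d(width, width, kernel_size=1),  # 1x1 conv acts as a per-pixel MLP layer
            nn.BatchNorm2d(width),
            nn.ReLU(inplace=True),
            nn.Conv2d(width, width, kernel_size=1),
            nn.BatchNorm2d(width),
        )
        # Project the identity path when the channel count changes.
        self.shortcut = (
            nn.Identity()
            if in_channels == width
            else nn.Conv2d(in_channels, width, kernel_size=1)
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.relu(self.body(x) + self.shortcut(x))


# Example: a CIFAR-10-sized input through one block with widening factor k=2.
x = torch.randn(1, 3, 32, 32)
block = WideResidualMLPConv(in_channels=3, base_channels=16, k=2)
print(block(x).shape)  # torch.Size([1, 32, 32, 32])
```

Under this reading, a WDrNIN-4-2 network would stack a small number (4) of such blocks while doubling (k=2) their width, trading depth for width as the abstract describes.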

Details

Title
Wide deep residual networks in networks
Author
Alaeddine, Hmidi 1; Jihene, Malek 2

1 Monastir University, Faculty of Sciences of Monastir, Laboratory of Electronics and Microelectronics, LR99ES30, Monastir, Tunisia (GRID:grid.411838.7) (ISNI:0000 0004 0593 5040)
2 Monastir University, Faculty of Sciences of Monastir, Laboratory of Electronics and Microelectronics, LR99ES30, Monastir, Tunisia (GRID:grid.411838.7) (ISNI:0000 0004 0593 5040); Sousse University, Higher Institute of Applied Sciences and Technology of Sousse, Sousse, Tunisia (GRID:grid.7900.e) (ISNI:0000 0001 2114 4570)
Pages
7889-7899
Publication year
2023
Publication date
Feb 2023
Publisher
Springer Nature B.V.
ISSN
1380-7501
e-ISSN
1573-7721
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2769870211
Copyright
© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2022. Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.