Layer-wise knowledge distillation for simplified bipolar morphological neural networks
Informacionnye tehnologii i vyčislitelnye sistemy, no. 3 (2023), pp. 46-54.

Article record sourced from Math-Net.Ru.

Various neuron approximations can be used to reduce the computational complexity of neural networks. One such approximation, based only on summation and maximum operations, is the bipolar morphological neuron. This paper presents an improved structure of the bipolar morphological neuron that enhances its computational efficiency, together with a new training approach based on continuous approximations of the maximum and on knowledge distillation. Experiments were conducted on the MNIST dataset with a LeNet-like neural network architecture and on the CIFAR10 dataset with a ResNet-22 model architecture. The proposed training method achieves 99.45% classification accuracy with the LeNet-like model, matching the accuracy of the classical network, and 86.69% accuracy with the ResNet-22 model, versus 86.43% for the classical model. The results show that the proposed method, with the log-sum-exp (LSE) approximation of the maximum and layer-by-layer knowledge distillation, yields a simplified bipolar morphological network that is not inferior to classical networks.
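The log-sum-exp approximation of the maximum mentioned in the abstract can be sketched as follows. This is a minimal illustration of the standard smooth-max identity, not code from the paper; the function name `lse_max` and the sharpness parameter `alpha` are illustrative choices.

```python
import numpy as np

def lse_max(x, alpha=10.0, axis=-1):
    """Smooth log-sum-exp approximation of the maximum:
        max_i x_i  ~  (1/alpha) * log(sum_i exp(alpha * x_i)).
    The approximation is always >= the true max and tightens
    toward it as alpha grows, while staying differentiable
    everywhere -- which is what makes gradient-based training
    of maximum-based (morphological) neurons possible."""
    # Subtract the max before exponentiating for numerical stability.
    m = np.max(x, axis=axis, keepdims=True)
    s = np.sum(np.exp(alpha * (x - m)), axis=axis)
    return np.squeeze(m, axis=axis) + np.log(s) / alpha

x = np.array([0.1, 0.5, 2.0])
loose = lse_max(x, alpha=1.0)   # noticeably above max(x) = 2.0
tight = lse_max(x, alpha=50.0)  # very close to max(x) = 2.0
```

Smaller `alpha` gives a smoother surrogate (easier to train through), larger `alpha` gives a closer approximation; a schedule that increases `alpha` during training is one common design choice for such surrogates.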
Keywords: bipolar morphological networks, approximations, artificial neural networks, computational efficiency.
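The layer-by-layer knowledge distillation referenced above can be sketched as a loss that pulls each intermediate activation of the simplified (student) network toward the corresponding activation of the classical (teacher) network. This is a generic illustration under the assumption of matching layer shapes, not the authors' exact objective; the function name `layerwise_distillation_loss` is hypothetical.

```python
import numpy as np

def layerwise_distillation_loss(student_acts, teacher_acts):
    """Sum of per-layer mean-squared errors between matching
    intermediate activations of a student and a teacher network.
    Minimizing this trains the simplified network layer by layer
    to reproduce the classical network's internal representations."""
    assert len(student_acts) == len(teacher_acts)
    return sum(float(np.mean((s - t) ** 2))
               for s, t in zip(student_acts, teacher_acts))

rng = np.random.default_rng(0)
teacher = [rng.normal(size=(4, 8)) for _ in range(3)]  # 3 layers of activations
student = [t + 0.1 * rng.normal(size=t.shape) for t in teacher]
loss = layerwise_distillation_loss(student, teacher)  # small positive value
```

In practice this per-layer term is typically combined with the ordinary task loss on the final output; distilling layer by layer gives the maximum-based layers a dense training signal even when their gradients through the full network are weak.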
@article{ITVS_2023_3_a4,
     author = {M. V. Zingerenko and E. E. Limonova},
     title = {Layer-wise knowledge distillation for simplified bipolar morphological neural networks},
     journal = {Informacionnye tehnologii i vy\v{c}islitelnye sistemy},
     pages = {46--54},
     publisher = {mathdoc},
     number = {3},
     year = {2023},
     language = {ru},
     url = {https://geodesic-test.mathdoc.fr/item/ITVS_2023_3_a4/}
}