Abstract
High latency and power consumption are two major problems that need to be addressed in convolutional neural networks (CNNs). In this paper, the convolutional layer is replaced with a discrete-time cellular neural network (CellNN) to overcome these problems. Multiple configurations of CellNNs are trained in the TensorFlow framework to classify objects from the CIFAR-10 dataset. The effects of the number of iterations, the number of channels, batch normalization, and the activation function on classification accuracy are presented. It is shown that TensorFlow is capable of training discrete-time CellNNs. Although the accuracies of the proposed networks on CIFAR-10 are slightly lower than those of existing CNNs, their reduced parameter counts and fewer multiply-accumulate operations (MACs) mean that the power consumption and computation time of our networks will be lower than those of CNNs.
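The abstract does not spell out the layer details, but a discrete-time CellNN commonly follows the recursion x[k+1] = A*y[k] + B*u + z with y[k] = f(x[k]), where A and B are small learnable templates and the update is unrolled for a fixed number of iterations. The sketch below illustrates how such a layer could be expressed in TensorFlow; the class name, initial state, template sizes, and hyper-parameters are assumptions for illustration, not the authors' exact configuration.

```python
import tensorflow as tf


class DTCellNNLayer(tf.keras.layers.Layer):
    """Illustrative sketch of a discrete-time cellular neural network layer.

    State update: x[k+1] = A*y[k] + B*u + z, with y[k] = f(x[k]).
    A (feedback) and B (feed-forward) are modeled as learnable 3x3
    convolutions; z is the bias of the B convolution. All choices here
    are assumptions, not the configuration used in the paper.
    """

    def __init__(self, channels=16, iterations=5, activation=tf.nn.tanh):
        super().__init__()
        self.iterations = iterations
        self.activation = activation
        # Feedback template A acts on the cell outputs y[k].
        self.template_a = tf.keras.layers.Conv2D(
            channels, 3, padding="same", use_bias=False)
        # Feed-forward template B acts on the constant input u.
        self.template_b = tf.keras.layers.Conv2D(
            channels, 3, padding="same", use_bias=True)

    def call(self, u):
        b_u = self.template_b(u)          # B*u + z, constant across iterations
        x = b_u                           # initial state (an assumption)
        for _ in range(self.iterations):  # fixed number of discrete-time steps
            y = self.activation(x)        # cell output y[k] = f(x[k])
            x = self.template_a(y) + b_u  # x[k+1] = A*y[k] + B*u + z
        return self.activation(x)


# Hypothetical usage: a small CIFAR-10 classifier with the CellNN layer
# standing in for a convolutional layer.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32, 32, 3)),
    DTCellNNLayer(channels=16, iterations=5),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])
```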
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 4171-4178 |
| Number of pages | 8 |
| Journal | International Journal of Circuit Theory and Applications |
| Volume | 50 |
| Issue number | 11 |
| DOIs | |
| Publication status | Published - Nov 2022 |
Bibliographical note
Publisher Copyright: © 2022 John Wiley & Sons Ltd.
Keywords
- cellular neural network
- convolutional neural network
- deep learning
- image processing