Abstract
Stochastic computing (SC) is an emerging paradigm that offers hardware-efficient solutions for developing low-cost and noise-robust architectures. In SC, deterministic logic systems are employed along with bit-stream sources to process scalar values. However, using long bit-streams introduces challenges, such as increased latency and significant energy consumption. To address these issues, we present an optimization-oriented approach for modeling and sizing new logic gates, which results in optimal latency. The optimization process is automated using hardware-software cooperation by integrating Cadence and MATLAB environments. Initially, we optimize the circuit topology by leveraging the design parameters of two-input basic logic gates. This optimization is performed using a multiobjective approach based on a deep neural network. Subsequently, we employ the proposed gates to demonstrate favorable solutions targeting SC-based operations.
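As context for the abstract, a minimal sketch of unipolar SC can illustrate how deterministic logic gates process scalar values encoded as bit-streams, and why stream length drives latency. This is not code from the paper; the encoding, names, and parameters below are illustrative assumptions. In unipolar SC, a value p in [0, 1] is represented by a stream whose bits are 1 with probability p, and a single AND gate then multiplies two such values:

```python
import random

def to_bitstream(p, n, rng):
    # Illustrative unipolar SC encoding: each bit is 1 with probability p.
    return [1 if rng.random() < p else 0 for _ in range(n)]

def from_bitstream(bits):
    # Decode by counting the fraction of 1s in the stream.
    return sum(bits) / len(bits)

rng = random.Random(42)
n = 100_000  # stream length: longer streams mean better accuracy but higher latency

a = to_bitstream(0.5, n, rng)
b = to_bitstream(0.8, n, rng)

# A two-input AND gate acts as a multiplier on independent unipolar streams:
# P(a_i AND b_i = 1) = P(a_i = 1) * P(b_i = 1) = 0.5 * 0.8 = 0.4
prod = [x & y for x, y in zip(a, b)]
est = from_bitstream(prod)
```

The estimate converges to 0.4 only as n grows, which is the latency/energy trade-off the letter targets: each extra bit of precision requires a longer stream and thus more clock cycles through the gate.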
Original language | English
---|---
Pages (from-to) | 190-193
Number of pages | 4
Journal | IEEE Embedded Systems Letters
Volume | 15
Issue number | 4
DOIs |
Publication status | Published - 1 Dec 2023
Bibliographic note
Publisher Copyright: © 2009-2012 IEEE.
Funding
This work was supported in part by the National Science Foundation (NSF) under Grant 2019511, and in part by the Louisiana Board of Regents Support Fund under Grant LEQSF(2020-23)-RD-A-26.
Funders | Funder number
---|---
Louisiana Board of Regents Support Fund | LEQSF(2020-23)-RD-A-26
National Science Foundation | 2019511