Abstract
Bias temperature instability (BTI) is a time-dependent degradation mechanism that seriously degrades the performance of analog and digital integrated circuits. Because the phenomenon is increasingly probabilistic, machine learning-based modeling approaches can deliver more accurate results in this context than analytical methods. In this paper, the Long Short-Term Memory (LSTM) method, a time-series approach, is adopted to model BTI in 40 nm CMOS p-type metal-oxide-semiconductor field-effect transistors (MOSFETs). The aging model is established by training on experimental data collected from a dedicated test chip, using a bi-directional LSTM structure. Mean-square error (MSE) results indicate that the model interpolates with high accuracy when the test data falls within the interval spanned by the training data. Moreover, the model yields promising results in extrapolation exercises, where the test data lies outside the training range. This property potentially qualifies the proposed approach for time-to-market and cost-reduction efforts.
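The abstract does not disclose the network's implementation details, but the bi-directional LSTM structure it names can be sketched in plain NumPy: one LSTM pass over the stress-time sequence forward, one backward, with the two hidden states concatenated per time step. All dimensions, the gate ordering, and the (time, voltage)-style inputs below are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

def lstm_cell(x, h, c, W, U, b):
    """One LSTM step: input/forget/output gates and candidate from x and h."""
    z = W @ x + U @ h + b              # stacked gate pre-activations, shape (4H,)
    H = h.shape[0]
    i = 1 / (1 + np.exp(-z[:H]))       # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))    # forget gate
    o = 1 / (1 + np.exp(-z[2*H:3*H]))  # output gate
    g = np.tanh(z[3*H:])               # candidate cell state
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def bilstm(seq, params_f, params_b):
    """Run the sequence in both directions; concatenate hidden states per step."""
    H = params_f[1].shape[1]
    out_f, out_b = [], []
    h, c = np.zeros(H), np.zeros(H)
    for x in seq:                       # forward direction
        h, c = lstm_cell(x, h, c, *params_f)
        out_f.append(h)
    h, c = np.zeros(H), np.zeros(H)
    for x in reversed(seq):             # backward direction
        h, c = lstm_cell(x, h, c, *params_b)
        out_b.append(h)
    out_b.reverse()                     # realign backward outputs with time
    return np.stack([np.concatenate(p) for p in zip(out_f, out_b)])

rng = np.random.default_rng(0)
D, H, T = 2, 8, 16                      # hypothetical input dim, hidden size, steps
make = lambda: (rng.normal(size=(4*H, D)) * 0.1,
                rng.normal(size=(4*H, H)) * 0.1,
                np.zeros(4*H))
seq = rng.normal(size=(T, D))           # stand-in for (stress time, bias) samples
y = bilstm(seq, make(), make())
print(y.shape)                          # (T, 2*H): both directions' features
```

In practice a dense readout layer would map each 2*H-dimensional feature vector to a predicted degradation value (e.g. threshold-voltage shift), trained against the MSE loss the abstract reports.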
| Original language | English |
|---|---|
| Article number | e70059 |
| Journal | International Journal of Numerical Modelling: Electronic Networks, Devices and Fields |
| Volume | 38 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 1 May 2025 |
Bibliographical note
Publisher Copyright: © 2025 John Wiley & Sons Ltd.
Keywords
- LSTM
- Long Short-Term Memory
- ML
- NBTI
- Negative Bias Temperature Instability
- integrated circuits
- machine learning
- reliability