An Efficient and Fast Softmax Hardware Architecture (EFSHA) for Deep Neural Networks

Muhammad Awais Hussain, Tsung Han Tsai

Research output: Contribution to Book/Report › Conference contribution › Peer-reviewed

12 Citations (Scopus)

Abstract

Deep neural networks (DNNs) are widely used in computer vision applications due to their high performance. However, DNNs involve a large number of computations in both the training and inference phases. Among the different layers of a DNN, the softmax layer performs one of the most complex computations, as it involves exponent and division operations. A hardware-efficient implementation is therefore required to reduce on-chip resource usage. In this paper, we propose a new hardware-efficient and fast implementation of the softmax activation function. The proposed hardware implementation consumes fewer hardware resources and operates at higher speed than state-of-the-art techniques.
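For reference, the softmax operation the abstract refers to maps a vector of logits z to exp(z_i) / sum_j exp(z_j). The sketch below is a plain floating-point software version (with the usual max-subtraction step for numerical stability), shown only to illustrate the exponent and division operations that make this layer costly in hardware; it is not the hardware architecture proposed in the paper.

#include <math.h>
#include <stddef.h>

/* Reference software softmax over n logits. Illustrative only:
   a standard floating-point implementation, not the proposed
   EFSHA hardware design. */
void softmax(const float *logits, float *probs, size_t n)
{
    /* Subtract the maximum logit so expf() does not overflow. */
    float max = logits[0];
    for (size_t i = 1; i < n; i++)
        if (logits[i] > max)
            max = logits[i];

    /* Exponent operation for every input; accumulate the sum. */
    float sum = 0.0f;
    for (size_t i = 0; i < n; i++) {
        probs[i] = expf(logits[i] - max);
        sum += probs[i];
    }

    /* Division operation to normalise into probabilities. */
    for (size_t i = 0; i < n; i++)
        probs[i] /= sum;
}

The exponent and the division loops above are exactly the operations that dedicated softmax hardware seeks to approximate or simplify.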

Original language: English
Title of host publication: 2021 IEEE 3rd International Conference on Artificial Intelligence Circuits and Systems, AICAS 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781665419130
DOIs
Publication status: Published - 6 Jun 2021
Event: 3rd IEEE International Conference on Artificial Intelligence Circuits and Systems, AICAS 2021 - Washington, United States
Duration: 6 Jun 2021 - 9 Jun 2021

Publication series

Name: 2021 IEEE 3rd International Conference on Artificial Intelligence Circuits and Systems, AICAS 2021


Conference: 3rd IEEE International Conference on Artificial Intelligence Circuits and Systems, AICAS 2021
Country/Territory: United States
City: Washington
Period: 6/06/21 - 9/06/21

