Muhammad Usman Jamal
Hardware Accelerators for Long short-term memory Neural Networks using High Level Synthesis (HLS).
Rel. Luciano Lavagno. Politecnico di Torino, Corso di laurea magistrale in Ingegneria Elettronica (Electronic Engineering), 2018
License: Creative Commons Attribution Non-commercial No Derivatives.
Abstract
Long short-term memory networks, referred to as LSTMs, are a notable class of recurrent neural networks: they mitigate the vanishing gradient problem that hampers the training of plain recurrent networks. Applications of LSTMs include speech recognition, handwriting generation and recognition, and music generation and composition. FPGA-based hardware accelerators have recently attracted attention thanks to their favorable combination of power efficiency and flexibility. In this thesis, hardware accelerators for LSTM are implemented, synthesized, and optimized with different data types using the Xilinx Vivado tool. The data types used are 16-bit fixed point, single-precision float, and double-precision double. One of the bottlenecks encountered during synthesis is the sigmoid activation function, which is non-linear; a piecewise linear approximation of the sigmoid is used to overcome this issue.
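As a sketch of the idea, the widely used PLAN scheme (Amin et al.) approximates the sigmoid with a few linear segments whose slopes are powers of two, so each multiplication reduces to a shift in hardware; the exact breakpoints chosen in the thesis may differ, and this standalone software model is only illustrative:

```cpp
#include <cmath>

// Piecewise linear approximation of sigma(x) = 1 / (1 + exp(-x)).
// Breakpoints and coefficients follow the PLAN approximation
// (an assumption here, not necessarily the thesis's exact table).
// Slopes 0.25, 0.125, 0.03125 are powers of two, so an HLS tool
// can map each multiply to a simple shift.
float sigmoid_pwl(float x) {
    float ax = std::fabs(x);
    float y;
    if (ax >= 5.0f)        y = 1.0f;                       // saturation region
    else if (ax >= 2.375f) y = 0.03125f * ax + 0.84375f;
    else if (ax >= 1.0f)   y = 0.125f   * ax + 0.625f;
    else                   y = 0.25f    * ax + 0.5f;
    // Exploit the symmetry sigma(-x) = 1 - sigma(x): only the
    // positive half needs to be stored in hardware.
    return (x >= 0.0f) ? y : 1.0f - y;
}
```

The maximum absolute error of this approximation against the true sigmoid is on the order of 2%, which is typically acceptable for LSTM inference while avoiding the exponential and division that make the exact function expensive to synthesize.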
