
M. Cococcioni, F. Rossi, E. Ruffaldi, and S. Saponara, "A fast approximation of the hyperbolic tangent when using posit numbers and its application to deep neural networks," ApplePies'19, 2019.


Deep Neural Networks (DNNs) are being used in an ever-growing number of fields, and automotive is among those exploiting them most heavily. An important consideration is the real-time constraint that such applications place on neural network architectures, which creates a need for fast, hardware-friendly information representations. The recently proposed posit format has proved to be an extremely efficient low-bit replacement for traditional floats. Its encoding has already enabled the construction of a fast approximation of the sigmoid function, an activation function frequently used in DNNs. In this paper we present a fast approximation of another activation function widely used in DNNs: the hyperbolic tangent. In our experiments, we show how the approximated hyperbolic tangent outperforms its approximated sigmoid counterpart. The implication is clear: the posit format again proves itself DNN-friendly, with important practical consequences.
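The fast sigmoid the abstract refers to is a known bit-level trick on posits with es = 0: flipping the sign bit of the posit bit pattern and shifting right by two yields a close approximation of sigmoid(x) with no arithmetic at all. Below is a minimal C sketch, not taken from the paper, illustrating that trick on 8-bit posit<8,0> values; the decoder and the names posit8_to_double and fast_sigmoid_p8 are ours, and the decoder exists only to check the approximation against the exact sigmoid. The paper's tanh approximation, built from the identity tanh(x) = 2*sigmoid(2x) - 1 applied directly to bit patterns, is not reproduced here.

```c
#include <stdint.h>
#include <stdio.h>
#include <math.h>

/* Decode a posit<8,0> bit pattern into a double, for verification only. */
static double posit8_to_double(uint8_t p) {
    if (p == 0x00) return 0.0;
    if (p == 0x80) return NAN;                 /* NaR (not a real) */
    int neg = p & 0x80;
    if (neg) p = (uint8_t)(-p);                /* two's-complement negate */
    int regime_bit = (p >> 6) & 1;             /* first bit after the sign */
    int run = 0;
    uint8_t mask = 0x40;
    while (mask && (((p & mask) != 0) == regime_bit)) { run++; mask >>= 1; }
    int k = regime_bit ? run - 1 : -run;       /* regime value (es = 0) */
    int frac_bits = 6 - run;                   /* bits left after the terminator */
    double frac = 0.0;
    if (frac_bits > 0)
        frac = (double)(p & ((1u << frac_bits) - 1)) / (double)(1u << frac_bits);
    double v = ldexp(1.0 + frac, k);           /* (1 + frac) * 2^k */
    return neg ? -v : v;
}

/* Fast sigmoid on posit<8,0>: flip the sign bit, logical shift right by 2.
 * The paper derives an analogous cheap approximation of tanh from the
 * identity tanh(x) = 2*sigmoid(2x) - 1, again working on the bit patterns. */
static uint8_t fast_sigmoid_p8(uint8_t x) {
    return (uint8_t)((x ^ 0x80) >> 2);
}

int main(void) {
    /* Compare the bit-level approximation with the exact sigmoid
       at a few posit<8,0> sample points (x = 0, 0.5, 1, 2, -1, -2). */
    const uint8_t samples[] = { 0x00, 0x20, 0x40, 0x60, 0xC0, 0xA0 };
    for (size_t i = 0; i < sizeof samples; i++) {
        double x = posit8_to_double(samples[i]);
        double approx = posit8_to_double(fast_sigmoid_p8(samples[i]));
        double exact = 1.0 / (1.0 + exp(-x));
        printf("x = %+7.4f  approx = %.4f  exact = %.4f\n", x, approx, exact);
    }
    return 0;
}
```

For example, for x = 1 (posit bits 0x40) the trick yields 0x30, which decodes to 0.75 against an exact sigmoid value of about 0.731. The trick is exact at x = 0 and degrades gracefully toward the tails, which is why it is attractive as a drop-in DNN activation.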

Keywords: Deep Neural Networks (DNNs), Posit, Activation functions

File: https://www.researchgate.net/publication/338886541_A_Fast_Approximation_of_the_Hyperbolic_Tangent_when_Using_Posit_Numbers_and_its_Application_to_Deep_Neural_Networks