Boston University Libraries OpenBU
    Design space exploration of neural network activation function circuits

    Date Issued
    2019-10
    Publisher Version
    https://doi.org/10.1109/tcad.2018.2871198
    Author(s)
    Yang, Tao
    Wei, Yadong
    Tu, Zhijun
    Zeng, Haolun
    Kinsy, Michel A.
    Zheng, Nanning
    Ren, Pengju
    Permanent Link
    https://hdl.handle.net/2144/39100
    Version
    Accepted manuscript
    Citation (published version)
    Tao Yang, Yadong Wei, Zhijun Tu, Haolun Zeng, Michel A. Kinsy, Nanning Zheng, Pengju Ren. 2019. "Design Space Exploration of Neural Network Activation Function Circuits." IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, Volume 38, Issue 10, pp. 1974-1978. https://doi.org/10.1109/tcad.2018.2871198
    Abstract
    The widespread application of artificial neural networks has prompted researchers to experiment with field-programmable gate array (FPGA) and custom ASIC designs to speed up their computation. These implementation efforts have generally focused on the weight-multiplication and signal-summation operations, and less on the activation functions used in these applications. Yet efficient hardware implementations of nonlinear activation functions such as the exponential linear unit (ELU), scaled ELU (SELU), and hyperbolic tangent (tanh) are central to designing effective neural network accelerators, since these functions are resource-intensive. In this paper, we explore efficient hardware implementations of activation functions using purely combinational circuits, focusing on two widely used nonlinear activation functions: SELU and tanh. Our experiments demonstrate that neural networks are generally insensitive to the precision of the activation function. The results also show that the proposed combinational circuit-based approach is very efficient in terms of speed and area, with negligible accuracy loss on the MNIST, CIFAR-10, and ImageNet benchmarks. Synopsys Design Compiler synthesis results show that the circuit designs for tanh and SELU save between 3.13× and 7.69× and between 4.45× and 8.45× in area, respectively, compared to look-up-table/memory-based implementations, and can operate at 5.14 GHz and 4.52 GHz in a 28-nm SVT library. The implementation is available at: https://github.com/ThomasMrY/ActivationFunctionDemo.
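    For context, SELU(x) = λx for x > 0 and λα(e^x − 1) otherwise, with the standard constants λ ≈ 1.0507 and α ≈ 1.6733. The following is a minimal Python sketch, illustrative only and not the paper's circuit designs: a coarse piecewise-linear tanh of the kind the precision-insensitivity result permits, alongside a reference SELU. The function names, segment count, and interval bound are assumptions made for this demo.

    import numpy as np

    # Illustrative sketch only: not the paper's circuit designs. The names,
    # segment count, and interval bound below are assumptions for the demo.

    # Standard SELU constants (Klambauer et al., 2017).
    ALPHA = 1.6732632423543772
    LAMBDA = 1.0507009873554805

    def selu(x):
        # Reference SELU: lambda * x for x > 0, else lambda * alpha * (exp(x) - 1).
        return LAMBDA * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))

    def tanh_pwl(x, n_segments=8, bound=4.0):
        # Coarse piecewise-linear tanh over [-bound, bound], saturating outside.
        # A hardware version would hard-wire the segment slopes and offsets so
        # each evaluation reduces to a small multiplexer plus a multiply-add.
        xs = np.linspace(-bound, bound, n_segments + 1)
        ys = np.tanh(xs)
        return np.interp(np.clip(x, -bound, bound), xs, ys)

    x = np.linspace(-6.0, 6.0, 10001)
    max_err = np.max(np.abs(np.tanh(x) - tanh_pwl(x)))
    print(f"max |tanh - 8-segment PWL| = {max_err:.4f}")
    print(f"selu(-1.0) = {float(selu(np.array(-1.0))):.4f}")  # about -1.1113

    On this grid the worst-case deviation from true tanh stays under 0.1, which gives a sense of how coarse an approximation can be while, per the abstract's benchmark results, network accuracy remains essentially unchanged.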
    Rights
    Copyright notice: "© 2016 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works."
    Collections
    • ENG: Electrical and Computer Engineering: Scholarly Papers [385]
    • BU Open Access Articles [4833]

