
Two Novel Non-Uniform Quantizers with Application in Post-Training Quantization

Author

Listed:
  • Zoran Perić

    (Faculty of Electronic Engineering, University of Nis, Aleksandra Medvedeva 14, 18000 Nis, Serbia)

  • Danijela Aleksić

    (Department of Mobile Network Nis, Telekom Srbija, Vozdova 11, 18000 Nis, Serbia)

  • Jelena Nikolić

    (Faculty of Electronic Engineering, University of Nis, Aleksandra Medvedeva 14, 18000 Nis, Serbia)

  • Stefan Tomić

    (School of Engineering and Technology, Al Dar University College, Dubai P.O. Box 35529, United Arab Emirates)

Abstract

With the increasing downsizing of networks and the minimization of deployment costs for neural network (NN) models, edge computing has taken a significant place in modern artificial intelligence. To cope with the memory constraints of less capable edge systems, a plethora of quantizer models and quantization techniques have been proposed for NN compression, with the goal of fitting the quantized NN (QNN) onto the edge device while preserving a high degree of accuracy. NN compression by means of post-training quantization has attracted a great deal of research attention, in which the efficiency of uniform quantizers (UQs) has been promoted and heavily exploited. In this paper, we propose two novel non-uniform quantizers (NUQs), each of which prudently exploits one of the two defining properties of the simplest UQ. Although they share the same quantization rule for specifying the support region, both NUQs start from a different cell-width setting than a standard UQ. The first quantizer, named the simplest power-of-two quantizer (SPTQ), defines cell widths that are scaled by powers of two. As is the case in the simplest UQ design, the representation levels of SPTQ are the midpoints of the quantization cells. The second quantizer, named the modified SPTQ (MSPTQ), is a more competitive model: an enhanced version of SPTQ in which the decision thresholds are centered between the nearest representation levels, similarly to the UQ design. These properties make the novel NUQs relatively simple. Unlike in the UQ, the quantization cells of MSPTQ are not of equal width and the representation levels are not the midpoints of the quantization cells. In this paper, we describe the design procedures of SPTQ and MSPTQ and optimize both quantizers for the assumed Laplacian source. Afterwards, we perform post-training quantization by implementing SPTQ and MSPTQ, assess the resulting QNN accuracy, and show the implementation benefits over the case where a UQ with an equal number of quantization cells is used in the QNN for the same classification task. We believe that both NUQs are particularly valuable for memory-constrained environments, where simple and acceptably accurate solutions are of crucial importance.
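
To make the two constructions above concrete, the following Python sketch is an illustrative interpretation of the abstract only: the function names (sptq_levels, msptq_thresholds, quantize), the parameters x_max (edge of the support region) and n_cells (cells per half of a symmetric quantizer), and the choice to leave the outer MSPTQ cell unbounded are assumptions made here for clarity, not the exact design or Laplacian-source optimization given in the paper.

    import numpy as np

    def sptq_levels(x_max, n_cells):
        # Illustrative SPTQ (positive half of a symmetric quantizer): cell
        # widths grow as successive powers of two, with the base width scaled
        # so that the n_cells cells exactly cover [0, x_max]; representation
        # levels sit at the midpoints of the cells.
        weights = 2.0 ** np.arange(n_cells)              # 1, 2, 4, 8, ...
        base = x_max / weights.sum()
        thresholds = np.concatenate(([0.0], np.cumsum(base * weights)))
        levels = 0.5 * (thresholds[:-1] + thresholds[1:])
        return thresholds, levels

    def msptq_thresholds(levels):
        # Illustrative MSPTQ step: keep the SPTQ representation levels but
        # re-centre every inner decision threshold midway between the two
        # neighbouring levels, as in a uniform quantizer.
        inner = 0.5 * (levels[:-1] + levels[1:])
        return np.concatenate(([0.0], inner, [np.inf]))

    def quantize(x, thresholds, levels):
        # Map each sample to the representation level of the cell it falls
        # into, treating the quantizer as symmetric around zero.
        cells = np.searchsorted(thresholds, np.abs(x), side="right") - 1
        cells = np.clip(cells, 0, len(levels) - 1)
        return np.sign(x) * levels[cells]

    # Example: quantize unit-variance Laplacian samples with both quantizers.
    rng = np.random.default_rng(0)
    x = rng.laplace(scale=1.0 / np.sqrt(2.0), size=5)
    th_sptq, lv = sptq_levels(x_max=4.0, n_cells=8)
    th_msptq = msptq_thresholds(lv)
    print(quantize(x, th_sptq, lv))
    print(quantize(x, th_msptq, lv))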

Suggested Citation

  • Zoran Perić & Danijela Aleksić & Jelena Nikolić & Stefan Tomić, 2022. "Two Novel Non-Uniform Quantizers with Application in Post-Training Quantization," Mathematics, MDPI, vol. 10(19), pages 1-21, September.
  • Handle: RePEc:gam:jmathe:v:10:y:2022:i:19:p:3435-:d:921130

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/10/19/3435/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/10/19/3435/
    Download Restriction: no


