Conference paper, 2022

MOGNET: A mux-residual quantized network leveraging online-generated weights

Abstract

This paper presents a compact model architecture called MOGNET, compatible with resource-limited hardware. MOGNET uses a streamlined convolutional factorization block built from two point-wise (1×1) convolutions with a group-wise convolution in between. To further limit the overall model size and reduce the required on-chip memory, the parameters of the second point-wise convolution are generated online by a Cellular Automaton structure. In addition, MOGNET enables the use of low-precision weights and activations by taking advantage of a Multiplexer mechanism with a proper Bitshift rescaling for integrating residual paths without increasing the hardware-related complexity. To train this model efficiently, we also introduce a novel weight ternarization method favoring the balance between quantized levels. Experimental results show that, given a tiny memory budget (sub-2 Mb), MOGNET can achieve higher accuracy, with a clear gap of up to 1%, at a similar or even lower model size compared to recent state-of-the-art methods.
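For illustration only, below is a minimal PyTorch sketch of the factorization block structure described in the abstract (point-wise 1×1 convolution, group-wise convolution, point-wise 1×1 convolution). The channel widths, group count, 3×3 kernel size of the group-wise convolution, activation choice, and the plain additive shortcut are all assumptions not taken from the paper; in particular, the learned second point-wise layer merely stands in for MOGNET's online Cellular Automaton weight generation, and the simple addition stands in for its Multiplexer-based residual with Bitshift rescaling.

```python
import torch
import torch.nn as nn


class FactorizationBlockSketch(nn.Module):
    """Hypothetical sketch of a MOGNET-style factorization block:
    point-wise (1x1) conv -> group-wise conv -> point-wise (1x1) conv.
    Hyperparameters below are illustrative assumptions, not the paper's."""

    def __init__(self, in_channels: int, out_channels: int, groups: int = 4):
        super().__init__()
        assert in_channels % groups == 0, "channels must be divisible by groups"
        # First point-wise (1x1) convolution.
        self.pw1 = nn.Conv2d(in_channels, in_channels, kernel_size=1, bias=False)
        # Group-wise convolution in between (3x3 kernel assumed here).
        self.gw = nn.Conv2d(in_channels, in_channels, kernel_size=3,
                            padding=1, groups=groups, bias=False)
        # Second point-wise convolution: in MOGNET its parameters are
        # generated online by a Cellular Automaton; an ordinary learned
        # layer is used here purely as a placeholder.
        self.pw2 = nn.Conv2d(in_channels, out_channels, kernel_size=1, bias=False)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.act(self.pw1(x))
        y = self.act(self.gw(y))
        y = self.pw2(y)
        # MOGNET integrates the residual path through a Multiplexer with
        # Bitshift rescaling; a plain full-precision addition is used below
        # only as a stand-in when shapes allow it.
        if x.shape == y.shape:
            y = y + x
        return self.act(y)


if __name__ == "__main__":
    block = FactorizationBlockSketch(in_channels=16, out_channels=16, groups=4)
    out = block(torch.randn(1, 16, 32, 32))
    print(out.shape)  # torch.Size([1, 16, 32, 32])
```

This sketch only mirrors the block topology; the low-precision weight/activation quantization and the balanced ternarization scheme mentioned in the abstract are not reproduced here.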
Main file
AICAS_Final_Version.pdf (1.3 MB)
Origin: Files produced by the author(s)

Dates and versions

cea-04556172, version 1 (23-04-2024)

Identifiers

Cite

Van Thien Nguyen, William Guicquero, Gilles Sicard. MOGNET: A mux-residual quantized network leveraging online-generated weights. AICAS 2022 - IEEE 4th International Conference on Artificial Intelligence Circuits and Systems, Jun 2022, Incheon, South Korea. pp.90-93, ⟨10.1109/AICAS54282.2022.9869933⟩. ⟨cea-04556172⟩