BitNetMCU: Implementing MNIST handwritten digit recognition on the CH32V003 RISC-V MCU

High-precision low-bit quantized neural networks on low-end microcontrollers

 
Overview
BitNetMCU is a project focused on the training and inference of low-bit quantized neural networks, designed to run efficiently on low-end microcontrollers such as the CH32V003. Quantization-aware training (QAT) and fine-tuning of the model structure and inference code achieve over 99% test accuracy on a 16x16 MNIST dataset without using multiplication instructions, in only 2 KB of RAM and 16 KB of flash. The training pipeline is based on PyTorch and should run anywhere. The inference engine is implemented in ANSI C and can be easily ported to any microcontroller.
[Reference design images]
Updated: 2025-06-03 15:24:23



Copyright © 2005-2024 EEWORLD.com.cn, Inc. All rights reserved 京ICP证060456号 京ICP备10001474号-1 电信业务审批[2006]字第258号函 京公网安备 11010802033920号