I have created a Colab notebook that performs transfer learning using MobileNetV1, converts the resulting model from h5 to tflite, and then to kmodel. The kmodel file can then be downloaded to the Maixpy Go for real-time image classification.
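The conversion chain above can be sketched in a few lines of Keras. This is a hedged illustration, not the notebook itself: the input size (128×128), width multiplier (alpha=0.25), class count, and `weights=None` (to avoid the ImageNet download here) are all placeholder choices, and the `ncc` flags shown in the comment vary between nncase versions.

```python
import tensorflow as tf

# MobileNet (v1) base with a small classifier head on top.
# weights=None keeps this sketch self-contained; for actual transfer
# learning you would use weights="imagenet" and freeze the base.
base = tf.keras.applications.MobileNet(
    input_shape=(128, 128, 3), alpha=0.25, include_top=False, weights=None)
base.trainable = False  # freeze the pretrained layers

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(2, activation="softmax"),  # e.g. 2 classes
])
model.compile(optimizer="adam", loss="categorical_crossentropy")

model.save("model.h5")  # Keras HDF5 checkpoint

# h5 -> tflite
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_bytes = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_bytes)

# tflite -> kmodel is done with Kendryte's nncase "ncc" tool, roughly:
#   ncc compile model.tflite model.kmodel -i tflite --dataset images
# (check the README of your nncase release for the exact flags)
```

The resulting `model.kmodel` is what gets copied to the board's TF card.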
Edge AI looks set to be the next big thing. In this blog post, I am going to introduce a very cheap yet powerful AI chip, the Kendryte K210. The K210 is based on the open source RISC-V instruction set. According to Wikipedia, RISC-V started in 2010, and in the last two years actual chips have been produced along with prototyping boards.
A few Chinese companies have started to build prototyping kits around this chip. One of these companies is Sipeed, which has produced boards in a few form factors; I have gotten hold of the Maixpy Go board.
The Maixpy Go has the following features:
- 2.8 inch touch LCD
- Camera
- TF card slot
- Mic
- RGB LED
- Speaker
- WiFi
- Rechargeable battery
- Powered via USB-C, which also acts as a UART for transferring code and flashing firmware
There are a few ways to program the kit: MicroPython, the Arduino IDE, or PlatformIO. For MicroPython, there is the Maixpy IDE, which is a port of the OpenMV IDE.
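To give a flavour of the MicroPython option, here is a sketch of a device-side classification loop using the MaixPy firmware's `sensor`, `lcd` and `KPU` modules. It only runs on the board itself; the model path `/sd/model.kmodel`, the label list, and the windowing size are placeholders you would adapt to your own model.

```python
import sensor, lcd, KPU as kpu

lcd.init()
sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
# Crop the frame to match the model's input size (placeholder value)
sensor.set_windowing((128, 128))
sensor.run(1)

# Load the converted kmodel from the TF card (path is a placeholder)
task = kpu.load("/sd/model.kmodel")
labels = ["class_a", "class_b"]  # hypothetical labels

while True:
    img = sensor.snapshot()
    fmap = kpu.forward(task, img)   # run inference on the KPU
    scores = fmap[:]                # per-class scores
    best = scores.index(max(scores))
    img.draw_string(4, 4, labels[best])
    lcd.display(img)
```

The same loop structure (snapshot, forward pass, draw, display) shows up in most MaixPy demos, so it is a reasonable starting point for real-time classification on the board.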
Sipeed also offers two interesting peripherals: a microphone array and a binocular camera. …
Most visual deep learning applications use an existing model and perform transfer learning to classify images or detect objects within an image. To use the Coral USB Accelerator for these tasks, the model has to be converted to a TensorFlow Lite model and then to a model understood by the Edge TPU.
There is a good article on model optimisation, but for this blog post I am using the quantization-aware approach for the transfer learning. In Google's example, you are required to install Docker and perform the transfer learning within a Docker container. In this blog post, I have moved the retraining process to Colab; only the final step of converting to the Edge TPU format is done on the local machine.
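The Edge TPU only accepts fully integer-quantized TensorFlow Lite models. The post uses quantization-aware training; as a simpler stand-in, the sketch below applies post-training full-integer quantization to a toy model, which produces the same kind of int8 `.tflite` artifact. The model architecture, input size, and sample count in the calibration generator are all illustrative choices.

```python
import numpy as np
import tensorflow as tf

# Toy model standing in for the retrained classifier
model = tf.keras.Sequential([
    tf.keras.layers.Input((32, 32, 3)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(2, activation="softmax"),
])

def representative_data():
    # A handful of sample inputs so the converter can calibrate
    # activation ranges (real data would come from the training set)
    for _ in range(8):
        yield [np.random.rand(1, 32, 32, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8
quantized = converter.convert()

with open("model_quant.tflite", "wb") as f:
    f.write(quantized)

# Final step on the local machine (Edge TPU compiler, installed separately):
#   edgetpu_compiler model_quant.tflite
```

The `edgetpu_compiler` invocation at the end is the only part that has to run locally; everything above it can run in Colab.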