
Post-training Quantization for Edge TPU

On the TensorFlow website there is quite a bit of explanation of post-training quantization, but not much on transfer learning. The sample on the Coral website uses TensorFlow 1.x and requires the transfer learning to be executed inside a Docker container. In this blog post, I am going to demonstrate how to perform post-training quantization using TensorFlow 2.0 for MobileNet V1 and V2. All the steps can be performed in a Colab notebook (thus making use of the free GPU from Google. Thank you, Google!!!). The steps are almost the same for both versions; only the base model changes. The TFLite model is then converted to an Edge TPU TFLite model, which can be used for real-time inferencing. For both models, I am using the flower dataset to perform the transfer learning. Readers can use this as a base for other classes of classification. In a future blog post, I may try more advanced models such as Inception, ResNet, etc. A lot depends on the Edge TPU compiler because t
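A minimal sketch of the conversion step described above, using the TensorFlow 2.x `tf.lite.TFLiteConverter` API for full-integer post-training quantization (the form the Edge TPU compiler requires). The random calibration images are a placeholder; in the actual workflow, real preprocessed flower images should be fed to the representative dataset.

```python
import numpy as np

IMG_SIZE = 224  # default input size for MobileNet V1/V2

def representative_dataset():
    """Yield ~100 sample inputs so the converter can calibrate
    activation ranges for full-integer quantization."""
    for _ in range(100):
        # Placeholder calibration data; use real preprocessed
        # flower images from the training set in practice.
        image = np.random.rand(1, IMG_SIZE, IMG_SIZE, 3).astype(np.float32)
        yield [image]

def convert_for_edgetpu(model, out_path="model_quant.tflite"):
    """Convert a trained Keras model to a fully int8-quantized
    TFLite model suitable for the Edge TPU compiler."""
    import tensorflow as tf  # requires TensorFlow 2.x

    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_dataset
    # The Edge TPU only runs models where every op is 8-bit integer.
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.uint8
    converter.inference_output_type = tf.uint8

    tflite_model = converter.convert()
    with open(out_path, "wb") as f:
        f.write(tflite_model)
    return out_path
```

The resulting `.tflite` file is then compiled for the accelerator with `edgetpu_compiler model_quant.tflite`.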
Recent posts

SiPeed AI at the Edge

Edge AI looks set to be the next big thing. In this blog post, I am going to introduce a very cheap and yet powerful AI chip based on the Kendryte K210. The K210 is based on the open-source RISC-V instruction set. According to Wikipedia, RISC-V started in 2010, and in the last two years actual chips have been produced with prototyping boards. A few Chinese companies have started to build prototyping kits around this chip. One of these companies is SiPeed. They have produced a few form factors of prototyping board, and I have gotten hold of the Maixpy GO board. The Maixpy Go has the following features: 2.8 inch touch LCD, camera, TF card slot, mic, RGB LED, speaker, WIFI, rechargeable battery, powered via USB-C. The USB-C port also acts as a UART for transferring code and flashing firmware. There are a few ways to program the kit: MicroPython, Arduino IDE, or PlatformIO. For MicroPython, there is the Maixpy IDE, which is a port of the OpenMV IDE. Sipeed also has 2 interesting peripherals, m

Transfer Learning (Image Classification) in Colab for Edge TPU

Most visual deep learning applications use an existing model and perform transfer learning to classify images or detect objects within them. To use the Coral USB Accelerator for these operations, the model has to be converted to a TensorFlow Lite model and then to a model understood by the Edge TPU. There is a good article on model optimisation, but for this blog post I am using the quantization-aware approach for the transfer learning. In Google's example, it is required to install Docker and perform the transfer learning within the container. In this blog post, I have moved the retraining process to Colab, and only the final step of converting to the Edge TPU format is done on the local machine.
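The transfer-learning setup described above can be sketched as follows: take a pretrained base, freeze it, and attach a small classification head. This is a minimal Keras illustration, not the exact notebook code; the five-class head matches the flower dataset used in these posts.

```python
import tensorflow as tf  # TensorFlow 2.x

NUM_CLASSES = 5   # the flower dataset has 5 classes
IMG_SIZE = 224

def build_transfer_model(weights="imagenet"):
    """Build a MobileNetV2-based classifier with a frozen
    feature extractor and a trainable softmax head."""
    base = tf.keras.applications.MobileNetV2(
        input_shape=(IMG_SIZE, IMG_SIZE, 3),
        include_top=False,
        weights=weights)
    base.trainable = False  # freeze pretrained weights

    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

After fitting this model on the flower images in Colab, only the converted TFLite file needs to be brought to the local machine for the Edge TPU compilation step.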

Coral USB Accelerator first impression.

Google has recently released its edge AI devices, notably the Coral Dev Board and the USB Accelerator, and I managed to buy the USB Accelerator from Seeed Studio before it was sold out. I find it a bit ironic that China is listed as one of the countries with export restrictions, yet my Accelerator was shipped out of Shenzhen; luckily I do not stay in one of the listed countries. This is a really tiny device. It has a silver heat sink wrapped in rubbery plastic and comes with a USB 3.0 USB-C interface. I can hardly imagine that this Edge TPU consumes only about 2.5 W yet packs enough computational power to perform object detection and image classification in real time. I decided to do my initial test on a Linux desktop with USB 3 support instead of a Raspberry Pi, in order to make sure that the USB port is not the bottleneck. Installation was brisk; I followed the Get Started guide and managed to get the demo app up

Xiaomi Mi Flora Chinese and International version comparison

The Xiaomi Miflora sensor allows monitoring of the plant's surrounding environment. It can monitor temperature, soil moisture, conductivity (acidity of the soil), and ambient light. There are 2 versions of the sensor: international and Chinese. Realistically, I cannot see the difference between the 2 versions. The left-hand side is the Chinese version and the right is the international version. Other than the packaging, the main difference is the price of the sensor: the international version costs 2x more. In other forums, there are discussions that the Chinese version cannot connect to the MiHome app. I have set my MiHome app to connect to the China server and it is able to register the sensor correctly. For my usage, I will be connecting to Openhab to monitor the plants, using Thomas Dietrich's MiFlora MQTT daemon as the bridge between Openhab and the sensor. The international version is detected as Flower care and the firmware version is 3.1.9 while the Chines
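For readers curious what the daemon actually reads from the sensor, here is a small sketch of decoding the Mi Flora real-time data payload. The byte layout follows the open-source `miflora` project that the MQTT daemon builds on; treat the offsets as an assumption, not an official specification.

```python
import struct

def parse_miflora(payload: bytes) -> dict:
    """Decode the real-time data characteristic of the Mi Flora sensor.

    Assumed layout (little-endian, per the `miflora` project):
      bytes 0-1  temperature, int16, units of 0.1 degC
      byte  2    padding / unknown
      bytes 3-6  light, uint32, lux
      byte  7    soil moisture, uint8, percent
      bytes 8-9  conductivity, uint16, uS/cm
    """
    temp, light, moisture, conductivity = struct.unpack_from(
        "<hxIBH", payload, 0)
    return {
        "temperature_c": temp / 10.0,
        "light_lux": light,
        "moisture_pct": moisture,
        "conductivity_us_cm": conductivity,
    }
```

The daemon publishes these decoded values over MQTT, where Openhab picks them up as plain item updates.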

Smart Home Project (Updates)

The project has reached an operational stage with the following milestones. The original posts can be found at Starting a smart home project and Smart Home Design V1. The smart home is controlled by Openhab2 running on a Raspberry Pi. After some initial instability, the system is finally able to function smoothly. The main control panel is Habpanel, and a sample of the main dashboard is as shown. Habpanel is one of the official UIs supported by Openhab. It comes with some pre-defined widgets such as buttons, frames, etc. For the pictures above, some of the widgets are custom widgets created by third parties, such as the column of sliding switches, the presence-detection panel, and the fan-control panel. Users can easily extend Habpanel widgets because the widgets are written in AngularJS. There are 5 main types of communication protocols: 1. WIFI This is the main backbone used by all the devices. 2. Zigbee This is mainly used by Xiaomi sensors. 3. RF 433 MHz This is used by the