# Why MLTK?
The MLTK can be thought of as a collection of Python scripts that simplify the development of embedded machine learning models using Google Tensorflow and Tensorflow-Lite Micro.
So, why use the MLTK instead of directly using Tensorflow?
## Only a single Python script and command-line are needed
To create an ML model using the MLTK, all you need is a single Python script and a few commands.
Everything needed to train, evaluate, quantize, profile, summarize, and visualize a model is defined in a single Python script called the model specification file. This file is then invoked by the various MLTK commands.
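For illustration, a model specification file generally follows the pattern sketched below. This is a minimal sketch, assuming the MltkModel / TrainMixin class layout referenced later on this page; the attribute names shown (`version`, `epochs`, `batch_size`) are assumptions, so consult the MLTK documentation for the verified interface.

```python
# my_model.py -- minimal model specification sketch.
# The mixin names and attributes below are assumptions, not the verified API.
from mltk.core import MltkModel, TrainMixin, EvaluateClassifierMixin

class MyModel(MltkModel, TrainMixin, EvaluateClassifierMixin):
    """The entire model definition lives in this one class."""

my_model = MyModel()
my_model.version = 1     # assumed attribute exposed by MltkModel
my_model.epochs = 80     # assumed training settings exposed by TrainMixin
my_model.batch_size = 32
```

The same file is then driven by the MLTK commands, e.g. `mltk train my_model`, `mltk profile my_model`, or `mltk summarize my_model`.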
## Comparison with other solutions
Other projects that use the Tensorflow / Keras API directly tend to be considerably more involved.
Consider the Tensorflow-Lite Micro Examples. Training the models for these examples requires several different Python scripts and advanced knowledge of Tensorflow. A comparison between the MLTK and TFLM examples can be seen in the following links:
| Example Name | MLTK Solution | Tensorflow-Lite Micro Solution |
|--------------|---------------|--------------------------------|
| micro_speech | tflite_micro_speech.py | micro_speech/train |
| magic_wand | tflite_micro_magic_wand.py | magic_wand/train |
Another example is the TinyML Benchmark. These examples also require several different Python scripts and advanced Tensorflow knowledge to train the models:
| Example Name | MLTK Solution | TinyML Benchmark Solution |
|--------------|---------------|---------------------------|
| anomaly_detection | anomaly_detection.py | anomaly_detection |
| image_classification | image_classification.py | image_classification |
| keyword_spotting | keyword_spotting.py | keyword_spotting |
| visual_wake_words | visual_wake_words.py | visual_wake_words |
## Lots of tools, all optional
The MLTK offers several different tools and utilities for analyzing and modifying model files. Each of these tools is optional and independent from the others.
Additionally, many of the tools support `.tflite` model files that were generated outside of the MLTK (i.e. the `.tflite` model file need not be generated by the MLTK); a sketch of this follows the table below.
| Tool Name | External .tflite Supported? | Description |
|-----------|-----------------------------|-------------|
| Model Profiler | Yes | Calculate model run-time statistics such as latency and RAM usage |
| Model Summary | Yes | Generate a text summary of the model |
| Model Visualizer | Yes | View the model in an interactive graph |
| Model Parameters | Yes | Embed custom parameters into the .tflite model file |
| Model Trainer | No | Train a model using Tensorflow |
| Model Evaluator | No | Evaluate the accuracy of a .tflite model file |
| Model Quantizer | No | Quantize a model using the Tensorflow-Lite Converter |
| Audio Visualizer | N/A | View generated spectrograms in real-time |
| Audio Classifier | No | Classify real-time audio from a microphone |
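For example, a model generated outside of the MLTK can still be summarized and profiled. The sketch below assumes `profile_model` and `summarize_model` helpers in `mltk.core`; treat the exact names and signatures as assumptions and see the individual tool guides for the verified API.

```python
# Sketch: run MLTK tools against an externally generated .tflite file.
# The helper names below are assumptions, not the verified API.
from mltk.core import profile_model, summarize_model

# Print a text summary of the model's layers.
print(summarize_model('my_model.tflite'))

# Estimate run-time statistics such as latency and RAM usage.
profiling_results = profile_model('my_model.tflite')
print(profiling_results)
```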
## C++ Python wrappers
The MLTK allows for sharing C/C++ code between the embedded device and host Python scripts.
This is useful as it allows the exact same data pre-processing algorithms to be shared between the embedded device and the model training scripts.
This ensures the embedded ML model “sees” the same data samples that were used to train the model, which should lead to more accurate predictions.
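As a purely hypothetical sketch of the pattern this enables, the snippet below generates a spectrogram on the host using the shared front end; the import path, settings fields, and method names are all assumptions, not the verified API.

```python
# Hypothetical sketch: every name below is an assumption, not the verified API.
import numpy as np
from mltk.core.preprocess.audio.audio_feature_generator import (
    AudioFeatureGenerator,
    AudioFeatureGeneratorSettings,
)

# The same settings (and the same compiled C++ code) are used on-device,
# so training and inference see identically generated spectrograms.
settings = AudioFeatureGeneratorSettings()
settings.sample_rate_hz = 16000   # assumed field names
settings.sample_length_ms = 1000

frontend = AudioFeatureGenerator(settings)
audio = np.zeros(16000, dtype=np.int16)       # 1 second of silence
spectrogram = frontend.process_sample(audio)  # assumed method name
print(spectrogram.shape)
```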
See the Audio Feature Generator for details on how this is done to create spectrograms from streaming audio.
Also see the C++ Development Guide for details on how to build wrappers.
## Embedded model parameters
The MLTK allows for embedding custom parameters into the generated `.tflite` model file.
This is useful because it enables the ML model developer to distribute training settings to the embedded device in a single file. It also ensures that the settings used to train the model stay in lock-step with the `.tflite` model file.
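As a sketch of how this might look, continuing the model specification sketch from earlier (the `model_parameters` attribute and the `TfliteModelParameters` loader below are assumptions; see the Model Parameters guide referenced next for the verified API):

```python
# In the model specification file: embed custom settings into the .tflite.
# The attribute name `model_parameters` is an assumption.
my_model.model_parameters['volume_db'] = 5.0
my_model.model_parameters['average_window_ms'] = 1000

# Later, anywhere the generated .tflite file is available:
from mltk.core import TfliteModelParameters  # assumed import location

params = TfliteModelParameters.load_from_tflite_file('my_model.tflite')
print(params['volume_db'])  # -> 5.0
```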
See the Model Parameters guide for more details on how this works.
## Integration with Tensorflow
All model layout/architecture design and training is done using stock-standard Tensorflow.
The details are contained within a single Python script and exposed via a simple API, TrainMixin.
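Continuing the specification sketch from earlier, the architecture itself is plain Keras; only the final `build_model_function` hook (an assumed TrainMixin attribute) is MLTK-specific:

```python
import tensorflow as tf

def my_model_builder(model) -> tf.keras.Model:
    # Plain Keras: nothing in this function is MLTK-specific.
    keras_model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(49, 40, 1)),  # e.g. a spectrogram input
        tf.keras.layers.Conv2D(8, (3, 3), activation='relu'),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(4, activation='softmax'),
    ])
    keras_model.compile(loss='categorical_crossentropy',
                        optimizer='adam',
                        metrics=['accuracy'])
    return keras_model

# Assumed TrainMixin hook: the MLTK invokes this when training starts.
my_model.build_model_function = my_model_builder
```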
## Integration with Tensorflow-Lite
Model quantization is automatically done at the end of training using the standard Tensorflow-Lite Converter.
The MLTK also features a Python API, TfliteModel, to access the contents of a generated `.tflite` model file.
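A small sketch, assuming TfliteModel is importable from `mltk.core` and provides a flatbuffer loader and a `layers` property (treat both as assumptions):

```python
from mltk.core import TfliteModel  # assumed import location

# Load a quantized model file (generated by the MLTK or elsewhere).
tflite_model = TfliteModel.load_flatbuffer_file('my_model.tflite')  # assumed loader

# Inspect the model's contents from Python.
for layer in tflite_model.layers:  # assumed property
    print(layer)
```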
## Integration with Tensorflow-Lite Micro
The MLTK features a C++ Python wrapper to execute `.tflite` model files in the Tensorflow-Lite Micro interpreter from a Python script.
The wrapper is accessible via the Python API, TfliteMicro.
This is useful as it allows for determining whether a given `.tflite` model is supported by TFLM and how much RAM it will consume.
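A sketch of what this enables, assuming a `TfliteMicro.profile_model()` entry point (the import path and method name are assumptions):

```python
from mltk.core.tflite_micro import TfliteMicro  # assumed import path

# Run the .tflite through the TFLM interpreter on the host to verify that
# every layer is supported and to estimate its run-time RAM requirements.
results = TfliteMicro.profile_model('my_model.tflite')  # assumed method
print(results)
```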
## Integration with the Gecko SDK
The Gecko SDK is able to parse the embedded parameters in a `.tflite` model file and generate the corresponding source code.
The MLTK example applications can also be loaded into Silicon Labs Simplicity Studio.
## Support for cloud development
MLTK model development can be done locally or in the cloud using the following features:
### Training via SSH
The MLTK features the option to train models via SSH. This can greatly reduce model training times by leveraging cloud machines.
See the Cloud Training with vast.ai tutorial for more details.
### Logging to the cloud
The MLTK allows for logging to the cloud during model development using Weights & Biases.
See the Cloud Logging with Weights & Biases tutorial for more details.
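As a sketch, cloud logging is enabled from the model specification file; the `WeightsAndBiasesMixin` name below follows the tutorial's approach but should be treated as an assumption:

```python
# Assumed mixin name: adding it to the model specification enables logging
# of training metrics to a Weights & Biases project during `mltk train`.
from mltk.core import MltkModel, TrainMixin, WeightsAndBiasesMixin

class MyModel(MltkModel, TrainMixin, WeightsAndBiasesMixin):
    """Model specification with cloud logging enabled."""

my_model = MyModel()
```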