Model Visualizer API Examples

This notebook demonstrates how to use the view_model API.

Refer to the Model Visualizer guide for more details.

NOTES:

  • Refer to the Notebook Examples Guide for how to run this example locally in VS Code

  • These APIs will not work on a remote server, as a local Python HTTP server is used to serve the interactive webpage

  • Alternatively, drag & drop your model into http://netron.app
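The second note above is worth unpacking: view_model serves the interactive page from a Python HTTP server bound to localhost, so the browser must run on the same machine as the server. The sketch below illustrates that mechanism with the standard library only; it is not the MLTK's actual server code, and the `serve_directory` helper name is ours.

```python
import http.server
import os
import socketserver
import tempfile
import threading
import urllib.request

def serve_directory(directory, port=0):
    """Serve `directory` over HTTP on localhost; return (server, bound_port).

    port=0 lets the OS pick a free port.
    """
    def handler(*args, **kwargs):
        return http.server.SimpleHTTPRequestHandler(
            *args, directory=directory, **kwargs)

    server = socketserver.TCPServer(('127.0.0.1', port), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, server.server_address[1]

# Serve a one-file directory and fetch the page back, as a local browser would
tmp_dir = tempfile.mkdtemp()
with open(os.path.join(tmp_dir, 'index.html'), 'w') as f:
    f.write('<h1>model graph goes here</h1>')

server, bound_port = serve_directory(tmp_dir)
page = urllib.request.urlopen(f'http://127.0.0.1:{bound_port}/index.html').read()
server.shutdown()
```

Because the server listens on 127.0.0.1, a browser on a different machine cannot reach it, which is why these examples must run locally.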

Install MLTK Python Package

# Install the MLTK Python package (if necessary)
!pip install --upgrade silabs-mltk

Import Python Packages

# Import the necessary MLTK APIs
from mltk.core import view_model

Example 1: View Keras model

In this example, we view the trained .h5 model file in the image_example1 model’s model archive.

NOTE: The model graph will appear in your web browser.

view_model('image_example1')
Serving 'E:/reed/mltk/models/image_example1/extracted_archive/image_example1.h5' at http://localhost:8080
Stopping http://localhost:8080

Example 2: View TensorFlow-Lite model

In this example, we view the trained .tflite model file in the image_example1 model’s model archive.

NOTE: The model graph will appear in your web browser.

view_model('image_example1', tflite=True)
Serving 'E:/reed/mltk/models/image_example1/extracted_archive/image_example1.tflite' at http://localhost:8080
Stopping http://localhost:8080

Example 3: View external TensorFlow-Lite model

The given model need not be generated by the MLTK; the view_model API also supports external models.

NOTE: The model graph will appear in your web browser.

import os
import tempfile
import urllib.request
import shutil

# Use the .tflite model found here:
# https://github.com/mlcommons/tiny/tree/master/benchmark/training/keyword_spotting/trained_models
# NOTE: Update this URL to point to your model if necessary
TFLITE_MODEL_URL = 'https://github.com/mlcommons/tiny/raw/master/benchmark/training/keyword_spotting/trained_models/kws_ref_model.tflite'

# Download the .tflite file and save to the temp dir
external_tflite_path = os.path.normpath(f'{tempfile.gettempdir()}/kws_ref_model.tflite')
with open(external_tflite_path, 'wb') as dst:
    with urllib.request.urlopen(TFLITE_MODEL_URL) as src:
        shutil.copyfileobj(src, dst)
view_model(external_tflite_path)
Serving 'E:/kws_ref_model.tflite' at http://localhost:8080
Stopping http://localhost:8080
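When downloading a model like this, it can help to sanity-check that the file really is a TensorFlow-Lite flatbuffer before handing it to view_model, since a failed download often yields an HTML error page instead. A flatbuffer file carries a 4-byte file identifier at byte offset 4, which is b'TFL3' for TensorFlow-Lite models. A minimal check (the `looks_like_tflite` helper name is ours):

```python
import os
import tempfile

def looks_like_tflite(path):
    """Return True if `path` carries the TFLite flatbuffer file identifier."""
    with open(path, 'rb') as f:
        header = f.read(8)
    # Flatbuffer layout: a 4-byte root-table offset, then a 4-byte file
    # identifier; TensorFlow-Lite models use b'TFL3'
    return header[4:8] == b'TFL3'

# Demo with synthetic files (a real check would run on the downloaded model)
tmp = tempfile.mkdtemp()
good = os.path.join(tmp, 'good.bin')
bad = os.path.join(tmp, 'bad.bin')
with open(good, 'wb') as f:
    f.write(b'\x1c\x00\x00\x00TFL3' + b'\x00' * 16)
with open(bad, 'wb') as f:
    f.write(b'<html><body>404 Not Found</body></html>')
print(looks_like_tflite(good), looks_like_tflite(bad))  # True False
```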

Example 4: View model before training

Training a model can be very time-consuming, and it is useful to view a model before investing time and energy into training it.
For this reason, the MLTK view_model API features a build argument, which builds the model so it can be viewed before being fully trained.

In this example, the image_example1 model is built at API execution time and the resulting .tflite file is opened in the viewer.
Note that only the model specification script is required; the model does not need to be trained first.

NOTE: The model graph will appear in your web-browser.

view_model('image_example1', tflite=True, build=True)
Selecting GPU : NVIDIA GeForce RTX 2060 (id=0)
Enabling test mode
training is using 1 subprocesses
validation is using 1 subprocesses
Model: "image_example1"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 conv2d (Conv2D)             (None, 48, 48, 24)        240       
                                                                 
 average_pooling2d (AverageP  (None, 24, 24, 24)       0         
 ooling2D)                                                       
                                                                 
 conv2d_1 (Conv2D)           (None, 11, 11, 16)        3472      
                                                                 
 conv2d_2 (Conv2D)           (None, 9, 9, 24)          3480      
                                                                 
 batch_normalization (BatchN  (None, 9, 9, 24)         96        
 ormalization)                                                   
                                                                 
 activation (Activation)     (None, 9, 9, 24)          0         
                                                                 
 average_pooling2d_1 (Averag  (None, 4, 4, 24)         0         
 ePooling2D)                                                     
                                                                 
 flatten (Flatten)           (None, 384)               0         
                                                                 
 dense (Dense)               (None, 3)                 1155      
                                                                 
 activation_1 (Activation)   (None, 3)                 0         
                                                                 
=================================================================
Total params: 8,443
Trainable params: 8,395
Non-trainable params: 48
_________________________________________________________________

Total MACs: 1.197 M
Total OPs: 2.528 M
Name: image_example1
Version: 1
Description: Image classifier example for detecting Rock/Paper/Scissors hand gestures in images
Classes: rock, paper, scissor
Training dataset: Found 9 samples belonging to 3 classes:
      rock = 3
     paper = 3
   scissor = 3
Validation dataset: Found 9 samples belonging to 3 classes:
      rock = 3
     paper = 3
   scissor = 3
Running cmd: c:\Users\reed\workspace\silabs\mltk\.venv\Scripts\python.exe -m pip install -U tensorflow-addons
(This may take awhile, please be patient ...)
Requirement already satisfied: tensorflow-addons in c:\users\reed\workspace\silabs\mltk\.venv\lib\site-packages (0.18.0)
Requirement already satisfied: packaging in c:\users\reed\workspace\silabs\mltk\.venv\lib\site-packages (from tensorflow-addons) (21.3)
Requirement already satisfied: typeguard>=2.7 in c:\users\reed\workspace\silabs\mltk\.venv\lib\site-packages (from tensorflow-addons) (2.13.3)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in c:\users\reed\workspace\silabs\mltk\.venv\lib\site-packages (from packaging->tensorflow-addons) (3.0.9)
WARNING: You are using pip version 21.2.3; however, version 22.2.2 is available.
You should consider upgrading via the 'c:\Users\reed\workspace\silabs\mltk\.venv\Scripts\python.exe -m pip install --upgrade pip' command.

Forcing epochs=3 since test=true
Class weights:
   rock = 1.00
  paper = 1.00
scissor = 1.00
Starting model training ...
Epoch 1/3
Epoch 2/3
Epoch 3/3
Generating C:/Users/reed/.mltk/models/image_example1-test/image_example1.test.h5


*** Best training val_accuracy = 0.333


Training complete
Training logs here: C:/Users/reed/.mltk/models/image_example1-test
validation is using 1 subprocesses
Generating E:/reed/mltk/tmp_models/image_example1.tflite
WARNING:absl:Found untraced functions such as _jit_compiled_convolution_op, _jit_compiled_convolution_op, _jit_compiled_convolution_op while saving (showing 3 of 3). These functions will not be directly callable after loading.
INFO:tensorflow:Assets written to: E:\tmpsa1z1ouz\assets
Using Tensorflow-Lite Micro version: b13b48c (2022-06-08)
Searching for optimal runtime memory size ...
Determined optimal runtime memory size to be 72320
c:\Users\reed\workspace\silabs\mltk\.venv\lib\site-packages\tensorflow\lite\python\convert.py:766: UserWarning: Statistics for quantized inputs were expected, but not specified; continuing anyway.
  warnings.warn("Statistics for quantized inputs were expected, but not "
Serving 'E:/reed/mltk/tmp_models/image_example1.tflite' at http://localhost:8080
Stopping http://localhost:8080
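The parameter and MAC totals in the layer summary above can be reproduced by hand, which is a useful cross-check when sizing a model for an embedded target. A sketch of the arithmetic, with channel counts and output shapes read off the summary (the 3x3 kernel sizes are inferred from the parameter counts, not stated in the summary itself):

```python
def conv2d_params(kernel_h, kernel_w, in_ch, out_ch):
    # Weights plus one bias per output channel
    return kernel_h * kernel_w * in_ch * out_ch + out_ch

def dense_params(in_features, out_features):
    return in_features * out_features + out_features

params = {
    'conv2d':              conv2d_params(3, 3, 1, 24),   # 240
    'conv2d_1':            conv2d_params(3, 3, 24, 16),  # 3472
    'conv2d_2':            conv2d_params(3, 3, 16, 24),  # 3480
    'batch_normalization': 4 * 24,                       # gamma/beta/mean/var
    'dense':               dense_params(384, 3),         # 1155
}
print(sum(params.values()))  # 8443, matching "Total params" above

# MACs = (output spatial positions) x (multiplies per output value);
# pooling and activation layers contribute no MACs
macs = (
    48 * 48 * 24 * (3 * 3 * 1) +   # conv2d
    11 * 11 * 16 * (3 * 3 * 24) +  # conv2d_1
    9 * 9 * 24 * (3 * 3 * 16) +    # conv2d_2
    384 * 3                        # dense
)
print(f'{macs / 1e6:.3f} M')  # 1.197 M, matching "Total MACs" above
```

The 48 non-trainable parameters in the summary are the batch-normalization moving mean and variance (2 x 24 channels), which are updated during training but not by gradient descent.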