mltk.core.tflite_micro.TfliteMicroModel

class TfliteMicroModel[source]

This class wraps the TF-Lite Micro interpreter loaded with a .tflite model

Properties

accelerator

Reference to the hardware accelerator used by the model

details

Return details about the loaded model

input_size

Number of input tensors

is_profiler_enabled

Return whether the profiler is enabled

is_recorder_enabled

Return whether the model recorder is enabled

is_tensor_recorder_enabled

Return whether the tensor recorder is enabled

layer_errors

List of error messages triggered by kernels while loading the model.

output_size

Number of output tensors

Methods

__init__

get_layer_error

Return the TfliteMicroLayerError at the given layer index if found, else return None

get_profiling_results

Return the profiling results of each model layer

get_recorded_data

Return the recorded contents of the model

input

Return a reference to a model input tensor's data. If the value argument is provided, the value is copied into the input tensor's buffer

invoke

Invoke the model to execute one inference

output

Return a reference to a model output tensor's data

set_layer_callback

__init__(tflm_wrapper, tflm_accelerator, flatbuffer_data, enable_profiler=False, enable_recorder=False, enable_tensor_recorder=False, force_buffer_overlap=False, runtime_buffer_sizes=None)[source]
Parameters:
  • tflm_accelerator (TfliteMicroAccelerator) –

  • flatbuffer_data (bytes) –

  • enable_profiler (bool) –

  • enable_recorder (bool) –

  • enable_tensor_recorder (bool) –

  • force_buffer_overlap (bool) –

  • runtime_buffer_sizes (List[int]) –
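
Direct construction requires the internal tflm_wrapper module, so a TfliteMicroModel is normally created through higher-level MLTK helpers rather than instantiated by hand. The sketch below only illustrates, with a stand-in class, how the enable_* flags are expected to surface through the corresponding is_*_enabled properties; the class body and the placeholder flatbuffer bytes are illustrative, not the real implementation.

```python
class FakeModel:
    """Stand-in mirroring the enable_* flags of TfliteMicroModel.__init__ (illustrative only)."""

    def __init__(self, flatbuffer_data, enable_profiler=False,
                 enable_recorder=False, enable_tensor_recorder=False):
        self._flatbuffer_data = flatbuffer_data
        self._profiler_enabled = enable_profiler
        self._recorder_enabled = enable_recorder
        self._tensor_recorder_enabled = enable_tensor_recorder

    @property
    def is_profiler_enabled(self):
        return self._profiler_enabled

    @property
    def is_recorder_enabled(self):
        return self._recorder_enabled

    @property
    def is_tensor_recorder_enabled(self):
        return self._tensor_recorder_enabled


flatbuffer = b'\x00' * 16  # placeholder for real .tflite file contents
model = FakeModel(flatbuffer, enable_profiler=True)
print(model.is_profiler_enabled)   # True
print(model.is_recorder_enabled)   # False
```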

property accelerator: TfliteMicroAccelerator

Reference to the hardware accelerator used by the model

Return type:

TfliteMicroAccelerator

property layer_errors: List[TfliteMicroLayerError]

List of error messages triggered by kernels while loading the model. Typically, these errors indicate that a given model layer is not supported by a hardware accelerator and had to fallback to a default kernel implementation.

Return type:

List[TfliteMicroLayerError]

property details: TfliteMicroModelDetails

Return details about the loaded model

Return type:

TfliteMicroModelDetails

property input_size: int

Number of input tensors

Return type:

int

input(index=0, value=None)[source]

Return a reference to a model input tensor's data. If the value argument is provided, the value is copied into the input tensor's buffer

Return type:

ndarray

Parameters:

value (ndarray) –

property output_size: int

Number of output tensors

Return type:

int

output(index=0)[source]

Return a reference to a model output tensor’s data

Return type:

ndarray

invoke()[source]

Invoke the model to execute one inference
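
Taken together, input(), invoke() and output() form the basic inference loop: copy data into the input tensor, run one inference, read the output tensor. The sketch below mimics that call pattern with a stand-in class (the real TfliteMicroModel requires a compiled wrapper module); the method names and order match the reference above, while the doubling "model" itself is purely illustrative.

```python
import numpy as np


class FakeTfliteMicroModel:
    """Stand-in for TfliteMicroModel whose 'inference' doubles its input (illustrative only)."""

    def __init__(self):
        self._input = np.zeros(4, dtype=np.int8)
        self._output = np.zeros(4, dtype=np.int8)

    def input(self, index=0, value=None):
        # Return a reference to the input tensor's data.
        # If `value` is given, copy it into the tensor's buffer first.
        if value is not None:
            np.copyto(self._input, value)
        return self._input

    def invoke(self):
        # Execute one inference (here: trivial element-wise doubling)
        self._output[:] = self._input * 2

    def output(self, index=0):
        # Return a reference to the output tensor's data
        return self._output


model = FakeTfliteMicroModel()
model.input(value=np.array([1, 2, 3, 4], dtype=np.int8))
model.invoke()
result = model.output()
print(result.tolist())  # [2, 4, 6, 8]
```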

property is_profiler_enabled: bool

Return whether the profiler is enabled

Return type:

bool

get_profiling_results()[source]

Return the profiling results of each model layer

Return type:

List[TfliteMicroProfiledLayerResult]

Returns:

A list where each entry contains the profiling results of the associated model layer
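
Since get_profiling_results() returns one entry per layer, a common use is aggregating per-layer metrics into model-level totals. The sketch below uses a stand-in dataclass in place of TfliteMicroProfiledLayerResult; the field names (name, macs, cpu_cycles) and the numbers are assumptions for illustration, not the actual result schema.

```python
from dataclasses import dataclass


@dataclass
class LayerResult:
    """Stand-in for TfliteMicroProfiledLayerResult; field names are illustrative."""
    name: str
    macs: int
    cpu_cycles: int


# One entry per model layer, as get_profiling_results() would return
results = [
    LayerResult('conv2d', macs=1_228_800, cpu_cycles=4_915_200),
    LayerResult('fully_connected', macs=10_240, cpu_cycles=40_960),
]

# Aggregate per-layer metrics into a model-level summary
total_macs = sum(r.macs for r in results)
for r in results:
    print(f'{r.name}: {r.macs / total_macs:.1%} of total MACs')
```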

property is_recorder_enabled: bool

Return whether the model recorder is enabled

Return type:

bool

property is_tensor_recorder_enabled: bool

Return whether the tensor recorder is enabled

Return type:

bool

get_recorded_data()[source]

Return the recorded contents of the model

Return type:

Dict

Returns:

A dictionary whose entries contain the input/output tensors of each model layer
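
One way to use per-layer recorded tensors is to verify that each layer's outputs feed the next layer's inputs. The dictionary layout below (a 'layers' key holding per-layer input/output lists) is an assumed shape for illustration; the actual keys returned by get_recorded_data() may differ.

```python
# Hypothetical shape of get_recorded_data(); keys are illustrative
recorded = {
    'layers': [
        {'index': 0, 'inputs': [[1, 2]], 'outputs': [[2, 4]]},
        {'index': 1, 'inputs': [[2, 4]], 'outputs': [[0, 1]]},
    ],
}

# Check that consecutive layers are consistently chained:
# layer N's outputs should match layer N+1's inputs
for prev, nxt in zip(recorded['layers'], recorded['layers'][1:]):
    assert prev['outputs'] == nxt['inputs']

for layer in recorded['layers']:
    print(f"layer {layer['index']}: "
          f"{len(layer['inputs'])} input(s), {len(layer['outputs'])} output(s)")
```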

get_layer_error(index)[source]

Return the TfliteMicroLayerError at the given layer index if found, else return None

Return type:

TfliteMicroLayerError or None

Parameters:

index (int) –
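
Because get_layer_error() returns None when a layer loaded cleanly, callers can probe individual layers for accelerator fallbacks. The sketch below reproduces that lookup contract with a stand-in error class and a plain dict; the error fields and messages are illustrative assumptions.

```python
class FakeLayerError:
    """Stand-in for TfliteMicroLayerError (illustrative fields)."""

    def __init__(self, index, msg):
        self.index = index
        self.msg = msg


# Suppose only layer 2 fell back to a default kernel implementation
_layer_errors = {
    2: FakeLayerError(2, 'op not supported by accelerator; fell back to CPU kernel'),
}


def get_layer_error(index):
    # Return the error at the given layer index if found, else None
    return _layer_errors.get(index)


for i in range(4):
    err = get_layer_error(i)
    status = err.msg if err is not None else 'ok'
    print(f'layer {i}: {status}')
```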