This document explains how to serialize built-in models in Concrete ML.
Serialization allows you to dump a fitted and compiled model into a JSON string or file. You can then load the estimator back using the JSON object.
All built-in models provide the following methods:
dumps
: Dumps the model as a string.
dump
: Dumps the model into a file.
For example, a logistic regression model can be dumped to a string as follows:
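The sketch below uses an illustrative data set built with scikit-learn's make_classification; the variable names and training setup are hypothetical, only the dumps call is the documented method.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

from concrete.ml.sklearn import LogisticRegression

# Illustrative data set and train/test split
X, y = make_classification(n_samples=100, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Fit the model and compile it on a representative input set
model = LogisticRegression()
model.fit(X_train, y_train)
model.compile(X_train)

# Dump the fitted and compiled model as a JSON string
dumped_model_str = model.dumps()
```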
Similarly, it can be dumped into a file:
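Continuing the sketch above (the file name is illustrative):

```python
from pathlib import Path

# Illustrative path for the serialized model
dumped_model_path = Path("logistic_regression_model.json")

# Dump the fitted and compiled model into the file
with dumped_model_path.open("w") as f:
    model.dump(f)
```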
Alternatively, Concrete ML provides two equivalent global functions:
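A sketch of their usage, assuming the global dump and dumps functions are imported from concrete.ml.common.serialization.dumpers (the import path is an assumption):

```python
from concrete.ml.common.serialization.dumpers import dump, dumps

# Equivalent to model.dumps()
dumped_model_str = dumps(model)

# Equivalent to model.dump(f)
with dumped_model_path.open("w") as f:
    dump(model, f)
```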
Some parameters used for instantiating Quantized Neural Network models are not supported for serialization. In particular, you cannot serialize a model that was instantiated using callable objects for the train_split and predict_nonlinearity parameters or with callbacks being enabled.
You can load a built-in model using the following functions:
loads
: Loads the model from a string.
load
: Loads the model from a file.
A loaded model must be compiled again in order to run inference in FHE or with simulation, because the underlying FHE circuit is currently not serialized. This recompilation is not required when FHE mode is disabled.
The same logistic regression model can be loaded as follows:
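The sketch below assumes the loading functions are exposed under concrete.ml.common.serialization.loaders (the import path is an assumption) and reuses the dumped_model_str, dumped_model_path and X_train objects from the earlier examples:

```python
from concrete.ml.common.serialization.loaders import load, loads

# Load the model from the JSON string
loaded_model = loads(dumped_model_str)

# Alternatively, load it from the file
with dumped_model_path.open("r") as f:
    loaded_model = load(f)

# Re-compile the model before running FHE or simulated inference,
# since the underlying FHE circuit is not serialized
loaded_model.compile(X_train)
```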