Concrete ML supports serializing all of its built-in models. Using this feature, one can dump a fitted and compiled model into a JSON string or file, and the estimator can later be loaded back from that JSON representation.
All built-in models provide the following methods:
- `dumps`: dumps the model as a string.
- `dump`: dumps the model into a file.
For example, a logistic regression model can be dumped to a string as shown below.
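A minimal sketch of this is given below; the dataset, training setup, and variable names are illustrative assumptions rather than a reference example.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

from concrete.ml.sklearn import LogisticRegression

# Illustrative data set and train/test split
X, y = make_classification(n_samples=100, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Fit and compile the built-in logistic regression model
model = LogisticRegression()
model.fit(X_train, y_train)
model.compile(X_train)

# Dump the fitted and compiled model into a JSON string
dumped_model_str = model.dumps()
```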
Similarly, it can be dumped into a file.
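Continuing the sketch above, the same model can be written to a JSON file with `dump`:

```python
# Dump the fitted and compiled model into a JSON file
with open("logistic_regression.json", "w") as f:
    model.dump(f)
```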
Alternatively, Concrete ML provides two equivalent global functions.
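A sketch using these global functions follows; the import path is an assumption about the library layout and may differ between Concrete ML versions.

```python
# Assumed module path; check the API reference of your Concrete ML version
from concrete.ml.common.serialization.dumpers import dump, dumps

# Equivalent to model.dumps()
dumped_model_str = dumps(model)

# Equivalent to model.dump(f)
with open("logistic_regression.json", "w") as f:
    dump(model, f)
```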
Some parameters used for instantiating Quantized Neural Network models are not supported for serialization. In particular, one cannot serialize a model that was instantiated using callable objects for the `train_split` and `predict_nonlinearity` parameters, or with `callbacks` enabled.
Loading a built-in model is possible through the following functions:
- `loads`: loads the model from a string.
- `load`: loads the model from a file.
A loaded model must be compiled again before inference can be executed in FHE or with simulation, because the underlying FHE circuit is currently not serialized. This is not required when FHE mode is disabled.
The above logistic regression model can therefore be loaded as below.
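A sketch of loading and re-compiling the model is shown below; as above, the import path and the `fhe="execute"` prediction argument are assumptions to be checked against the installed version.

```python
# Assumed module path; check the API reference of your Concrete ML version
from concrete.ml.common.serialization.loaders import load, loads

# Load the model back from the JSON string...
loaded_model = loads(dumped_model_str)

# ...or from the JSON file
with open("logistic_regression.json", "r") as f:
    loaded_model = load(f)

# Re-compile before running encrypted inference, since the FHE circuit is not serialized
loaded_model.compile(X_train)
y_pred_fhe = loaded_model.predict(X_test, fhe="execute")
```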