Advanced Features

Concrete-ML offers some features for advanced users who wish to adjust the cryptographic parameters generated by the Concrete stack for a given machine learning model.

Approximate computations using the p_error parameter

Concrete-ML makes use of table lookups (TLUs) to represent any non-linear operation (e.g., sigmoid). A TLU is implemented through the Programmable Bootstrapping (PBS) operation, which applies a non-linear operation in the cryptographic realm.
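To give an intuition of what a TLU does, outside of any cryptography: a non-linear function restricted to a small integer domain can be pre-computed as a table and then applied with a single index lookup. The sketch below is a plain NumPy analogy, not Concrete-ML internals:

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Pre-compute the non-linear function over all possible 3-bit inputs
inputs = np.arange(8)
table = sigmoid(inputs - 4)

# Applying the non-linearity now amounts to a simple lookup by index
x = 6
result = table[x]  # same value as sigmoid(6 - 4), without re-computing it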

In Concrete-ML, the result of the TLU operation is obtained with a specific error probability:

DEFAULT_P_ERROR_PBS = 6.3342483999973e-05

A single PBS operation thus has a 1 - DEFAULT_P_ERROR_PBS = 99.9936657516% chance of being correct. This probability is taken into account when generating the cryptographic parameters: the lower the p_error, the more constraining the parameters become. This has an impact on both key generation and, more importantly, on FHE execution time.

This probability is set to a relatively low default so that any user can build deep circuits without being impacted by this noise, as described in the concepts section. However, there are use cases and specific circuits where the Gaussian noise can increase without significantly degrading circuit accuracy. In that case, increasing the p_error can be relevant, as it reduces the execution time in FHE.
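As a back-of-the-envelope illustration, and under the simplifying assumption that PBS errors are independent, the probability that every TLU in a circuit is correct is (1 - p_error)^N for a circuit with N PBS operations. The short computation below (plain Python, for illustration only) shows how the default and a relaxed p_error diverge as circuits get deeper:

DEFAULT_P_ERROR_PBS = 6.3342483999973e-05

for n_pbs in (1, 100, 10000):
    p_all_correct_default = (1 - DEFAULT_P_ERROR_PBS) ** n_pbs
    p_all_correct_relaxed = (1 - 0.1) ** n_pbs
    print(n_pbs, p_all_correct_default, p_all_correct_relaxed)

For shallow circuits with few PBS operations, a larger p_error still leaves the overall correctness probability high, which is exactly the situation where relaxing it pays off.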

Here is a visualization of the effect of p_error on a simple linear regression, comparing p_error = 0.1 with the default p_error value:

The execution time for the two models is 336 ms per example with the default p_error and 253 ms per example with p_error = 0.1 (on an 8-core Intel CPU machine). Naturally, this speedup depends heavily on model complexity. To obtain a speedup while maintaining good accuracy, it is possible to search for a good value of p_error, as sketched after the example below. Currently, no heuristic has been proposed to find a good value a priori.

Users can change p_error as they see fit by passing it as an argument to the compile function of any model. Here is an example:

from sklearn.datasets import make_classification
from concrete.ml.sklearn import XGBClassifier

# Toy training data; any inputs used for fitting work here
X_train, y_train = make_classification(n_samples=100, random_state=42)

clf = XGBClassifier()
clf.fit(X_train, y_train)

# Here comes the p_error parameter
clf.compile(X_train, p_error=0.1)
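To search for a good p_error value, as mentioned above, one simple option is to sweep a few candidates, re-compile the model for each one, and compare accuracy and latency against the default. The sketch below assumes an already fitted model clf as in the example above; the evaluate helper is hypothetical and stands for whatever accuracy and latency measurement (simulated or actual FHE inference) your Concrete-ML version supports:

# Candidate values, from the default up to a much more relaxed setting
candidate_p_errors = [6.3342483999973e-05, 1e-3, 1e-2, 1e-1]

for p_error in candidate_p_errors:
    # Re-compiling regenerates the cryptographic parameters for this p_error
    clf.compile(X_train, p_error=p_error)
    # evaluate(clf, X_test, y_test)  # hypothetical: measure accuracy and latency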
