Can I use any activation function?

The currently tested activation functions for torch are:

  • nn.Sigmoid
  • nn.ReLU
  • nn.ReLU6
  • nn.Tanh
  • nn.Hardtanh
  • nn.CELU
  • nn.Softplus
  • nn.ELU
  • nn.LeakyReLU
  • nn.SELU

and should be usable out of the box.
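For example, a small network using nn.ReLU can typically be compiled directly with Post Training Quantization. The following is a minimal sketch, assuming a recent version of Concrete ML; the exact `compile_torch_model` arguments (such as `n_bits`) may differ depending on your version.

```python
import numpy
import torch
from torch import nn

from concrete.ml.torch.compile import compile_torch_model


class TinyMLP(nn.Module):
    """A small fully connected network using a supported activation (nn.ReLU)."""

    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 32)
        self.act = nn.ReLU()
        self.fc2 = nn.Linear(32, 2)

    def forward(self, x):
        return self.fc2(self.act(self.fc1(x)))


torch_model = TinyMLP()

# Representative input set used to calibrate Post Training Quantization
inputset = numpy.random.uniform(-1, 1, size=(100, 10)).astype(numpy.float32)

# Compile the torch model to an FHE circuit; n_bits controls quantization precision
quantized_module = compile_torch_model(torch_model, inputset, n_bits=3)
```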

We use ONNX as an intermediate representation for Machine Learning models. As long as your model converts to an ONNX model that only uses supported ONNX ops for which Concrete ML has a quantized version of the operator (listed here), it may be possible to convert it to an FHE circuit with Post Training Quantization.
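To see which ONNX ops your model actually produces, you can export it with `torch.onnx.export` and list the op types of the resulting graph, then check them against the supported ops. This sketch reuses the `torch_model` from the example above; the file name and opset version are illustrative.

```python
import torch
import onnx

# Export the torch model to ONNX (the model is assumed to be defined as in
# the previous sketch; the file name and opset version are arbitrary choices)
dummy_input = torch.randn(1, 10)
torch.onnx.export(torch_model, dummy_input, "model.onnx", opset_version=14)

# List the ONNX op types used by the exported graph so they can be compared
# against the ops that Concrete ML supports with a quantized version
onnx_model = onnx.load("model.onnx")
print(sorted({node.op_type for node in onnx_model.graph.node}))
```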

If your model uses ONNX ops that Concrete ML supports but that do not have a quantized version listed here, you may still be able to convert your torch model to a NumPy equivalent, but it will most likely not be easily convertible to an FHE equivalent.

If the model uses ops that Concrete ML does not support, then it will not be convertible to a NumPy equivalent and, therefore, not to an FHE equivalent either.