Softmax in Concrete Numpy

(originally asked by cgordon on FHE.org Discord)

Another question I had is whether there's any issue with implementing softmax, \mathsf{out}(z)_i = \frac{\exp(z_i)}{\sum_j \exp(z_j)}, under the existing framework. I know it's not in the 'supported activations' in the Torch converter currently, and I wondered whether the summation raises any issues.

Hello. Yes, Softmax is currently not among the supported activations. We encourage our users to compute the softmax in the clear, since it will be much more efficient and much more precise. If one wanted to compute a softmax over ciphertexts, it could be done as follows (a plaintext sketch is given after the list):

  • k PBS, to compute u_i = \mathsf{exp}(z_i)
  • one addition, to compute s = \sum_i u_i
  • 1 PBS, to compute \frac{1}{s}
  • 2k PBS, to compute \mathsf{out}_i = u_i \cdot \frac{1}{s} (knowing that a product a \cdot b can be computed with 2 PBS, via \frac{(a+b)^2}{4} - \frac{(a-b)^2}{4})
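To make the step count concrete, here is a minimal plaintext sketch of this decomposition in plain NumPy. It is not Concrete Numpy code, and `quarter_square_mul` / `softmax_pbs_style` are hypothetical names for illustration; the comments note which steps would cost PBS on ciphertexts.

```python
import numpy as np

def quarter_square_mul(a, b):
    # a * b via (a+b)^2/4 - (a-b)^2/4: each square would be one PBS,
    # so every ciphertext product costs 2 PBS.
    return (a + b) ** 2 / 4 - (a - b) ** 2 / 4

def softmax_pbs_style(z):
    u = np.exp(z)                       # k PBS: one table lookup per element
    s = np.sum(u)                       # additions are cheap (no PBS)
    inv_s = 1 / s                       # 1 PBS: table lookup for x -> 1/x
    out = quarter_square_mul(u, inv_s)  # 2k PBS: one product per element
    return out                          # total: 3k + 1 PBS

z = np.array([1.0, 2.0, 3.0])
print(softmax_pbs_style(z))             # matches the direct computation:
print(np.exp(z) / np.exp(z).sum())
```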

As you can see, this would be quite inefficient (3k + 1 PBS in total) and would not bring much benefit. Moreover, we may run into precision issues, since the range of the data can be very large and exp grows very fast.
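As a rough illustration of the range problem (this is just plaintext arithmetic, not a claim about Concrete Numpy's limits): exp(16) is already close to 9 million, which would need about 24 bits to represent exactly, far beyond the small bit widths that are practical for table lookups.

```python
import numpy as np

# Plaintext arithmetic showing how quickly exp outgrows small bit widths.
z_max = 16.0
u_max = np.exp(z_max)                # ~8.9e6
bits = int(np.ceil(np.log2(u_max)))  # ~24 bits needed for exp(16) alone
print(u_max, bits)
```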
