Questions regarding the whitepaper on PBS for DNN inference

Hi,

I read the whitepaper (Programmable Bootstrapping Enables Efficient Homomorphic Inference of Deep Neural Networks) on programmable bootstrapping and deep neural network inference. I am very impressed and excited by TFHE’s fast bootstrapping capability.

I’d like to know some details about the experiments you described in this paper. Particularly:

  1. How is the input MNIST image encoded? From my understanding, each pixel of a single image is encoded and encrypted into an LWE ciphertext. Am I right?

  2. What is the architecture of NN-20, NN-50, and NN-100, and how many PBS do they contain?

I’m asking because another paper (CONCRETE: Concrete Operates oN Ciphertexts Rapidly by Extending TfhE) reports that a PBS takes about 18 ms (N=1024, n=630, 64-bit) on CPU, while the NN-20, NN-50, and NN-100 inference times are < 60 s. Doesn’t that imply there are very few PBS in them?

  3. Are there any parallel computing techniques, compatible with the Concrete library used in the experiments, that would let it utilize multicore CPUs?
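To make my question concrete, here is my own back-of-envelope arithmetic (not a figure from either paper) for how many strictly sequential PBS would fit in the reported time budget:

```python
# Back-of-envelope only: how many strictly sequential PBS fit in 60 s,
# assuming ~18 ms per PBS as reported in the CONCRETE paper?
pbs_latency_ms = 18
time_budget_ms = 60 * 1000
print(time_budget_ms // pbs_latency_ms)  # -> 3333
```

So a single core could run at most a few thousand PBS in 60 s, which is why I suspect either few PBS or heavy parallelism.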

Thank you very much!

Tian

Hello @tiany

Thanks for your interest in Zama. Regarding your questions:

  1. Yes, you have one ciphertext per pixel

  2. The first layer is:

  • a convolution followed by a ReLU activation,
    then
  • dense layers with 92 output neurons, each followed by a ReLU activation.

In terms of PBS, the first layer has 840 PBS, and every other layer has one per neuron (so 92 PBS/layer)

  3. We used custom parallel operations for this experiment as a proof of concept. It will be our compiler’s job to do that automatically.
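As a rough tally (my own sketch, not an official count; it assumes "NN-d" means d layers in total, with the convolutional layer first and dense layers after), the per-network PBS counts would be:

```python
# Rough PBS tally, assuming NN-d has d layers in total:
# 840 PBS in the first (convolutional) layer,
# 92 PBS in each of the remaining dense layers (one per neuron).
def total_pbs(depth, conv_pbs=840, dense_pbs=92):
    return conv_pbs + (depth - 1) * dense_pbs

print(total_pbs(20))   # -> 2588
print(total_pbs(50))   # -> 5348
print(total_pbs(100))  # -> 9948
```

That puts even NN-100 at under ten thousand PBS, consistent with sub-60 s inference once the PBS are parallelized across cores.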

Hello, can I get more detailed information on the TFHE parameters and the NN architecture?

  1. For the TFHE parameters, what bootstrapping level and keyswitching level were used in this experiment?
  2. For the NN architecture, what convolution window size did you use? I still don’t understand how you get 840 PBS for the first layer. Since the original image is 28x28, a convolution producing an output of the same size would only need 784 PBS
  3. Also, did you always do the keyswitching operation before the PBS in these experiments?

Thank you

Hey @Dwen

So in order:

  1. For the keyswitching key:
  • level: 4
  • base_log: 4
    For the bootstrapping key:
  • level: 4
  • base_log: 13
  2. As for the number of PBS, I’ll paste here the parameters we had for the first convolution layer; the output shape is [1, 2, 21, 20], which multiplies out to 840 :slightly_smiling_face:
        "1": {
            "node_type": {
                "ConvMultisum": {
                    "weight": {
                        "file": "weight_1_casted.npz",
                        "shape": [
                            2,
                            1,
                            10,
                            11
                        ],
                        "inside_type": "uint64"
                    },
                    "bias": {
                        "file": "bias_1_casted.npz",
                        "shape": [
                            2
                        ],
                        "inside_type": "uint64"
                    },
                    "dilatations": [
                        1,
                        1
                    ],
                    "group": 1,
                    "kernel_shape": [
                        10,
                        11
                    ],
                    "pads": [
                        1,
                        1,
                        1,
                        1
                    ],
                    "strides": [
                        1,
                        1
                    ]
                }
            },
            "sources": [
                {
                    "name": "0"
                }
            ],
            "output_shape": [
                1,
                2,
                21,
                20
            ],
  3. I think back then we did the keyswitch after the PBS, but IIRC this has since changed and we now do the keyswitch before the PBS
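The 840 also falls out of the standard convolution output-size formula applied to the parameters pasted above (a sketch, assuming a 28x28 MNIST input):

```python
# Standard conv output size per spatial dimension:
# out = (in + pad_begin + pad_end - dilation*(kernel-1) - 1) // stride + 1
def conv_out(size, kernel, pads=(1, 1), stride=1, dilation=1):
    return (size + pads[0] + pads[1] - dilation * (kernel - 1) - 1) // stride + 1

h = conv_out(28, 10)  # kernel_shape[0] = 10 -> 21
w = conv_out(28, 11)  # kernel_shape[1] = 11 -> 20
print(2 * h * w)      # 2 output channels -> 840, one PBS per output value
```

The asymmetric 10x11 kernel with padding 1 is what shrinks the 28x28 input to 21x20 instead of keeping it at 28x28, hence 840 rather than 784 PBS.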