ERROR in running hybrid model example

Hello @churdo , there was a check on the ciphertext size in Concrete; to be honest, I don't remember why it was there in the first place, but that check raised this error.

We changed the check so that it shouldn’t crash with “heavy” ciphertexts anymore.

By the way @Varun_Joshi, is there anything that you can give us to help you debug your issue?

Yes @luis , the versioning error was solved, but I'm still facing issues running other LLMs.
How can I find the module names for a given LLM that I need to pass when compiling a model?
For the example given in the docs, I get an error that no module was found:
python compile_hybrid_llm.py --model-name microsoft/phi-1_5 --module-names layers.1.mixer.Wqkv
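For anyone hitting the same problem: one way to discover valid module names is to iterate over the model with PyTorch's standard `nn.Module.named_modules()` API, which yields exactly the dotted, fully qualified names that `--module-names` expects. The sketch below uses a toy model whose structure mimics `layers.N.mixer.Wqkv` (the real model would instead be loaded with `transformers.AutoModelForCausalLM.from_pretrained("microsoft/phi-1_5")`; the `ToyLM`/`Mixer` classes here are hypothetical stand-ins for illustration):

```python
import torch.nn as nn

# Hypothetical stand-ins mimicking the phi-1_5 layout; a real model
# would be loaded via transformers.AutoModelForCausalLM.from_pretrained(...).
class Mixer(nn.Module):
    def __init__(self):
        super().__init__()
        self.Wqkv = nn.Linear(16, 48)  # fused query/key/value projection

class Block(nn.Module):
    def __init__(self):
        super().__init__()
        self.mixer = Mixer()

class ToyLM(nn.Module):
    def __init__(self):
        super().__init__()
        self.layers = nn.ModuleList([Block(), Block()])

model = ToyLM()

# named_modules() yields (qualified_name, module) pairs; the qualified
# names are the dotted paths to pass as --module-names.
for name, module in model.named_modules():
    if isinstance(module, nn.Linear):
        print(name)
```

Printing only the `nn.Linear` leaves gives paths like `layers.1.mixer.Wqkv`; run the same loop on the actual pretrained model to see which names exist for your architecture.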

Also, the hybrid LLM currently uses the CPU for compute; how can I run it on the GPU so that inference is faster?

@luis This issue is solved now; it looks like the wrong module names were given in the docs, which has now been corrected by jfrery.

Hello @Varun_Joshi , let’s continue the discussion about hybrid and GPU on Discord, since I believe you started this thread :wink:

Yes, please post in a single thread and don't duplicate questions. It will help support be more efficient. Thanks