Hello @churdo , there was a check on the ciphertext size in Concrete; to be honest I don't remember why it was there in the first place, but that check is what raised this error.
We changed the check so that it shouldn’t crash with “heavy” ciphertexts anymore.
Yes @luis , the versioning error was solved, but I am still facing issues running other LLMs.
How can I find the module names for a given LLM that I need to pass when compiling a model?
For example, with the command given in the docs I get an error saying the module is not found:
python compile_hybrid_llm.py --model-name microsoft/phi-1_5 --module-names layers.1.mixer.Wqkv
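Is something like the snippet below the right way to list candidate module names? It only uses plain transformers/PyTorch (AutoModelForCausalLM and named_modules, nothing Concrete ML specific), and I am assuming the dotted names it prints are what --module-names expects.

```python
# Just my guess at how to list candidate module names for a model,
# using plain transformers/PyTorch (nothing Concrete ML specific).
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "microsoft/phi-1_5",
    trust_remote_code=True,  # phi-1_5 may need this depending on the transformers version
)

# Print every submodule's dotted name; I am assuming these are the values
# that --module-names expects (e.g. "layers.1.mixer.Wqkv").
for name, module in model.named_modules():
    print(name, type(module).__name__)
```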
Also, the hybrid LLM is currently using the CPU for compute; how can I run it on a GPU so that inference is faster?