Concrete-python installation on Raspberry Pi

We’d love to hear your results and provide support :slightly_smiling_face:

Also could you share the new set of failures? And maybe the full pytest output, there might be some stack traces, error messages, warnings, etc.

Thanks!

With 1 thread and 8 GB of swap I have now got it down to only 2 failures.

It may be an issue with the version of graphviz that is installed, as I get the same ValueError as before. This is what I have installed:

(.venv) pi@pi-fhe:~/newbuild3/concrete/frontends/concrete-python $ pip install graphviz
Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple
Requirement already satisfied: graphviz in ./.venv/lib/python3.11/site-packages (0.20.2)

The other crash seems a bit more fundamental and may be an issue with the actual build.

tests/compilation/test_circuit.py::test_dataflow_circuit WARNING: The generated random seed is not crypto secure
/home/pi/newbuild3/concrete/frontends/concrete-python/.venv/bin/python: symbol lookup error: /tmp/tmpyzoeh276/sharedlib.so: undefined symbol: _dfr_register_work_function

[gw0] node down: Not properly terminated

Any advice on the “test_dataflow_circuit” one and the symbol lookup error?

Full pytest output at:
https://www.dropbox.com/scl/fi/13970qviu41w2xkbhk1c0/8Gb_swap_one_thread_nohup.out?rlkey=mbue4qftyjxn7dtxm3oc31n94&dl=0

Thanks

Graphviz should be installed through apt or the native package manager of the system (e.g. the Ubuntu graphviz package). The Python package is called pygraphviz and it is installed in the dev profile.
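As a sketch of that installation on Debian-based systems (the libgraphviz-dev package name is an assumption for Debian/Raspberry Pi OS; pygraphviz needs the Graphviz headers to build):

```shell
# System-level Graphviz binaries and headers (Debian / Raspberry Pi OS)
sudo apt install graphviz libgraphviz-dev

# Python bindings used by the dev profile
pip install pygraphviz
```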

As for dataflow, it’s only available in x86 Linux, not even on x86 macOS. We literally have PYTEST_MARKERS="not dataflow and not graphviz" in our macOS CI :smile:
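Those same markers can be used locally to deselect the tests that cannot run on this platform (marker names taken from the PYTEST_MARKERS value quoted above; the tests path matches the layout shown earlier in this thread):

```shell
pytest tests -m "not dataflow and not graphviz"
```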

So the most important functionality works on Raspberry Pi, thank you for the effort and congratulations :clap:


I have fixed the graphviz issue (did a pip uninstall, then installed it again). That test now passes OK:

INSECURE_KEY_CACHE_LOCATION=/home/pi/.cache/concrete-python/pytest
============================= test session starts ==============================
platform linux -- Python 3.11.2, pytest-7.2.2, pluggy-1.4.0 -- /home/pi/newbuild3/concrete/frontends/concrete-python/.venv/bin/python
cachedir: .pytest_cache
Using --randomly-seed=2389925685
rootdir: /home/pi/newbuild3/concrete/frontends/concrete-python, configfile: pytest.ini
plugins: cov-4.0.0, xdist-3.2.1, randomly-3.15.0
collecting ... collected 1 item

tests/compilation/test_circuit.py::test_circuit_draw PASSED

Still an issue with the other failure.


Hello @DaveCT, the dataflow error in pytest is because the dataflow runtime was not compiled: DATAFLOW_EXECUTION_ENABLED should be set to ON when compiling the Python bindings of the compiler. But you can skip it as @umutsahin says.
Dataflow execution enables dataflow task parallelism, which can improve the parallelization of your program.
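As a toy illustration of what dataflow task parallelism buys (plain Python threads here, NOT Concrete's actual HPX-based runtime): nodes of a computation graph that don't depend on each other can run concurrently, while dependent nodes wait for their inputs.

```python
# Toy dataflow sketch: f and g are independent tasks, h depends on both.
# This is an illustration only, not how Concrete schedules FHE operations.
from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

def add(a, b):
    return a + b

with ThreadPoolExecutor() as pool:
    f = pool.submit(square, 3)   # independent -> may run in parallel
    g = pool.submit(square, 4)   # independent -> may run in parallel
    h = pool.submit(add, f.result(), g.result())  # waits on f and g
    print(h.result())  # 25
```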


Thanks @yundsi @umutsahin - So I should recompile the python bindings?

Sure, after installing HPX. You can do it in the compiler directory using make install-hpx-from-source and then compiling with DATAFLOW_EXECUTION_ENABLED=ON.


Looks like it will need to be recompiled (will try that next): if I create the wheel, install it in a new venv (dave2), and run the simple example from the README, I get the same error:

(dave2) pi@pi-fhe:~/dave2 $ cat test.py 
from concrete import fhe

def add(x, y):
    return x + y

compiler = fhe.Compiler(add, {"x": "encrypted", "y": "encrypted"})
inputset = [(2, 3), (0, 0), (1, 6), (7, 7), (7, 1), (3, 2), (6, 1), (1, 7), (4, 5), (5, 4)]

print("Compiling...")
circuit = compiler.compile(inputset)

print("Generating keys...")
circuit.keygen()

examples = [(3, 4), (1, 2), (7, 7), (0, 0)]
for example in examples:
    encrypted_example = circuit.encrypt(*example)
    encrypted_result = circuit.run(encrypted_example)
    result = circuit.decrypt(encrypted_result)
    print(f"Evaluation of {' + '.join(map(str, example))} homomorphically = {result}")

Output:

(dave2) pi@pi-fhe:~/dave2 $ python test.py 
Compiling...
Generating keys...
WARNING: The generated random seed is not crypto secure
WARNING: The generated random seed is not crypto secure
WARNING: The generated random seed is not crypto secure
python: symbol lookup error: /tmp/tmprz64r205/sharedlib.so: undefined symbol: _dfr_start

Did you do this?

DATAFLOW_EXECUTION_ENABLED=ON requires HPX even if dataflow_parallelize is set to False from Python.
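Putting that together, the rebuild would look roughly like this (the install-hpx-from-source and python-bindings make targets are the ones mentioned in this thread; treat the sequence as a sketch rather than an authoritative recipe):

```shell
cd concrete/compilers/concrete-compiler/compiler

# HPX must be available before enabling dataflow execution
make install-hpx-from-source

# Rebuild the Python bindings with the dataflow runtime compiled in
DATAFLOW_EXECUTION_ENABLED=ON make python-bindings
```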

Just starting with a fresh build so I can document the steps I took. The compiling also takes a few hours each time, so I will make sure I run make install-hpx-from-source this time and test it all again.


I get an error when trying to compile HPX. It looks like I may need to compile it from source directly and then set the directory.

[  3%] Building CXX object libs/core/ini/CMakeFiles/hpx_ini.dir/src/ini.cpp.o
In file included from /home/pi/newbuild5/concrete/compilers/concrete-compiler/compiler/hpx-1.9.1/libs/core/coroutines/include/hpx/coroutines/detail/context_impl.hpp:126,
                 from /home/pi/newbuild5/concrete/compilers/concrete-compiler/compiler/hpx-1.9.1/libs/core/coroutines/include/hpx/coroutines/detail/context_base.hpp:36,
                 from /home/pi/newbuild5/concrete/compilers/concrete-compiler/compiler/hpx-1.9.1/libs/core/coroutines/src/detail/context_base.cpp:12:
/home/pi/newbuild5/concrete/compilers/concrete-compiler/compiler/hpx-1.9.1/libs/core/coroutines/include/hpx/coroutines/detail/context_linux_x86.hpp: In member function ‘void hpx::threads::coroutines::detail::lx::x86_linux_context_impl_base::prefetch() const’:
/home/pi/newbuild5/concrete/compilers/concrete-compiler/compiler/hpx-1.9.1/libs/core/coroutines/include/hpx/coroutines/detail/context_linux_x86.hpp:206:41: error: static assertion failed
  206 |             static_assert(sizeof(void*) == 4);
      |                           ~~~~~~~~~~~~~~^~~~
/home/pi/newbuild5/concrete/compilers/concrete-compiler/compiler/hpx-1.9.1/libs/core/coroutines/include/hpx/coroutines/detail/context_linux_x86.hpp:206:41: note: the comparison reduces to ‘(8 == 4)’
make[3]: *** [libs/core/coroutines/CMakeFiles/hpx_coroutines.dir/build.make:76: libs/core/coroutines/CMakeFiles/hpx_coroutines.dir/src/detail/context_base.cpp.o] Error 1
make[3]: Leaving directory '/home/pi/newbuild5/concrete/compilers/concrete-compiler/compiler/hpx-1.9.1/build'
make[2]: *** [CMakeFiles/Makefile2:75593: libs/core/coroutines/CMakeFiles/hpx_coroutines.dir/all] Error 2
make[2]: *** Waiting for unfinished jobs....
make[3]: Leaving directory '/home/pi/newbuild5/concrete/compilers/concrete-compiler/compiler/hpx-1.9.1/build'
[  3%] Built target hpx_ini
make[2]: Leaving directory '/home/pi/newbuild5/concrete/compilers/concrete-compiler/compiler/hpx-1.9.1/build'
make[1]: *** [Makefile:146: all] Error 2
make[1]: Leaving directory '/home/pi/newbuild5/concrete/compilers/concrete-compiler/compiler/hpx-1.9.1/build'
make: *** [Makefile:128: install-hpx-from-source] Error 2

It looks to be using the x86 context implementation (context_linux_x86.hpp).

Are these related?

I managed to get HPX compiled and installed under /usr/local on the system once I got the dependencies installed locally. I then used these commands (I already had Boost installed):

sudo apt install libgoogle-perftools-dev
git clone https://github.com/STEllAR-GROUP/hpx.git
cd hpx 
mkdir build && cd build
cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=/usr/local -DHPX_WITH_FETCH_ASIO=ON ..
make -j$(nproc)
sudo make install
sudo ldconfig

This all seemed to work fine.

What do you think I may need to update so that it knows to use this? I tried setting
export HPX_INSTALL_DIR=/usr/local/
But I still get the same issue when I make the python-bindings.

It looks like the runtime library (libConcretelangRuntime.so) is not found.
Most likely the runtime lib wasn't installed correctly in the venv. Perhaps you did not run the installation of requirements, which runs auditwheel?

One thing you could try before attempting a clean rebuild + install would be to use LD_PRELOAD=/path/to/runtime/lib/libConcretelangRuntime.so.
If this doesn't work, try something like find [venv] -name '*.so' | grep -i concrete to check if the library is present.
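To make the check reproducible, here is the same find pattern run against a simulated venv layout (the directory and library path below are created purely for illustration; point the real check at your actual venv):

```shell
# Simulate a venv that contains the runtime library
VENV=$(mktemp -d)
mkdir -p "$VENV/lib/python3.11/site-packages/concrete/lib"
touch "$VENV/lib/python3.11/site-packages/concrete/lib/libConcretelangRuntime.so"

# The actual check: list shared objects and filter for concrete
find "$VENV" -name '*.so' | grep -i concrete
```

If nothing is printed, the runtime library is missing from the environment.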


I think you are right, as nothing comes back with the find command. I will do another clean build/install and check the auditwheel part.

If the LD_PRELOAD works you may want to stick with that; otherwise you can try to copy the files explicitly into the venv. For example, something like this could help:

cp -R concrete/frontends/concrete-python/concrete/* .venv/lib/python3.*/site-packages/concrete/

I must be doing something fundamentally wrong.
I did a clean build and created the wheel.
I then get this error:

Compiling...
Generating keys...
WARNING: The generated random seed is not crypto secure
WARNING: The generated random seed is not crypto secure
WARNING: The generated random seed is not crypto secure
python: symbol lookup error: /tmp/tmprz64r205/sharedlib.so: undefined symbol: _dfr_start

If I then do the following, it seems to break something:

(dave5) pi@pi-fhe:~ $ LD_PRELOAD=/home/pi/newbuild6/concrete/compilers/concrete-compiler/compiler/build/lib/libConcretelangRuntime.so
(dave5) pi@pi-fhe:~ $ python ./dave2/test.py 
Compiling...
Traceback (most recent call last):
  File "/home/pi/./dave2/test.py", line 10, in <module>
    circuit = compiler.compile(inputset)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pi/dave5/lib/python3.11/site-packages/concrete/fhe/compilation/compiler.py", line 521, in compile
    mlir_module = GraphConverter(self.configuration).convert(self.graph, mlir_context)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pi/dave5/lib/python3.11/site-packages/concrete/fhe/mlir/converter.py", line 129, in convert
    return self.convert_many({name: graph}, mlir_context)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pi/dave5/lib/python3.11/site-packages/concrete/fhe/mlir/converter.py", line 60, in convert_many
    self.process(graphs)
  File "/home/pi/dave5/lib/python3.11/site-packages/concrete/fhe/mlir/converter.py", line 240, in process
    processor.apply_many(graphs)
  File "/home/pi/dave5/lib/python3.11/site-packages/concrete/fhe/mlir/processors/assign_bit_widths.py", line 115, in apply_many
    new_bit_width = model[bit_width].as_long()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pi/dave5/lib/python3.11/site-packages/z3/z3.py", line 3023, in as_long
    return int(self.as_string())
           ^^^^^^^^^^^^^^^^^^^^^

I think I need to go back to the start:

Would you be able to let me know the steps I should go through for a new build from source and then creating a wheel for it?

I think you need to export LD_PRELOAD or have it on the same command line; otherwise it won't be set as an environment variable when executing.

So either do:

(dave5) pi@pi-fhe:~ $ LD_PRELOAD=/home/pi/newbuild6/concrete/compilers/concrete-compiler/compiler/build/lib/libConcretelangRuntime.so python ./dave2/test.py 

Or otherwise:

(dave5) pi@pi-fhe:~ $ export LD_PRELOAD=/home/pi/newbuild6/concrete/compilers/concrete-compiler/compiler/build/lib/libConcretelangRuntime.so
(dave5) pi@pi-fhe:~ $ python ./dave2/test.py
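The difference is easy to demonstrate with any environment variable (FOO below is just a stand-in):

```shell
FOO=bar                          # shell variable only, not exported
python3 -c 'import os; print(os.environ.get("FOO"))'          # prints None

FOO=bar python3 -c 'import os; print(os.environ.get("FOO"))'  # prints bar

export FOO=bar                   # exported: visible to child processes
python3 -c 'import os; print(os.environ.get("FOO"))'          # prints bar
```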

I get a different issue with this:

(dave7) pi@pi-fhe:~ $ LD_PRELOAD=/home/pi/newbuild6/concrete/compilers/concrete-compiler/compiler/build/lib/libConcretelangRuntime.so python /home/pi/dave2/test.py 
: CommandLine Error: Option 'use-dbg-addr' registered more than once!
LLVM ERROR: inconsistency in registered CommandLine options
Aborted