Compile, install and run PrimiHub from source code on CentOS 8

References

  • PrimiHub local compilation and startup
  • How to install Bazel on CentOS 8 Linux or Redhat 8/7

Compile and start steps

For historical reasons the server runs CentOS 8, which makes compiling the source code unusually troublesome, so the process is recorded here.

Building from source step by step is also a good way to gain a deeper understanding of the whole process.

1. Compile environment installation

Note that CentOS 8 differs from CentOS 7 in many ways, and some libraries are not available on CentOS 8, so the installation process differs from the official documentation.

Basic environment

Python 3.8 is required. Assuming python3.8 has already been installed successfully, run the following installation steps.

sudo dnf -y install epel-release
sudo dnf -y group install "Development Tools"
sudo dnf -y install python38-devel gmp-devel libtool ninja-build git npm gcc make
sudo dnf -y --enablerepo=PowerTools install ninja-build

# Check which version libstdc++.so.6 links to; if it is the default 6.0.19, it needs to be upgraded
ls -l /usr/lib64/libstdc++.so.6
wget https://primihub.oss-cn-beijing.aliyuncs.com/tools/libstdc.so_.6.0.26.zip
unzip libstdc.so_.6.0.26.zip
sudo mv libstdc++.so.6.0.26 /usr/lib64
sudo rm -f /usr/lib64/libstdc++.so.6
sudo ln -s /usr/lib64/libstdc++.so.6.0.26 /usr/lib64/libstdc++.so.6
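
To confirm the upgrade took effect, you can check which GLIBCXX symbol versions the new library exports (an optional sanity check):

# libstdc++.so.6 should now point at 6.0.26 and list GLIBCXX versions newer than 3.4.19
ls -l /usr/lib64/libstdc++.so.6
strings /usr/lib64/libstdc++.so.6 | grep GLIBCXX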

# Bazel environment
sudo dnf config-manager --add-repo https://copr.fedorainfracloud.org/coprs/vbatts/bazel/repo/epel-8/vbatts-bazel-epel-8.repo

sudo dnf -y install bazel5

cd "/usr/bin" && sudo curl -fLO https://releases.bazel.build/5.0.0/release/bazel-5.0.0-linux-x86_64 && sudo chmod +x bazel-5.0.0-linux-x86_64

bazel --version

Redis environment installation

sudo dnf install redis -y

Then set the requirepass field in the redis configuration file (/etc/redis.conf) to define a redis password; it must match the redis_password field in the ./config/node*.yaml files.

sudo sed -i 's/# requirepass foobared/requirepass primihub/' /etc/redis.conf
# replace the default port
sudo sed -i 's/port 6379/port 8391/' /etc/redis.conf

Start redis

sudo systemctl start redis
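
To verify that redis came up with the port and password configured above, a quick check with redis-cli (the values match the sed commands above):

# Should print PONG if redis is listening on port 8391 with password primihub
redis-cli -p 8391 -a primihub ping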

2. Compile

cd primihub
./pre_build.sh

make linux_x86_64

Summary of problems during compilation

Problem 1: WORKSPACE_CN cannot be found

The solution is to copy the content of WORKSPACE_CN into WORKSPACE directly.
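
One way to do this from the repository root (a simple sketch; keep a backup of WORKSPACE if you want to restore it later):

# Back up the original WORKSPACE, then overwrite it with the CN variant
cp WORKSPACE WORKSPACE.bak
cp WORKSPACE_CN WORKSPACE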

Problem 2: GitHub cannot be accessed

Solution:

  1. Use a proxy.
  2. Or replace github.com in WORKSPACE with a mirror address, such as gitclone.com/github.com (see the sed sketch below). This method was used here after testing, since the mirror is kept continuously updated.
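
A minimal sketch of method 2 with sed, assuming the gitclone.com prefix mentioned above is still valid:

# Point every github.com dependency in WORKSPACE at the mirror
sed -i 's|https://github.com/|https://gitclone.com/github.com/|g' WORKSPACE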

After applying method 2, remember to clean up bazel's temporary files:

bazel clean --expunge

Problem 3: Package download timeout

  1. Recompile: try make linux_x86_64 a few more times.
  2. If a package really cannot be downloaded, download it on a PC that can reach GitHub (e.g. through a proxy) and upload it to the path shown in the error message, for example /home/baas/.cache/bazel/_bazel_baas/c4c8cad6a1643b7f6bba3835e75e462e/external/rules_java/temp2465207077681073091/:
ERROR: /home/baas/codes/mpc/primihub/BUILD.bazel:69:10: While resolving toolchains for target //:py_main: invalid registered toolchain '@local_jdk//:runtime_toolchain_definition': no such package '@rules_java//java': java.io.IOException: Error downloading [https://github.com/bazelbuild/rules_java/archive/981f06c3d2bd10225e85209904090eb7b5fb26bd.tar.gz] to /home/baas/.cache/bazel/_bazel_baas/c4c8cad6a1643b7f6bba3835e75e462e/external/rules_java/temp2465207077681073091/981f06c3d2bd10225e85209904090eb7b5fb26bd.tar.gz: connect timed out
  3. If many packages cannot be downloaded (a network problem), copy the temporary directory /home/baas/.cache/bazel/_bazel_baas/c4c8cad6a1643b7f6bba3835e75e462e/external from another machine to this one (see the rsync sketch below).
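
If the cache is copied from another machine, something like the following works (a sketch only: other-host is a placeholder, and the _bazel_baas hash directory must match the path in your own error message):

# Copy the populated bazel external cache from a machine that can reach GitHub
rsync -avz user@other-host:/home/baas/.cache/bazel/_bazel_baas/c4c8cad6a1643b7f6bba3835e75e462e/external/ \
      /home/baas/.cache/bazel/_bazel_baas/c4c8cad6a1643b7f6bba3835e75e462e/external/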

Problem 4: cannot find -lpython3.7

The specific error:

ERROR: /home/baas/codes/mpc/primihub/src/primihub/pybind_warpper/BUILD:23:17: Linking src/primihub/pybind_warpper/opt_paillier_c2py.so failed: (Exit 1): gcc failed: error executing command /usr/bin/gcc @bazel-out/k8-fastbuild/bin/src/primihub/pybind_warpper/opt_paillier_c2py.so-2.params

Cause of the problem:

  1. A previous build was done with python3.7, so PLACEHOLDER-PYTHON3.X-CONFIG in the BUILD.bazel file had already been replaced with the python3.7 parameters.

  2. The replacement is performed by line 47 of pre_build.sh: sed -e "s|PLACEHOLDER-PYTHON3.X-CONFIG|${NEWLINE}|g" BUILD.bazel > BUILD.bazel.tmp && mv BUILD.bazel.tmp BUILD.bazel

Workaround:

  1. Restore the original BUILD.bazel.
  2. Or replace LINK_PYTHON_OPTS = xxxxxx in the BUILD.bazel file with the original LINK_PYTHON_OPTS = PLACEHOLDER-PYTHON3.X-CONFIG (see the sed sketch below).
  3. After completing step 1 or step 2, re-run ./pre_build.sh.
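
Workaround 2 can be scripted roughly as follows (a sketch only; it assumes LINK_PYTHON_OPTS sits on a single, top-level line in BUILD.bazel):

# Restore the placeholder so pre_build.sh can fill in the python3.8 settings again
sed -i 's|^LINK_PYTHON_OPTS = .*|LINK_PYTHON_OPTS = PLACEHOLDER-PYTHON3.X-CONFIG|' BUILD.bazel
./pre_build.sh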

Problem 5: requires dynamic R_X86_64_32 reloc which may overflow at runtime; recompile with -fPIC

The specific errors:

/usr/bin/ld.gold: error: /usr/local/lib/python3.8/config-3.8-x86_64-linux-gnu/libpython3.8.a(abstract.o): requires dynamic R_X86_64_32 reloc which may overflow at runtime; recompile with -fPIC
/usr/bin/ld.gold: error: /usr/local/lib/python3.8/config-3.8-x86_64-linux-gnu/libpython3.8.a(boolobject.o): requires unsupported dynamic reloc 11; recompile with -fPIC
/usr/bin/ld.gold: error: /usr/local/lib/python3.8/config-3.8-x86_64-linux-gnu/libpython3.8.a(bytearrayobject.o): requires dynamic R_X86_64_32 reloc against '_Py_NoneStruct' which may overflow at runtime; recompile with -fPIC
…

Solution: recompile Python 3.8 with -fPIC and reinstall it

cd /opt/Python-3.8.16   # my Python source path
sudo ./configure --enable-optimizations CFLAGS="-fPIC"
sudo make clean
sudo make altinstall
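
Before rebuilding primihub, it can be worth doing a rough check that the -fPIC flag was picked up by configure (assuming the source tree is at /opt/Python-3.8.16 as above):

# -fPIC should show up in the CFLAGS recorded by configure in the generated Makefile
grep '^CONFIGURE_CFLAGS' /opt/Python-3.8.16/Makefile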

After Python has been rebuilt and installed, recompile primihub: make linux_x86_64

3. Configure datasets

Looking at start_server.sh, the configuration files used to start each node are in the config directory:

config
|-- node0.yaml
|-- node1.yaml
|-- node2.yaml

The datasets field in each configuration file declares the datasets available on that node; its fields are defined as follows:

  • description: A unique human-readable name for the dataset
  • model: data source type, csv in the example
  • source: The absolute path of the data source on the Node running machine

node0.yaml is configured as follows:

# load datasets
datasets:
  # ABY3 LR test case datasets
  - description: "train_party_0"
    model: "csv"
    source: "data/train_party_0.csv"
  - description: "test_party_0"
    model: "csv"
    source: "data/test_party_0.csv"
  - description: "breast_0"
    model: "csv"
    source: "data/FL/wisconsin.data"

  # MNIST test case datasets
  - description: "test_party_0_self"
    model: "csv"
    source: "data/falcon/dataset/MNIST/input_0"
  - description: "test_party_0_next"
    model: "csv"
    source: "data/falcon/dataset/MNIST/input_1"

  # FL homo lr test case datasets
  - description: "homo_lr_data"
    model: "csv"
    source: "data/FL/homo_lr/breast_cancer.csv"
  - description: "train_homo_lr"
    model: "csv"
    source: "data/FL/homo_lr/train_breast_cancer.csv"

  # PSI test case datasets for sqlite database
  - description: "psi_client_data_db"
    model: "sqlite"
    table_name: "psi_client_data"
    source: "data/client_e.db3"

    # Dataset authorization
    # authorization:
    # - node:
    # task:
  # PSI test case datasets
  - description: "psi_client_data"
    model: "csv"
    source: "data/client_e.csv"

  # use mysql table as dataset
  #- description: "psi_client"
  # model: "mysql"
  # host: "172.21.1.62"
  # port: 3306
  # username: "root"
  # password: "primihub"
  # database: "default"
  # dbName: "psi"
  # tableName: "psi_client"
  # query_index: "ID" ## [[optional]]
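
Before starting the nodes, a quick way to confirm that the example files referenced above are present (assuming the nodes are started from the repository root, where the bundled data/ directory lives):

# List a few of the datasets referenced in node0.yaml
ls -l data/train_party_0.csv data/test_party_0.csv data/FL/wisconsin.data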

4. Run the service nodes

To modify other configuration options, see the config/node*.yaml files.

Execute the following commands in the code root directory:

sed -i '/PYTHONPATH/d' start_server.sh
bash start_server.sh

Three service nodes will be started, and their logs are written to log_node0, log_node1 and log_node2 respectively.
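
To confirm the three nodes actually came up, tailing the logs is usually enough (a quick sanity check; the exact process name may vary between versions):

# Each log should show its node starting up without fatal errors
tail -n 20 log_node0 log_node1 log_node2
# Optionally list the node processes (grep pattern is approximate)
ps -ef | grep -i node | grep -v grep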

5. Create task

For running tasks, refer to the official documentation.

Template tasks include:

  • Federated Learning (FL) tasks
  • Private Set Intersection (PSI) tasks
  • Private Information Retrieval (PIR) tasks
  • Trusted Execution Environment (TEE) tasks