The official pre-built Hadoop packages ship with incomplete support for many Native Libraries extensions, so a rebuild from source is needed. This article demonstrates the build on ARMv8; the steps for AMD64 are identical.
Procedure
The base environment is a minimal installation of CentOS 8.5 (aarch64) on a Huawei Kunpeng 920 CPU (a domestic ARMv8 implementation). CentOS 8.5 is the only recommended release, since all dependencies and compilers match exactly; on other distributions you may have to build every dependency yourself. Because CentOS 8 has reached end of life, the package repositories must be configured manually — see the Tsinghua mirror documentation for details. A full system upgrade and a reboot are recommended before building.
## Switch to the Tsinghua mirror
minorver=8.5.2111
sudo sed -e "s|^mirrorlist=|#mirrorlist=|g" \
-e "s|^#baseurl=http://mirror.centos.org/\$contentdir/\$releasever|baseurl=https://mirrors.tuna.tsinghua.edu.cn/centos-vault/$minorver|g" \
-i.bak \
/etc/yum.repos.d/CentOS-*.repo
On CentOS 8.x some of the required packages live in the PowerTools repository (if the command below is not found, install dnf-plugins-core and try again):
sudo dnf config-manager --set-enabled powertools
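Following the recommendation above, a full upgrade and reboot before building might look like this (a minimal sketch; when exactly you reboot is up to you):
## Refresh metadata against the new mirrors, apply all updates, then reboot
sudo dnf clean all
sudo dnf makecache
sudo dnf update -y
sudo reboot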
After the upgrade, check the system version (if the command is not found, install redhat-lsb-core manually):
$ lsb_release -a
LSB Version: :core-4.1-aarch64:core-4.1-noarch
Distributor ID: CentOS
Description: CentOS Linux release 8.5.2111
Release: 8.5.2111
Codename: n/a
Environment
Oracle JDK
Since Hadoop and its related components are written in Java, install the Java environment first. OpenJDK should work in theory, but to be safe, download the JDK installer from the Oracle JDK website.
## Assuming the downloaded version is 8u333 (only the file suffix differs for the x86 build)
sudo dnf localinstall jdk-8u333-linux-aarch64.rpm
Check the version after installation:
$ java -version
java version "1.8.0_333"
Java(TM) SE Runtime Environment (build 1.8.0_333-b02)
Java HotSpot(TM) 64-Bit Server VM (build 25.333-b02, mixed mode)
Maven
Next, deploy Maven to provide the Java build environment.
wget https://dlcdn.apache.org/maven/maven-3/3.8.6/binaries/apache-maven-3.8.6-bin.tar.gz
sudo tar xf apache-maven-3.8.6-bin.tar.gz -C /opt/
Add global environment variables:
sudo vim /etc/profile.d/maven.sh
## Add the following content
export JAVA_HOME="/usr/java/jdk1.8.0_333-aarch64"
export PATH=$JAVA_HOME/bin:$PATH
export M2_HOME="/opt/apache-maven-3.8.6"
export MAVEN_HOME="/opt/apache-maven-3.8.6"
export PATH=$M2_HOME/bin:$PATH
## After saving, reload the environment variables or log in to the terminal again
source /etc/profile
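An optional sanity check that both Maven and the JDK are picked up from the new variables (exact output will vary with your installation):
## Should report Apache Maven 3.8.6 and the Oracle JDK path configured above
mvn -version
echo $JAVA_HOME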
System Depends
Next, prepare the build environment for the Native Libraries. The official documentation demonstrates the build on Ubuntu, so the dependencies have to be mapped to their RedHat-family package names.
sudo dnf groupinstall 'Development Tools' -y
Then install the native dependencies:
sudo dnf install -y bzip2-devel \
cyrus-sasl-devel \
fuse-devel \
libzstd-devel \
openssl-devel \
protobuf-devel \
snappy-devel \
zlib-devel
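Before moving on, it may be worth confirming that every devel package actually installed; missing packages are exactly what causes the build errors covered in the FAQ section below:
## Any package reported as "not installed" must be installed before building
rpm -q bzip2-devel cyrus-sasl-devel fuse-devel libzstd-devel openssl-devel protobuf-devel snappy-devel zlib-devel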
Build Tools
Next, install the build tools:
sudo dnf install -y cmake expect
After installation, check the versions of the system components:
$ cmake --version
cmake version 3.20.2
CMake suite maintained and supported by Kitware (kitware.com/cmake).
$ protoc --version
libprotoc 3.5.0
Build
Download the Hadoop 3.3.4 source package, extract it, and start the build:
wget https://dlcdn.apache.org/hadoop/common/hadoop-3.3.4/hadoop-3.3.4-src.tar.gz
tar xf hadoop-3.3.4-src.tar.gz
cd hadoop-3.3.4-src/
mvn clean package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true -X
On a 4-core / 8 GB machine the x86 build takes roughly ten minutes; the ARM build takes noticeably longer (the run below finished in a little over an hour). The following output indicates a successful build.
[INFO] No site descriptor found: nothing to attach.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache Hadoop Main 3.3.4:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [01:44 min]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [ 50.957 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 33.850 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 15.610 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.238 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 40.087 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [01:32 min]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 37.855 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [03:23 min]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 12.186 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [04:04 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 10.798 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 10.731 s]
[INFO] Apache Hadoop Registry ............................. SUCCESS [ 11.185 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.113 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [02:31 min]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [02:32 min]
[INFO] Apache Hadoop HDFS Native Client ................... SUCCESS [04:10 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 15.166 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 5.993 s]
[INFO] Apache Hadoop HDFS-RBF ............................. SUCCESS [ 47.753 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.134 s]
[INFO] Apache Hadoop YARN ................................. SUCCESS [ 0.099 s]
[INFO] Apache Hadoop YARN API ............................. SUCCESS [ 43.824 s]
[INFO] Apache Hadoop YARN Common .......................... SUCCESS [01:21 min]
[INFO] Apache Hadoop YARN Server .......................... SUCCESS [ 0.120 s]
[INFO] Apache Hadoop YARN Server Common ................... SUCCESS [ 32.721 s]
[INFO] Apache Hadoop YARN NodeManager ..................... SUCCESS [01:52 min]
[INFO] Apache Hadoop YARN Web Proxy ....................... SUCCESS [ 8.200 s]
[INFO] Apache Hadoop YARN ApplicationHistoryService ....... SUCCESS [ 13.786 s]
[INFO] Apache Hadoop YARN Timeline Service ................ SUCCESS [ 9.491 s]
[INFO] Apache Hadoop YARN ResourceManager ................. SUCCESS [ 42.799 s]
[INFO] Apache Hadoop YARN Server Tests .................... SUCCESS [ 3.051 s]
[INFO] Apache Hadoop YARN Client .......................... SUCCESS [ 14.586 s]
[INFO] Apache Hadoop YARN SharedCacheManager .............. SUCCESS [ 4.966 s]
[INFO] Apache Hadoop YARN Timeline Plugin Storage ......... SUCCESS [ 4.702 s]
[INFO] Apache Hadoop YARN TimelineService HBase Backend ... SUCCESS [ 0.120 s]
[INFO] Apache Hadoop YARN TimelineService HBase Common .... SUCCESS [01:13 min]
[INFO] Apache Hadoop YARN TimelineService HBase Client .... SUCCESS [01:05 min]
[INFO] Apache Hadoop YARN TimelineService HBase Servers ... SUCCESS [ 0.135 s]
[INFO] Apache Hadoop YARN TimelineService HBase Server 1.2 SUCCESS [ 8.158 s]
[INFO] Apache Hadoop YARN TimelineService HBase tests ..... SUCCESS [01:13 min]
[INFO] Apache Hadoop YARN Router .......................... SUCCESS [ 7.110 s]
[INFO] Apache Hadoop YARN TimelineService DocumentStore ... SUCCESS [ 53.620 s]
[INFO] Apache Hadoop YARN Applications .................... SUCCESS [ 0.160 s]
[INFO] Apache Hadoop YARN DistributedShell ................ SUCCESS [ 5.134 s]
[INFO] Apache Hadoop YARN Unmanaged Am Launcher ........... SUCCESS [ 3.654 s]
[INFO] Apache Hadoop MapReduce Client ..................... SUCCESS [ 0.997 s]
[INFO] Apache Hadoop MapReduce Core ....................... SUCCESS [ 14.424 s]
[INFO] Apache Hadoop MapReduce Common ..................... SUCCESS [ 12.152 s]
[INFO] Apache Hadoop MapReduce Shuffle .................... SUCCESS [ 6.625 s]
[INFO] Apache Hadoop MapReduce App ........................ SUCCESS [ 17.115 s]
[INFO] Apache Hadoop MapReduce HistoryServer .............. SUCCESS [ 9.008 s]
[INFO] Apache Hadoop MapReduce JobClient .................. SUCCESS [ 11.086 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 2.669 s]
[INFO] Apache Hadoop YARN Services ........................ SUCCESS [ 0.122 s]
[INFO] Apache Hadoop YARN Services Core ................... SUCCESS [ 13.738 s]
[INFO] Apache Hadoop YARN Services API .................... SUCCESS [ 3.702 s]
[INFO] Apache Hadoop YARN Application Catalog ............. SUCCESS [ 0.102 s]
[INFO] Apache Hadoop YARN Application Catalog Webapp ...... SUCCESS [06:36 min]
[INFO] Apache Hadoop YARN Application Catalog Docker Image SUCCESS [ 0.160 s]
[INFO] Apache Hadoop YARN Application MaWo ................ SUCCESS [ 0.106 s]
[INFO] Apache Hadoop YARN Application MaWo Core ........... SUCCESS [ 4.532 s]
[INFO] Apache Hadoop YARN Site ............................ SUCCESS [ 0.095 s]
[INFO] Apache Hadoop YARN Registry ........................ SUCCESS [ 1.001 s]
[INFO] Apache Hadoop YARN UI .............................. SUCCESS [ 0.109 s]
[INFO] Apache Hadoop YARN CSI ............................. SUCCESS [ 48.179 s]
[INFO] Apache Hadoop YARN Project ......................... SUCCESS [ 27.848 s]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ...... SUCCESS [ 3.452 s]
[INFO] Apache Hadoop MapReduce NativeTask ................. SUCCESS [ 57.587 s]
[INFO] Apache Hadoop MapReduce Uploader ................... SUCCESS [ 3.501 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 7.639 s]
[INFO] Apache Hadoop MapReduce ............................ SUCCESS [ 9.828 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 11.495 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 7.635 s]
[INFO] Apache Hadoop Client Aggregator .................... SUCCESS [ 3.421 s]
[INFO] Apache Hadoop Dynamometer Workload Simulator ....... SUCCESS [ 5.070 s]
[INFO] Apache Hadoop Dynamometer Cluster Simulator ........ SUCCESS [ 6.368 s]
[INFO] Apache Hadoop Dynamometer Block Listing Generator .. SUCCESS [ 4.117 s]
[INFO] Apache Hadoop Dynamometer Dist ..................... SUCCESS [ 11.894 s]
[INFO] Apache Hadoop Dynamometer .......................... SUCCESS [ 0.100 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 4.507 s]
[INFO] Apache Hadoop Archive Logs ......................... SUCCESS [ 4.460 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 8.090 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 6.515 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 4.848 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 4.475 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 10.504 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 6.041 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [03:07 min]
[INFO] Apache Hadoop Kafka Library support ................ SUCCESS [ 14.470 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 18.661 s]
[INFO] Apache Hadoop Aliyun OSS support ................... SUCCESS [ 16.479 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 7.093 s]
[INFO] Apache Hadoop Resource Estimator Service ........... SUCCESS [ 23.205 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [ 9.833 s]
[INFO] Apache Hadoop Image Generation Tool ................ SUCCESS [ 4.874 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 32.465 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.124 s]
[INFO] Apache Hadoop Client API ........................... SUCCESS [04:30 min]
[INFO] Apache Hadoop Client Runtime ....................... SUCCESS [04:01 min]
[INFO] Apache Hadoop Client Packaging Invariants .......... SUCCESS [ 1.758 s]
[INFO] Apache Hadoop Client Test Minicluster .............. SUCCESS [06:40 min]
[INFO] Apache Hadoop Client Packaging Invariants for Test . SUCCESS [ 0.320 s]
[INFO] Apache Hadoop Client Packaging Integration Tests ... SUCCESS [ 4.331 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [01:37 min]
[INFO] Apache Hadoop Client Modules ....................... SUCCESS [ 0.113 s]
[INFO] Apache Hadoop Cloud Storage ........................ SUCCESS [ 1.141 s]
[INFO] Apache Hadoop Tencent COS Support .................. SUCCESS [ 14.005 s]
[INFO] Apache Hadoop Cloud Storage Project ................ SUCCESS [ 0.111 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:12 h
[INFO] Finished at: 2022-12-23T10:47:07+08:00
[INFO] ------------------------------------------------------------------------
The generated packages are placed under hadoop-dist/target/, where hadoop-3.3.4.tar.gz is the final distribution tarball.
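As a quick sanity check before deploying, the tarball can be inspected to confirm that the native libraries were actually bundled (a minimal sketch; it should list libhadoop.so and the other native .so files):
tar tzf hadoop-dist/target/hadoop-3.3.4.tar.gz | grep 'lib/native'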
Optional Extensions
ISA-L Support
ISA-L (Intelligent Storage Acceleration Library) is a storage acceleration library developed by Intel that can improve HDFS performance. It can be built on both ARMv8 (aarch64) and AMD64 (x86_64).
Support for this component is detected and enabled automatically at build time, so install the ISA-L library as described below before building, and the resulting Hadoop build will support ISA-L natively.
## Enable the PowerTools repository (install dnf-plugins-core first if the command is missing)
sudo dnf config-manager --set-enabled powertools
## Install build dependencies
sudo dnf install gcc make autoconf automake libtool git nasm
## Clone the source
git clone https://github.com/intel/isa-l
cd isa-l
## Build
./autogen.sh
./configure --prefix=/usr --libdir=/usr/lib64
make
sudo make install
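To verify that the library landed where the Hadoop build (and later hadoop checknative) can find it, a quick check might be (assuming the --prefix and --libdir used above):
sudo ldconfig
ldconfig -p | grep isal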
PMDK Support
PMDK (Persistent Memory Development Kit) lets HDFS read and write data through PMDK's user-space programming libraries, reducing user/kernel mode switches and filesystem overhead and improving the cluster's read/write performance.
Unlike the other extensions, PMDK support is not compiled in by default even when the libraries are detected on the system; an extra flag must be passed at build time. First install the required dependencies.
On amd64 (x86_64) they can be installed with the distribution's package manager:
## Runtime only (install on the deployment machines)
sudo dnf install -y libpmem librpmem libpmemblk libpmemlog libpmemobj libpmempool pmempool
## Development packages only (install on the build machine)
sudo dnf install -y libpmem-devel librpmem-devel libpmemblk-devel libpmemlog-devel libpmemobj-devel libpmemobj++-devel libpmempool-devel
On arm64 (aarch64) there are no pre-built packages, so PMDK has to be compiled manually:
## Install build dependencies
sudo dnf install -y ndctl-devel daxctl-devel pandoc cmake gcc-c++
## Clone the source
git clone https://github.com/pmem/pmdk
cd pmdk
## Build and install
make
sudo make install prefix=/usr
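A quick check that the PMDK libraries are now visible to the dynamic linker, so the -Drequire.pmdk build below can find them:
sudo ldconfig
ldconfig -p | grep libpmem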
Then build Hadoop with the following command:
mvn clean package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true -Drequire.pmdk -X
After the build finishes, deploy the new package and run the native check again:
$ hadoop checknative
2022-12-23 10:55:38,656 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
2022-12-23 10:55:38,661 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
2022-12-23 10:55:38,806 INFO nativeio.NativeIO: The native code was built with PMDK support, and PMDK libs were loaded successfully.
Native library checking:
hadoop: true /opt/hadoop-3.3.4/lib/native/libhadoop.so.1.0.0
zlib: true /lib64/libz.so.1
zstd : true /lib64/libzstd.so.1
bzip2: true /lib64/libbz2.so.1
openssl: true /usr/lib64/libcrypto.so.1.1
ISA-L: true /lib64/libisal.so.2
PMDK: true /usr/lib64/libpmem.so.1.0.0
Common Problems
a) What should I do if some libraries fail to download?
You can try switching to the Aliyun Maven mirror repository, or set up your own proxy and point Maven at it with:
export MAVEN_OPTS="-DproxyHost=127.0.0.1 -DproxyPort=8080"
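For the mirror approach, one option is a minimal ~/.m2/settings.xml pointing at the Aliyun repository (a sketch; merge it into any settings file you already maintain rather than overwriting it):
## Writes ~/.m2/settings.xml with a single mirror entry for Maven Central
mkdir -p ~/.m2
cat > ~/.m2/settings.xml <<'EOF'
<settings>
  <mirrors>
    <mirror>
      <id>aliyun</id>
      <mirrorOf>central</mirrorOf>
      <url>https://maven.aliyun.com/repository/public</url>
    </mirror>
  </mirrors>
</settings>
EOF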
b) Build error: Protobuf compiler version 2.5.0 doesn't match library version
Full output:
[INFO] --- hadoop-maven-plugins:3.2.0:cmake-compile (cmake-compile) @ hadoop-hdfs-native-client ---
...
[WARNING] Located all JNI components successfully.
[WARNING] CUSTOM_OPENSSL_PREFIX =
[WARNING] -- Performing Test THREAD_LOCAL_SUPPORTED
[WARNING] -- Performing Test THREAD_LOCAL_SUPPORTED - Success
[WARNING] CMake Error at /usr/share/cmake3/Modules/FindProtobuf.cmake:465 (file):
[WARNING] file STRINGS file "/usr/include/google/protobuf/stubs/common.h" cannot be
[WARNING] read.
[WARNING] Call Stack (most recent call first):
[WARNING] main/native/libhdfspp/CMakeLists.txt:45 (find_package)
[WARNING]
[WARNING]
[WARNING] CMake Error at /usr/share/cmake3/Modules/FindProtobuf.cmake:471 (math):
[WARNING] math cannot parse the expression: " / 1000000": syntax error, unexpected
[WARNING] exp_DIVIDE, expecting exp_PLUS or exp_MINUS or exp_OPENPARENT or exp_NUMBER
[WARNING] (2).
[WARNING] Call Stack (most recent call first):
[WARNING] main/native/libhdfspp/CMakeLists.txt:45 (find_package)
[WARNING]
[WARNING]
[WARNING] CMake Error at /usr/share/cmake3/Modules/FindProtobuf.cmake:472 (math):
[WARNING] math cannot parse the expression: " / 1000 % 1000": syntax error,
[WARNING] unexpected exp_DIVIDE, expecting exp_PLUS or exp_MINUS or exp_OPENPARENT or
[WARNING] exp_NUMBER (2).
[WARNING] Call Stack (most recent call first):
[WARNING] main/native/libhdfspp/CMakeLists.txt:45 (find_package)
[WARNING]
[WARNING]
[WARNING] CMake Error at /usr/share/cmake3/Modules/FindProtobuf.cmake:473 (math):
[WARNING] math cannot parse the expression: " % 1000": syntax error, unexpected
[WARNING] exp_MOD, expecting exp_PLUS or exp_MINUS or exp_OPENPARENT or exp_NUMBER
[WARNING] (2).
[WARNING] Call Stack (most recent call first):
[WARNING] main/native/libhdfspp/CMakeLists.txt:45 (find_package)
[WARNING]
[WARNING]
[WARNING] CMake Warning at /usr/share/cmake3/Modules/FindProtobuf.cmake:495 (message):
[WARNING] Protobuf compiler version 2.5.0 doesn't match library version
[WARNING] ERROR.ERROR.ERROR
[WARNING] Call Stack (most recent call first):
[WARNING] main/native/libhdfspp/CMakeLists.txt:45 (find_package)
[WARNING]
[WARNING]
[WARNING] -- Could NOT find GSASL (missing: GSASL_LIBRARIES GSASL_INCLUDE_DIR)
[WARNING] -- Performing Test THREAD_LOCAL_SUPPORTED
[WARNING] -- Performing Test THREAD_LOCAL_SUPPORTED - Success
[WARNING] -- Performing Test PROTOC_IS_COMPATIBLE
[WARNING] -- Performing Test PROTOC_IS_COMPATIBLE - Failed
[WARNING] CMake Warning at main/native/libhdfspp/CMakeLists.txt:86 (message):
[WARNING] WARNING: the Protocol Buffers Library and the Libhdfs++ Library must both
[WARNING] be compiled with the same (or compatible) compiler. Normally only the same
[WARNING] major versions of the same compiler are compatible with each other.
[WARNING]
[WARNING]
[WARNING] -- valgrind location: MEMORYCHECK_COMMAND-NOTFOUND
[WARNING] -- Using Cyrus SASL; link with /usr/lib64/libsasl2.so
This happens because the required dependencies were not installed and protobuf-devel is missing; install it manually.
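For example, on the build machine, then re-run the Maven build:
sudo dnf install -y protobuf-devel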
c) Build error: Package 'libtirpc', required by 'virtual:world', not found
Full output:
[WARNING] CMake Warning (dev) in CMakeLists.txt:
[WARNING] No project() command is present. The top-level CMakeLists.txt file must
[WARNING] contain a literal, direct call to the project() command. Add a line of
[WARNING] code such as
[WARNING]
[WARNING] project(ProjectName)
[WARNING]
[WARNING] near the top of the file, but after cmake_minimum_required().
[WARNING]
[WARNING] CMake is pretending there is a "project(Project)" command on the first
[WARNING] line.
[WARNING] This warning is for project developers. Use -Wno-dev to suppress it.
[WARNING]
[WARNING] -- The C compiler identification is GNU 8.5.0
[WARNING] -- The CXX compiler identification is GNU 8.5.0
[WARNING] -- Detecting C compiler ABI info
[WARNING] -- Detecting C compiler ABI info - done
[WARNING] -- Check for working C compiler: /usr/bin/cc - skipped
[WARNING] -- Detecting C compile features
[WARNING] -- Detecting C compile features - done
[WARNING] -- Detecting CXX compiler ABI info
[WARNING] -- Detecting CXX compiler ABI info - done
[WARNING] -- Check for working CXX compiler: /usr/bin/c++ - skipped
[WARNING] -- Detecting CXX compile features
[WARNING] -- Detecting CXX compile features - done
[WARNING] -- Looking for pthread.h
[WARNING] -- Looking for pthread.h - found
[WARNING] -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
[WARNING] -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
[WARNING] -- Looking for pthread_create in pthreads
[WARNING] -- Looking for pthread_create in pthreads - not found
[WARNING] -- Looking for pthread_create in pthread
[WARNING] -- Looking for pthread_create in pthread - found
[WARNING] -- Found Threads: TRUE
[WARNING] -- Found OpenSSL: /usr/lib64/libcrypto.so (found version "1.1.1k")
[WARNING] -- Checking for module 'libtirpc'
[WARNING] -- Package 'libtirpc', required by 'virtual:world', not found
[WARNING] -- Looking for dlopen in dl
[WARNING] -- Looking for dlopen in dl - found
[WARNING] -- Configuring done
[WARNING] CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
[WARNING] Please set them or make sure they are set and tested correctly in the CMake files:
This also happens because the required dependencies were not installed and libtirpc-devel is missing; install it manually.
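For example, on the build machine, then re-run the Maven build:
sudo dnf install -y libtirpc-devel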
d) hadoop checknative reports false for some components, for example:
$ hadoop checknative
Native library checking:
hadoop: true /opt/hadoop-3.3.4/lib/native/libhadoop.so.1.0.0
zlib: true /lib64/libz.so.1
zstd : true /lib64/libzstd.so.1
bzip2: true /lib64/libbz2.so.1
openssl: false Cannot load libcrypto.so (libcrypto.so: cannot open shared object file: No such file or directory)!
ISA-L: false Loading ISA-L failed: Failed to load libisal.so.2 (libisal.so.2: cannot open shared object file: No such file or directory)
PMDK: false The native code was built without PMDK support.
The components and their corresponding packages are listed in the table below:

| Object Name | Package Name | Source Name |
| --- | --- | --- |
| zlib | zlib-devel | / |
| zstd | libzstd-devel | / |
| bzip2 | bzip2-devel | / |
| openssl | openssl-devel | / |
| ISA-L | / | https://github.com/intel/isa-l |
| PMDK | / | https://pmem.io |
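In other words, install the corresponding devel package (or build ISA-L/PMDK as described above) on the build machine and rebuild; the deployment machine also needs the matching shared libraries to be resolvable. A hedged example for the openssl entry shown above, assuming the unversioned libcrypto.so symlink is what is missing (it is shipped in openssl-devel):
## Provide libcrypto.so, then re-run the check
sudo dnf install -y openssl-devel
hadoop checknative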
e) Running the Java version check fails with Unable to load native library. The full error:
Error occurred during initialization of VM
Unable to load native library: libnsl.so.1: cannot open shared object file: No such file or directory
The environment has a dependency problem, caused either by a failed installation or by some packages being removed by mistake. It can be fixed manually:
sudo dnf install libnsl
export LC_ALL=en_US