FreeBSD Bugzilla – Attachment 206104 Details for
Bug 231048
[new port] devel/hadoop3: Apache Map/Reduce Framework v3.2
Description: CRF for hadoop 3.2
Filename: hadoop32.shar
MIME Type: text/plain
Creator: Johannes Jost Meixner
Created: 2019-07-27 20:20:32 UTC
Size: 130.53 KB
># This is a shell archive. Save it in a file, remove anything before
># this line, and then unpack it by entering "sh file". Note, it may
># create directories; files and directories will be owned by you and
># have default permissions.
>#
># This archive contains:
>#
># hadoop3
># hadoop3/pkg-descr
># hadoop3/files
># hadoop3/files/kms-env.sh.in
># hadoop3/files/datanode.in
># hadoop3/files/secondarynamenode.in
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_CMakeLists.txt
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_SpillInfo.cc
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_util_StringUtil.cc
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_handler_MCollectorOutputHandler.cc
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_TestSort.cc
># hadoop3/files/nodemanager.in
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_MapOutputCollector.cc
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_codec_SnappyCodec.cc
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_IFile.h
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-core_src_main_java_org_apache_hadoop_mapred_TaskLog.java
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_IFile.cc
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_lib_TestKVBuffer.cc
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_lib_TestMemoryBlock.cc
># hadoop3/files/patch-hadoop-hdfs-project_hadoop-hdfs-native-client_src_main_native_libhdfspp_third__party_tr2_optional.hpp
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_java_org_apache_hadoop_mapred_nativetask_INativeComparable.java
># hadoop3/files/resourcemanager.in
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_TestIFile.cc
># hadoop3/files/journalnode.in
># hadoop3/files/hadoop-layout.sh.in
># hadoop3/files/namenode.in
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_handler_CombineHandler.cc
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_Iterator.cc
># hadoop3/files/patch-hadoop-hdfs-project_hadoop-hdfs-native-client_pom.xml
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-core_src_main_java_org_apache_hadoop_mapreduce_util_ProcessTree.java
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_codec_BlockCodec.cc
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_commons.h
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_util_WritableUtils.cc
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_TestCompressions.cc
># hadoop3/files/patch-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager_src_main_native_container-executor_test_test__configuration.cc
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_Buffers.h
># hadoop3/files/patch-hadoop-common-project-hadoop-common-src-main-java-org-apache-hadoop-util-StringUtils.java
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_Buffers.cc
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_codec_Lz4Codec.cc
># hadoop3/files/httpfs-env.sh.in
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_lib_TestMemBlockIterator.cc
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_lib_TestPartitionBucket.cc
># hadoop3/files/patch-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager_src_main_native_container-executor_impl_configuration.c
># hadoop3/files/historyserver.in
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_handler_BatchHandler.h
># hadoop3/files/patch-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager_pom.xml
># hadoop3/files/patch-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager_src_main_java_org_apache_hadoop_yarn_server_nodemanager_DefaultContainerExecutor.java
># hadoop3/files/patch-hadoop-common-project_hadoop-common_src_main_java_org_apache_hadoop_util_Shell.java
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_NativeObjectFactory.cc
># hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_TestPrimitives.cc
># hadoop3/files/patch-hadoop-common-project_hadoop-common_src_main_java_org_apache_hadoop_io_nativeio_SharedFileDescriptorFactory.java
># hadoop3/files/zkfc.in
># hadoop3/files/patch-hadoop-hdfs-project_hadoop-hdfs-native-client_src_main_native_libhdfspp_CMakeLists.txt
># hadoop3/files/webappproxyserver.in
># hadoop3/Makefile
># hadoop3/.Makefile.swp
># hadoop3/distinfo
># hadoop3/pkg-plist
>#
>echo c - hadoop3
>mkdir -p hadoop3 > /dev/null 2>&1
>echo x - hadoop3/pkg-descr
>sed 's/^X//' >hadoop3/pkg-descr << '8c327185cc3098b8f790f29095cad92a'
>XThe Apache Hadoop software library is a framework that allows for the
>Xdistributed processing of large data sets across clusters of computers
>Xusing a simple programming model.
>X
>XWWW: http://hadoop.apache.org/
>8c327185cc3098b8f790f29095cad92a
>echo c - hadoop3/files
>mkdir -p hadoop3/files > /dev/null 2>&1
>echo x - hadoop3/files/kms-env.sh.in
>sed 's/^X//' >hadoop3/files/kms-env.sh.in << 'b2ce75549e9b8edd6e9e7e03315395e8'
>X# $FreeBSD$
>X
>Xexport KMS_LOG=/var/log/hadoop
>Xexport KMS_TEMP=/var/tmp
>b2ce75549e9b8edd6e9e7e03315395e8
>echo x - hadoop3/files/datanode.in
>sed 's/^X//' >hadoop3/files/datanode.in << '045a79067b6d6c89d8ca2c2ae11a8d0c'
>X#!/bin/sh
>X#
>X# $FreeBSD$
>X#
>X# PROVIDE: datanode
>X# REQUIRE: LOGIN
>X# KEYWORD: shutdown
>X#
>X# datanode_enable (bool): Set to NO by default.
>X# Set it to YES to enable datanode.
>X
>X. 
/etc/rc.subr >X >Xexport PATH=${PATH}:%%LOCALBASE%%/bin >Xname=datanode >Xrcvar=datanode_enable >Xpidfile=%%HADOOP_RUNDIR%%/hadoop-%%HDFS_USER%%-${name}.pid >X >Xload_rc_config "${name}" >X >X: ${datanode_enable:=NO} >X: ${datanode_user:=%%HDFS_USER%%} >X >Xcommand="%%PREFIX%%/sbin/hadoop-daemon.sh" >Xcommand_interpreter_execution="%%JAVA_HOME%%/bin/java" >Xcommand_args='--config %%ETCDIR%% start datanode' >X >Xstart_postcmd="start_postcmd" >Xstop_cmd=datanode_stop >Xstatus_precmd=find_pid >X >Xstart_postcmd () { >X rc_pid=$(check_pidfile ${pidfile} %%JAVA_HOME%%/bin/java) >X if [ -n "$rc_pid" ]; then >X protect -p $rc_pid >X fi >X} >X >Xdatanode_stop () { >X su -m ${datanode_user} -c "${command} --config %%ETCDIR%% stop datanode" >X} >X >Xfind_pid () { >X rc_pid=$(check_pidfile $pidfile $command_interpreter_execution) >X} >X >Xrun_rc_command "$1" >045a79067b6d6c89d8ca2c2ae11a8d0c >echo x - hadoop3/files/secondarynamenode.in >sed 's/^X//' >hadoop3/files/secondarynamenode.in << '84a1e61aea086c862879312e885ec73d' >X#!/bin/sh >X# >X# $FreeBSD$ >X# >X# PROVIDE: secondarynamenode >X# REQUIRE: LOGIN >X# KEYWORD: shutdown >X# >X# secondarynamenode_enable (bool): Set to NO by default. >X# Set it to YES to enable secondarynamenode. >X >X. 
/etc/rc.subr >X >Xexport PATH=${PATH}:%%LOCALBASE%%/bin >Xname=secondarynamenode >Xrcvar=secondarynamenode_enable >X >Xload_rc_config "${name}" >X >X: ${secondarynamenode_enable:=NO} >X: ${secondarynamenode_user:=%%HDFS_USER%%} >X >Xcommand="%%PREFIX%%/sbin/hadoop-daemon.sh" >Xcommand_args='--config %%ETCDIR%% start secondarynamenode' >X >Xstop_cmd=secondarynamenode_stop >X >Xsecondarynamenode_stop () { >X su -m ${secondarynamenode_user} -c "${command} --config %%ETCDIR%% stop secondarynamenode" >X} >X >Xrun_rc_command "$1" >84a1e61aea086c862879312e885ec73d >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_CMakeLists.txt >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_CMakeLists.txt << 'fa04ec44091fe0bd1ce43e3fac197c19' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/CMakeLists.txt.orig 2018-03-21 17:57:56 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/CMakeLists.txt >X@@ -27,6 +27,7 @@ set(GTEST_SRC_DIR ${CMAKE_SOURCE_DIR}/../../../../hado >X # Add extra compiler and linker flags. >X # -Wno-sign-compare >X hadoop_add_compiler_flags("-DNDEBUG -DSIMPLE_MEMCPY -fno-strict-aliasing -fsigned-char") >X+hadoop_add_linker_flags("-lexecinfo") >X >X # Source location. 
>X set(SRC main/native) >X@@ -45,7 +46,6 @@ include(CheckIncludeFiles) >X >X check_include_files(fcntl.h HAVE_FCNTL_H) >X check_include_files(malloc.h HAVE_MALLOC_H) >X-check_include_files(mach/mach.h HAVE_MACH_MACH_H) >X check_include_files(memory.h HAVE_MEMORY_H) >X check_include_files(stddef.h HAVE_STDDEF_H) >X check_include_files(stdint.h HAVE_STDINT_H) >fa04ec44091fe0bd1ce43e3fac197c19 >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_SpillInfo.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_SpillInfo.cc << '03fe1f57b45fae9c9a6a5f3d292ea2e7' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/SpillInfo.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/SpillInfo.cc >X@@ -58,7 +58,7 @@ void SingleSpillInfo::writeSpillInfo(const std::string >X appendBuffer.flush(); >X uint32_t chsum = dest.getChecksum(); >X #ifdef SPILLRECORD_CHECKSUM_UINT >X- chsum = bswap(chsum); >X+ chsum = bswap32(chsum); >X fout->write(&chsum, sizeof(uint32_t)); >X #else >X uint64_t wtchsum = bswap64((uint64_t)chsum); >03fe1f57b45fae9c9a6a5f3d292ea2e7 >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_util_StringUtil.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_util_StringUtil.cc << '9962fd7d251af64ea453372214d61fae' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/util/StringUtil.cc.orig 2019-04-21 10:28:49 UTC >X+++ 
hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/util/StringUtil.cc >X@@ -36,19 +36,19 @@ string StringUtil::ToString(uint32_t v) { >X >X string StringUtil::ToString(int64_t v) { >X char tmp[32]; >X- snprintf(tmp, 32, "%"PRId64, v); >X+ snprintf(tmp, 32, "%" PRId64, v); >X return tmp; >X } >X >X string StringUtil::ToString(int64_t v, char pad, int64_t len) { >X char tmp[32]; >X- snprintf(tmp, 32, "%%%c%"PRId64""PRId64, pad, len); >X+ snprintf(tmp, 32, "%%%c%" PRId64"" PRId64, pad, len); >X return Format(tmp, v); >X } >X >X string StringUtil::ToString(uint64_t v) { >X char tmp[32]; >X- snprintf(tmp, 32, "%"PRIu64, v); >X+ snprintf(tmp, 32, "%" PRIu64, v); >X return tmp; >X } >X >9962fd7d251af64ea453372214d61fae >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_handler_MCollectorOutputHandler.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_handler_MCollectorOutputHandler.cc << '55e6d8440134aba8c768b7686bcd1ad9' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/handler/MCollectorOutputHandler.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/handler/MCollectorOutputHandler.cc >X@@ -74,9 +74,9 @@ void MCollectorOutputHandler::handleInput(ByteBuffer & >X } >X >X if (_endium == LARGE_ENDIUM) { >X- kvBuffer->partitionId = bswap(kvBuffer->partitionId); >X- kvBuffer->buffer.keyLength = bswap(kvBuffer->buffer.keyLength); >X- kvBuffer->buffer.valueLength = bswap(kvBuffer->buffer.valueLength); >X+ kvBuffer->partitionId = bswap32(kvBuffer->partitionId); >X+ kvBuffer->buffer.keyLength = bswap32(kvBuffer->buffer.keyLength); >X+ kvBuffer->buffer.valueLength = bswap32(kvBuffer->buffer.valueLength); >X } 
>X >X uint32_t kvLength = kvBuffer->buffer.length(); >55e6d8440134aba8c768b7686bcd1ad9 >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_TestSort.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_TestSort.cc << 'fcd858a549e1e3da0a0112216c5129c0' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/TestSort.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/TestSort.cc >X@@ -121,7 +121,7 @@ static int compare_offset2(const void * plh, const voi >X KVBuffer * rhb = (KVBuffer*)get_position(*(uint32_t*)prh); >X >X uint32_t minlen = std::min(lhb->keyLength, rhb->keyLength); >X- int64_t ret = fmemcmp(lhb->content, rhb->content, minlen); >X+ int64_t ret = memcmp(lhb->content, rhb->content, minlen); >X if (ret) { >X return ret; >X } >X@@ -139,7 +139,7 @@ class CompareOffset2 { >X KVBuffer * rhb = (KVBuffer*)get_position(rhs); >X >X uint32_t minlen = std::min(lhb->keyLength, rhb->keyLength); >X- int64_t ret = fmemcmp(lhb->content, rhb->content, minlen); >X+ int64_t ret = memcmp(lhb->content, rhb->content, minlen); >X if (ret) { >X return ret; >X } >X@@ -158,7 +158,7 @@ class OffsetLessThan2 { >X KVBuffer * rhb = (KVBuffer*)get_position(rhs); >X >X uint32_t minlen = std::min(lhb->keyLength, rhb->keyLength); >X- int64_t ret = fmemcmp(lhb->content, rhb->content, minlen); >X+ int64_t ret = memcmp(lhb->content, rhb->content, minlen); >X return ret < 0 || (ret == 0 && (lhb->keyLength < rhb->keyLength)); >X } >X }; >fcd858a549e1e3da0a0112216c5129c0 >echo x - hadoop3/files/nodemanager.in >sed 's/^X//' >hadoop3/files/nodemanager.in << '6b5ee6be9f17ed93db846658be80a055' >X#!/bin/sh >X# >X# $FreeBSD$ >X# >X# PROVIDE: nodemanager >X# REQUIRE: LOGIN >X# KEYWORD: 
shutdown
>X#
>X# nodemanager_enable (bool): Set to NO by default.
>X# Set it to YES to enable nodemanager.
>X
>X. /etc/rc.subr
>X
>Xexport PATH=${PATH}:%%LOCALBASE%%/bin
>Xname=nodemanager
>Xrcvar=nodemanager_enable
>Xpidfile=%%HADOOP_RUNDIR%%/yarn-yarn-${name}.pid
>X
>Xload_rc_config "${name}"
>X
>X: ${nodemanager_enable:=NO}
>X: ${nodemanager_user:=%%MAPRED_USER%%}
>X
>Xcommand="%%PREFIX%%/sbin/yarn-daemon.sh"
>Xcommand_interpreter_execution="%%JAVA_HOME%%/bin/java"
>Xcommand_args='--config %%ETCDIR%% start nodemanager'
>X
>Xstart_postcmd="start_postcmd"
>Xstop_cmd=nodemanager_stop
>Xstatus_precmd=find_pid
>X
>Xstart_postcmd () {
>X  rc_pid=$(check_pidfile ${pidfile} %%JAVA_HOME%%/bin/java)
>X  if [ -n "$rc_pid" ]; then
>X    protect -p $rc_pid
>X  fi
>X}
>X
>Xnodemanager_stop () {
>X  su -m ${nodemanager_user} -c "${command} --config %%ETCDIR%% stop nodemanager"
>X}
>X
>Xfind_pid () {
>X  rc_pid=$(check_pidfile $pidfile $command_interpreter_execution)
>X}
>X
>Xrun_rc_command "$1"
>6b5ee6be9f17ed93db846658be80a055
>echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_MapOutputCollector.cc
>sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_MapOutputCollector.cc << 'b8b15274f0b49749d848b058b7f9677f'
>X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/MapOutputCollector.cc.orig 2019-04-21 10:25:54 UTC
>X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/MapOutputCollector.cc
>X@@ -302,10 +302,10 @@ void MapOutputCollector::middleSpill(const std::string
>X   uint64_t spillTime = timer.now() - timer.last() - metrics.sortTime;
>X
>X   const uint64_t M = 1000000; // million
>X-  LOG("%s-spill: { id: %d, collect: %"PRIu64" ms, "
>X-      "in-memory sort: %"PRIu64" ms, in-memory records: 
%"PRIu64", " >X- "merge&spill: %"PRIu64" ms, uncompressed size: %"PRIu64", " >X- "real size: %"PRIu64" path: %s }", >X+ LOG("%s-spill: { id: %d, collect: %" PRIu64" ms, " >X+ "in-memory sort: %" PRIu64" ms, in-memory records: %" PRIu64", " >X+ "merge&spill: %" PRIu64" ms, uncompressed size: %" PRIu64", " >X+ "real size: %" PRIu64" path: %s }", >X final ? "Final" : "Mid", >X _spillInfos.getSpillCount(), >X collecttime / M, >X@@ -370,10 +370,10 @@ void MapOutputCollector::finalSpill(const std::string >X writer->getStatistics(outputSize, realOutputSize, recordCount); >X >X const uint64_t M = 1000000; // million >X- LOG("Final-merge-spill: { id: %d, in-memory sort: %"PRIu64" ms, " >X- "in-memory records: %"PRIu64", merge&spill: %"PRIu64" ms, " >X- "records: %"PRIu64", uncompressed size: %"PRIu64", " >X- "real size: %"PRIu64" path: %s }", >X+ LOG("Final-merge-spill: { id: %d, in-memory sort: %" PRIu64" ms, " >X+ "in-memory records: %" PRIu64", merge&spill: %" PRIu64" ms, " >X+ "records: %" PRIu64", uncompressed size: %" PRIu64", " >X+ "real size: %" PRIu64" path: %s }", >X _spillInfos.getSpillCount(), >X metrics.sortTime / M, >X metrics.recordCount, >b8b15274f0b49749d848b058b7f9677f >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_codec_SnappyCodec.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_codec_SnappyCodec.cc << 'bb060687ae7ce8793c7d2f2c595caf3c' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/codec/SnappyCodec.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/codec/SnappyCodec.cc >X@@ -37,8 +37,8 @@ void SnappyCompressStream::compressOneBlock(const void >X snappy_status ret = snappy_compress((const char*)buff, length, _tempBuffer + 8, 
>X &compressedLength); >X if (ret == SNAPPY_OK) { >X- ((uint32_t*)_tempBuffer)[0] = bswap(length); >X- ((uint32_t*)_tempBuffer)[1] = bswap((uint32_t)compressedLength); >X+ ((uint32_t*)_tempBuffer)[0] = bswap32(length); >X+ ((uint32_t*)_tempBuffer)[1] = bswap32((uint32_t)compressedLength); >X _stream->write(_tempBuffer, compressedLength + 8); >X _compressedBytesWritten += (compressedLength + 8); >X } else if (ret == SNAPPY_INVALID_INPUT) { >bb060687ae7ce8793c7d2f2c595caf3c >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_IFile.h >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_IFile.h << 'b9494243e3f4ab22afe84fa9f05845ad' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/IFile.h.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/IFile.h >X@@ -74,7 +74,7 @@ class IFileReader { (public) >X keyLen = WritableUtils::ReadVInt(kvbuff, len); >X break; >X case BytesType: >X- keyLen = bswap(*(uint32_t*)kvbuff); >X+ keyLen = bswap32(*(uint32_t*)kvbuff); >X len = 4; >X break; >X default: >X@@ -89,7 +89,7 @@ class IFileReader { (public) >X _valuePos = vbuff + len; >X break; >X case BytesType: >X- _valueLen = bswap(*(uint32_t*)vbuff); >X+ _valueLen = bswap32(*(uint32_t*)vbuff); >X _valuePos = vbuff + 4; >X break; >X default: >b9494243e3f4ab22afe84fa9f05845ad >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-core_src_main_java_org_apache_hadoop_mapred_TaskLog.java >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-core_src_main_java_org_apache_hadoop_mapred_TaskLog.java << 'e2f3462da1aa613e68cfb64ca3652356' >X--- 
hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapred/TaskLog.java.orig 2019-04-21 12:21:44 UTC
>X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapred/TaskLog.java
>X@@ -546,7 +546,7 @@ public class TaskLog {
>X       mergedCmd.append("(");
>X     } else if(ProcessTree.isSetsidAvailable && useSetsid &&
>X         !Shell.WINDOWS) {
>X-      mergedCmd.append("exec setsid ");
>X+      mergedCmd.append("exec ");
>X     } else {
>X       mergedCmd.append("exec ");
>X     }
>e2f3462da1aa613e68cfb64ca3652356
>echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_IFile.cc
>sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_IFile.cc << 'c5c9f9640be97e553829d47fa8711679'
>X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/IFile.cc.orig 2018-10-18 18:38:39 UTC
>X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/IFile.cc
>X@@ -60,7 +60,7 @@ bool IFileReader::nextPartition() {
>X     if (4 != _stream->readFully(&chsum, 4)) {
>X       THROW_EXCEPTION(IOException, "read ifile checksum failed");
>X     }
>X-    uint32_t actual = bswap(chsum);
>X+    uint32_t actual = bswap32(chsum);
>X     uint32_t expect = _source->getChecksum();
>X     if (actual != expect) {
>X       THROW_EXCEPTION_EX(IOException, "read ifile checksum not match, actual %x expect %x", actual,
>X@@ -130,7 +130,7 @@ void IFileWriter::endPartition() {
>X   }
>X
>X   uint32_t chsum = _dest->getChecksum();
>X-  chsum = bswap(chsum);
>X+  chsum = bswap32(chsum);
>X   _stream->write(&chsum, sizeof(chsum));
>X   _stream->flush();
>X   IFileSegment * info = &(_spillFileSegments[_spillFileSegments.size() - 1]);
>c5c9f9640be97e553829d47fa8711679
>echo x - 
hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_lib_TestKVBuffer.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_lib_TestKVBuffer.cc << '1a0ac08f8898a2996f5c88ec877432dc' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/lib/TestKVBuffer.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/lib/TestKVBuffer.cc >X@@ -43,8 +43,8 @@ TEST(KVBuffer, test) { >X ASSERT_EQ(8, kv1->getKey() - buff); >X ASSERT_EQ(strlen(KEY) + 8, kv1->getValue() - buff); >X >X- kv1->keyLength = bswap(kv1->keyLength); >X- kv1->valueLength = bswap(kv1->valueLength); >X+ kv1->keyLength = bswap32(kv1->keyLength); >X+ kv1->valueLength = bswap32(kv1->valueLength); >X >X ASSERT_EQ(8, kv1->headerLength()); >X ASSERT_EQ(strlen(KEY) + strlen(VALUE) + 8, kv1->lengthConvertEndium()); >1a0ac08f8898a2996f5c88ec877432dc >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_lib_TestMemoryBlock.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_lib_TestMemoryBlock.cc << 'e33323514b61d91ae4a3a0e7b01103b1' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/lib/TestMemoryBlock.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/lib/TestMemoryBlock.cc >X@@ -85,17 +85,17 @@ TEST(MemoryBlock, sort) { >X medium->keyLength = 4; >X medium->valueLength = 4; >X uint32_t * mediumKey = (uint32_t *)medium->getKey(); >X- *mediumKey = bswap(MEDIUM); >X+ *mediumKey = bswap32(MEDIUM); >X >X 
small->keyLength = 4;
>X   small->valueLength = 4;
>X   uint32_t * smallKey = (uint32_t *)small->getKey();
>X-  *smallKey = bswap(SMALL);
>X+  *smallKey = bswap32(SMALL);
>X
>X   big->keyLength = 4;
>X   big->valueLength = 4;
>X   uint32_t * bigKey = (uint32_t *)big->getKey();
>X-  *bigKey = bswap(BIG);
>X+  *bigKey = bswap32(BIG);
>X
>X   ComparatorPtr bytesComparator = NativeTask::get_comparator(BytesType, NULL);
>X   block.sort(CPPSORT, bytesComparator);
>e33323514b61d91ae4a3a0e7b01103b1
>echo x - hadoop3/files/patch-hadoop-hdfs-project_hadoop-hdfs-native-client_src_main_native_libhdfspp_third__party_tr2_optional.hpp
>sed 's/^X//' >hadoop3/files/patch-hadoop-hdfs-project_hadoop-hdfs-native-client_src_main_native_libhdfspp_third__party_tr2_optional.hpp << '1b09dd0397506765cd03c73fca45ab05'
>X--- hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp.orig 2019-04-21 09:03:51 UTC
>X+++ hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/third_party/tr2/optional.hpp
>X@@ -199,10 +199,10 @@ template <class T> inline constexpr typename std::remo
>X #if defined NDEBUG
>X # define TR2_OPTIONAL_ASSERTED_EXPRESSION(CHECK, EXPR) (EXPR)
>X #elif defined __clang__ || defined __GNU_LIBRARY__
>X-# define TR2_OPTIONAL_ASSERTED_EXPRESSION(CHECK, EXPR) ((CHECK) ? (EXPR) : (fail(#CHECK, __FILE__, __LINE__), (EXPR)))
>X-  inline void fail(const char* expr, const char* file, int line)
>X+# define TR2_OPTIONAL_ASSERTED_EXPRESSION(CHECK, EXPR) ((CHECK) ? (EXPR) : (fail(#CHECK, __FILE__, __func__, __LINE__), (EXPR)))
>X+  inline void fail(const char* expr, const char* file, const char* func, int line)
>X   {
>X-    __assert(expr, file, line);
>X+    __assert(func, file, line, expr);
>X   }
>X #elif defined __GNUC__
>X # define TR2_OPTIONAL_ASSERTED_EXPRESSION(CHECK, EXPR) ((CHECK) ? 
(EXPR) : (fail(#CHECK, __FILE__, __LINE__), (EXPR))) >1b09dd0397506765cd03c73fca45ab05 >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_java_org_apache_hadoop_mapred_nativetask_INativeComparable.java >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_java_org_apache_hadoop_mapred_nativetask_INativeComparable.java << '364dd0fc27768e159a719a9a4b4fc1b8' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/java/org/apache/hadoop/mapred/nativetask/INativeComparable.java.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/java/org/apache/hadoop/mapred/nativetask/INativeComparable.java >X@@ -42,8 +42,8 @@ import org.apache.hadoop.classification.InterfaceStabi >X * <code> >X * int HivePlatform::HiveKeyComparator(const char * src, uint32_t srcLength, >X * const char * dest, uint32_t destLength) { >X- * uint32_t sl = bswap(*(uint32_t*)src); >X- * uint32_t dl = bswap(*(uint32_t*)dest); >X+ * uint32_t sl = bswap32(*(uint32_t*)src); >X+ * uint32_t dl = bswap32(*(uint32_t*)dest); >X * return NativeObjectFactory::BytesComparator(src + 4, sl, dest + 4, dl); >X * } >X * </code> >364dd0fc27768e159a719a9a4b4fc1b8 >echo x - hadoop3/files/resourcemanager.in >sed 's/^X//' >hadoop3/files/resourcemanager.in << '18af4d0efbf53e1f356f9b2ae93748dd' >X#!/bin/sh >X# >X# $FreeBSD$ >X# >X# PROVIDE: resourcemanager >X# REQUIRE: LOGIN >X# KEYWORD: shutdown >X# >X# resourcemanager_enable (bool): Set to NO by default. >X# Set it to YES to enable resourcemanager. >X >X. 
/etc/rc.subr >X >Xexport PATH=${PATH}:%%LOCALBASE%%/bin >Xname=resourcemanager >Xrcvar=resourcemanager_enable >X >Xload_rc_config "${name}" >X >X: ${resourcemanager_enable:=NO} >X: ${resourcemanager_user:=%%MAPRED_USER%%} >X >Xcommand="%%PREFIX%%/sbin/yarn-daemon.sh" >Xcommand_interpreter_execution="%%JAVA_HOME%%/bin/java" >Xcommand_args='--config %%ETCDIR%% start resourcemanager' >X >Xstop_cmd=resourcemanager_stop >Xstart_postcmd="start_postcmd" >Xstatus_precmd=find_pid >X >Xresourcemanager_stop () { >X su -m ${resourcemanager_user} -c "${command} --config %%ETCDIR%% stop resourcemanager" >X} >X >Xfind_pid () { >X rc_pid=$(check_pidfile $pidfile $command_interpreter_execution) >X} >X >Xrun_rc_command "$1" >18af4d0efbf53e1f356f9b2ae93748dd >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_TestIFile.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_TestIFile.cc << '8567d6582631be97914a2101b836cc62' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/TestIFile.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/TestIFile.cc >X@@ -190,7 +190,7 @@ TEST(IFile, TestGlibCBug) { >X reader->nextPartition(); >X uint32_t index = 0; >X while (NULL != (key = reader->nextKey(length))) { >X- int32_t realKey = (int32_t)bswap(*(uint32_t *)(key)); >X+ int32_t realKey = (int32_t)bswap32(*(uint32_t *)(key)); >X ASSERT_LT(index, 5); >X ASSERT_EQ(expect[index], realKey); >X index++; >8567d6582631be97914a2101b836cc62 >echo x - hadoop3/files/journalnode.in >sed 's/^X//' >hadoop3/files/journalnode.in << '3ff8a670256b6ee51b628d0ef42ab36d' >X#!/bin/sh >X# >X# $FreeBSD$ >X# >X# PROVIDE: journalnode >X# REQUIRE: LOGIN >X# KEYWORD: shutdown >X# >X# journalnode_enable 
(bool): Set to NO by default. >X# Set it to YES to enable journalnode. >X >X. /etc/rc.subr >X >Xexport PATH=${PATH}:%%LOCALBASE%%/bin >Xname=journalnode >Xrcvar=journalnode_enable >X >Xload_rc_config "${name}" >X >X: ${journalnode_enable:=NO} >X: ${journalnode_user:=%%HDFS_USER%%} >X >Xcommand="%%PREFIX%%/sbin/hadoop-daemon.sh" >Xcommand_interpreter_execution="%%JAVA_HOME%%/bin/java" >Xcommand_args='--config %%ETCDIR%% start journalnode' >X >Xstop_cmd=journalnode_stop >Xstatus_precmd=find_pid >X >Xjournalnode_stop () { >X su -m ${journalnode_user} -c "${command} --config %%ETCDIR%% stop journalnode" >X} >X >Xfind_pid () { >X rc_pid=$(check_pidfile $pidfile $command_interpreter_execution) >X} >X >Xrun_rc_command "$1" >3ff8a670256b6ee51b628d0ef42ab36d >echo x - hadoop3/files/hadoop-layout.sh.in >sed 's/^X//' >hadoop3/files/hadoop-layout.sh.in << '9d97ebd9c784e7ae9556b2d90e0769b5' >Xexport JAVA_HOME=${JAVA_HOME:-%%JAVA_HOME%%} >Xexport HADOOP_PREFIX=%%PREFIX%% >Xexport HADOOP_CONF_DIR=%%ETCDIR%% >Xexport HADOOP_LOG_DIR=%%HADOOP_LOGDIR%% >Xexport HADOOP_PID_DIR=%%HADOOP_RUNDIR%% >Xexport HADOOP_IDENT_STRING=hdfs >X >Xexport YARN_LOG_DIR=%%HADOOP_LOGDIR%% >Xexport YARN_PID_DIR=%%HADOOP_RUNDIR%% >Xexport YARN_IDENT_STRING=yarn >X >Xexport HADOOP_MAPRED_LOG_DIR=%%HADOOP_LOGDIR%% >Xexport HADOOP_MAPRED_PID_DIR=%%HADOOP_RUNDIR%% >Xexport HADOOP_MAPRED_IDENT_STRING=mapred >9d97ebd9c784e7ae9556b2d90e0769b5 >echo x - hadoop3/files/namenode.in >sed 's/^X//' >hadoop3/files/namenode.in << '0fc711d967e31e9dd8b07f1df17dac1c' >X#!/bin/sh >X# >X# $FreeBSD$ >X# >X# PROVIDE: namenode >X# REQUIRE: LOGIN >X# KEYWORD: shutdown >X# >X# namenode_enable (bool): Set to NO by default. >X# Set it to YES to enable namenode. >X >X. 
/etc/rc.subr >X >Xexport PATH=${PATH}:%%LOCALBASE%%/bin >Xname=namenode >Xrcvar=namenode_enable >X >Xload_rc_config "${name}" >X >X: ${namenode_enable:=NO} >X: ${namenode_user:=%%HDFS_USER%%} >X >Xcommand="%%PREFIX%%/sbin/hadoop-daemon.sh" >Xcommand_interpreter_execution="%%JAVA_HOME%%/bin/java" >Xcommand_args='--config %%ETCDIR%% start namenode' >X >Xstop_cmd=namenode_stop >Xstatus_precmd=find_pid >X >Xnamenode_stop () { >X su -m ${namenode_user} -c "${command} --config %%ETCDIR%% stop namenode" >X} >X >Xfind_pid () { >X rc_pid=$(check_pidfile $pidfile $command_interpreter_execution) >X} >X >Xrun_rc_command "$1" >0fc711d967e31e9dd8b07f1df17dac1c >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_handler_CombineHandler.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_handler_CombineHandler.cc << 'd8ab517e619f5b5981ebba7fce2a3751' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/handler/CombineHandler.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/handler/CombineHandler.cc >X@@ -48,8 +48,8 @@ uint32_t CombineHandler::feedDataToJavaInWritableSeria >X >X if (_kvCached) { >X uint32_t kvLength = _key.outerLength + _value.outerLength + KVBuffer::headerLength(); >X- outputInt(bswap(_key.outerLength)); >X- outputInt(bswap(_value.outerLength)); >X+ outputInt(bswap32(_key.outerLength)); >X+ outputInt(bswap32(_value.outerLength)); >X outputKeyOrValue(_key, _kType); >X outputKeyOrValue(_value, _vType); >X >X@@ -73,8 +73,8 @@ uint32_t CombineHandler::feedDataToJavaInWritableSeria >X } else { >X firstKV = false; >X //write final key length and final value length >X- outputInt(bswap(_key.outerLength)); >X- outputInt(bswap(_value.outerLength)); >X+ 
outputInt(bswap32(_key.outerLength)); >X+ outputInt(bswap32(_value.outerLength)); >X outputKeyOrValue(_key, _kType); >X outputKeyOrValue(_value, _vType); >X >X@@ -101,7 +101,7 @@ void CombineHandler::outputKeyOrValue(SerializeInfo & >X output(KV.buffer.data(), KV.buffer.length()); >X break; >X case BytesType: >X- outputInt(bswap(KV.buffer.length())); >X+ outputInt(bswap32(KV.buffer.length())); >X output(KV.buffer.data(), KV.buffer.length()); >X break; >X default: >X@@ -202,8 +202,8 @@ void CombineHandler::write(char * buf, uint32_t length >X uint32_t outputRecordCount = 0; >X while (remain > 0) { >X kv = (KVBuffer *)pos; >X- kv->keyLength = bswap(kv->keyLength); >X- kv->valueLength = bswap(kv->valueLength); >X+ kv->keyLength = bswap32(kv->keyLength); >X+ kv->valueLength = bswap32(kv->valueLength); >X _writer->write(kv->getKey(), kv->keyLength, kv->getValue(), kv->valueLength); >X outputRecordCount++; >X remain -= kv->length(); >d8ab517e619f5b5981ebba7fce2a3751 >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_Iterator.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_Iterator.cc << 'a059c0c018678cbf351c65e1a2746257' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/Iterator.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/Iterator.cc >X@@ -61,7 +61,7 @@ const char * KeyGroupIteratorImpl::nextValue(uint32_t >X case SAME_KEY: { >X if (next()) { >X if (_key.length() == _currentGroupKey.length()) { >X- if (fmemeq(_key.data(), _currentGroupKey.c_str(), _key.length())) { >X+ if (memcmp(_key.data(), _currentGroupKey.c_str(), _key.length()) == 0) { >X len = _value.length(); >X return _value.data(); >X } 
>a059c0c018678cbf351c65e1a2746257 >echo x - hadoop3/files/patch-hadoop-hdfs-project_hadoop-hdfs-native-client_pom.xml >sed 's/^X//' >hadoop3/files/patch-hadoop-hdfs-project_hadoop-hdfs-native-client_pom.xml << 'f39f39ecc815b7558ae7ad3d27477d85' >X--- hadoop-hdfs-project/hadoop-hdfs-native-client/pom.xml.orig 2018-10-19 09:04:13 UTC >X+++ hadoop-hdfs-project/hadoop-hdfs-native-client/pom.xml >X@@ -216,6 +216,11 @@ http://maven.apache.org/xsd/maven-4.0.0.xsd"> >X <REQUIRE_FUSE>${require.fuse}</REQUIRE_FUSE> >X <REQUIRE_VALGRIND>${require.valgrind}</REQUIRE_VALGRIND> >X <HADOOP_BUILD>1</HADOOP_BUILD> >X+ <Protobuf_USE_STATIC_LIBS>OFF</Protobuf_USE_STATIC_LIBS> >X+ <Protobuf_LIBRARY>protobuf</Protobuf_LIBRARY> >X+ <Protobuf_PROTOC_LIBRARY>protoc</Protobuf_PROTOC_LIBRARY> >X+ <Protobuf_INCLUDE_DIR>/usr/local/protobuf25/include</Protobuf_INCLUDE_DIR> >X+ <PROTOBUF_PROTOC_EXECUTABLE>/usr/local/protobuf25/bin/protoc</PROTOBUF_PROTOC_EXECUTABLE> >X <REQUIRE_LIBWEBHDFS>${require.libwebhdfs}</REQUIRE_LIBWEBHDFS> >X <REQUIRE_OPENSSL>${require.openssl}</REQUIRE_OPENSSL> >X <CUSTOM_OPENSSL_PREFIX>${openssl.prefix}</CUSTOM_OPENSSL_PREFIX> >f39f39ecc815b7558ae7ad3d27477d85 >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-core_src_main_java_org_apache_hadoop_mapreduce_util_ProcessTree.java >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-core_src_main_java_org_apache_hadoop_mapreduce_util_ProcessTree.java << 'ccbbc6428d054fc896aa67d5fc9e9c71' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/util/ProcessTree.java.orig 2019-04-21 12:22:36 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/util/ProcessTree.java >X@@ -53,14 +53,14 @@ public class ProcessTree { >X ShellCommandExecutor shexec = null; >X boolean 
setsidSupported = true; >X try { >X- String[] args = {"setsid", "bash", "-c", "echo $$"}; >X+ String[] args = {"ssid", "bash", "-c", "echo $$"}; >X shexec = new ShellCommandExecutor(args); >X shexec.execute(); >X } catch (IOException ioe) { >X- LOG.warn("setsid is not available on this machine. So not using it."); >X+ LOG.warn("ssid is not available on this machine. So not using it."); >X setsidSupported = false; >X } finally { // handle the exit code >X- LOG.info("setsid exited with exit code " + shexec.getExitCode()); >X+ LOG.info("ssid exited with exit code " + shexec.getExitCode()); >X } >X return setsidSupported; >X } >ccbbc6428d054fc896aa67d5fc9e9c71 >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_codec_BlockCodec.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_codec_BlockCodec.cc << '86cb83d692518d5a7d7dac07e9bafe31' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/codec/BlockCodec.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/codec/BlockCodec.cc >X@@ -104,8 +104,8 @@ int32_t BlockDecompressStream::read(void * buff, uint3 >X THROW_EXCEPTION(IOException, "readFully get incomplete data"); >X } >X _compressedBytesRead += rd; >X- sizes[0] = bswap(sizes[0]); >X- sizes[1] = bswap(sizes[1]); >X+ sizes[0] = bswap32(sizes[0]); >X+ sizes[1] = bswap32(sizes[1]); >X if (sizes[0] <= length) { >X uint32_t len = decompressOneBlock(sizes[1], buff, sizes[0]); >X if (len != sizes[0]) { >86cb83d692518d5a7d7dac07e9bafe31 >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_commons.h >sed 's/^X//' 
>hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_commons.h << 'ea851f0212f4e00fdb44672c687d48d4' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/commons.h.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/commons.h >X@@ -41,7 +41,6 @@ >X #include <map> >X #include <algorithm> >X >X-#include "lib/primitives.h" >X #include "lib/Log.h" >X #include "NativeTask.h" >X >X@@ -49,4 +48,12 @@ >X >X #include "lib/Iterator.h" >X >X+#ifdef __GNUC__ >X+#define likely(x) __builtin_expect((x),1) >X+#define unlikely(x) __builtin_expect((x),0) >X+#else >X+#define likely(x) (x) >X+#define unlikely(x) (x) >X+#endif >X+ >X #endif /* COMMONS_H_ */ >ea851f0212f4e00fdb44672c687d48d4 >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_util_WritableUtils.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_util_WritableUtils.cc << '85afc9502ce5dc081c586b1c818d8d22' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/util/WritableUtils.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/util/WritableUtils.cc >X@@ -120,22 +120,22 @@ void WritableUtils::WriteVLongInner(int64_t v, char * >X len = 4; >X } else if (value < (1ULL << 32)) { >X *(pos++) = base - 3; >X- *(uint32_t*)(pos) = bswap((uint32_t)value); >X+ *(uint32_t*)(pos) = bswap32((uint32_t)value); >X len = 5; >X } else if (value < (1ULL << 40)) { >X *(pos++) = base - 4; >X- *(uint32_t*)(pos) = bswap((uint32_t)(value >> 8)); >X+ *(uint32_t*)(pos) = bswap32((uint32_t)(value >> 8)); >X 
*(uint8_t*)(pos + 4) = value; >X len = 6; >X } else if (value < (1ULL << 48)) { >X *(pos++) = base - 5; >X- *(uint32_t*)(pos) = bswap((uint32_t)(value >> 16)); >X+ *(uint32_t*)(pos) = bswap32((uint32_t)(value >> 16)); >X *(uint8_t*)(pos + 4) = value >> 8; >X *(uint8_t*)(pos + 5) = value; >X len = 7; >X } else if (value < (1ULL << 56)) { >X *(pos++) = base - 6; >X- *(uint32_t*)(pos) = bswap((uint32_t)(value >> 24)); >X+ *(uint32_t*)(pos) = bswap32((uint32_t)(value >> 24)); >X *(uint8_t*)(pos + 4) = value >> 16; >X *(uint8_t*)(pos + 5) = value >> 8; >X *(uint8_t*)(pos + 6) = value; >X@@ -176,7 +176,7 @@ int32_t WritableUtils::ReadInt(InputStream * stream) { >X if (stream->readFully(&ret, 4) != 4) { >X THROW_EXCEPTION(IOException, "ReadInt reach EOF"); >X } >X- return (int32_t)bswap(ret); >X+ return (int32_t)bswap32(ret); >X } >X >X int16_t WritableUtils::ReadShort(InputStream * stream) { >X@@ -192,7 +192,7 @@ float WritableUtils::ReadFloat(InputStream * stream) { >X if (stream->readFully(&ret, 4) != 4) { >X THROW_EXCEPTION(IOException, "ReadFloat reach EOF"); >X } >X- ret = bswap(ret); >X+ ret = bswap32(ret); >X return *(float*)&ret; >X } >X >X@@ -237,7 +237,7 @@ void WritableUtils::WriteLong(OutputStream * stream, i >X } >X >X void WritableUtils::WriteInt(OutputStream * stream, int32_t v) { >X- uint32_t be = bswap((uint32_t)v); >X+ uint32_t be = bswap32((uint32_t)v); >X stream->write(&be, 4); >X } >X >X@@ -249,7 +249,7 @@ void WritableUtils::WriteShort(OutputStream * stream, >X >X void WritableUtils::WriteFloat(OutputStream * stream, float v) { >X uint32_t intv = *(uint32_t*)&v; >X- intv = bswap(intv); >X+ intv = bswap32(intv); >X stream->write(&intv, 4); >X } >X >X@@ -286,7 +286,7 @@ void WritableUtils::toString(string & dest, KeyValueTy >X dest.append(*(uint8_t*)data ? 
"true" : "false"); >X break; >X case IntType: >X- dest.append(StringUtil::ToString((int32_t)bswap(*(uint32_t*)data))); >X+ dest.append(StringUtil::ToString((int32_t)bswap32(*(uint32_t*)data))); >X break; >X case LongType: >X dest.append(StringUtil::ToString((int64_t)bswap64(*(uint64_t*)data))); >85afc9502ce5dc081c586b1c818d8d22 >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_TestCompressions.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_TestCompressions.cc << '5187da8fc5bb2ca2e800bd20bcf53ee6' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/TestCompressions.cc.orig 2019-04-21 10:30:06 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/TestCompressions.cc >X@@ -269,7 +269,7 @@ TEST(Perf, RawCompressionSnappy) { >X vector<FileEntry> inputfiles; >X FileSystem::getLocal().list(inputdir, inputfiles); >X CompressResult total; >X- printf("Block size: %"PRId64"K\n", blockSize / 1024); >X+ printf("Block size: %" PRId64"K\n", blockSize / 1024); >X for (size_t i = 0; i < inputfiles.size(); i++) { >X if (!inputfiles[i].isDirectory) { >X MeasureSingleFileSnappy((inputdir + "/" + inputfiles[i].name).c_str(), total, blockSize, >5187da8fc5bb2ca2e800bd20bcf53ee6 >echo x - hadoop3/files/patch-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager_src_main_native_container-executor_test_test__configuration.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager_src_main_native_container-executor_test_test__configuration.cc << '8f1e8407ca857ad499f9c4465f285282' >X--- 
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/test/test_configuration.cc.orig 2018-10-18 18:38:40 UTC >X+++ hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/test/test_configuration.cc >X@@ -18,6 +18,7 @@ >X >X #include <gtest/gtest.h> >X #include <fstream> >X+#include <istream> >X >X extern "C" { >X #include "util.h" >8f1e8407ca857ad499f9c4465f285282 >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_Buffers.h >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_Buffers.h << '3677062615dd17dd4dbb8bb420905001' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/Buffers.h.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/Buffers.h >X@@ -79,11 +79,11 @@ class ReadBuffer { (public) >X } >X >X /** >X- * read to outside buffer, use simple_memcpy >X+ * read to outside buffer, use memcpy >X */ >X inline void readUnsafe(char * buff, uint32_t len) { >X if (likely(len <= _remain)) { >X- simple_memcpy(buff, current(), len); >X+ memcpy(buff, current(), len); >X _remain -= len; >X return; >X } >X@@ -115,7 +115,7 @@ class ReadBuffer { (public) >X * read uint32_t big endian >X */ >X inline uint32_t read_uint32_be() { >X- return bswap(read_uint32_le()); >X+ return bswap32(read_uint32_le()); >X } >X }; >X >X@@ -181,7 +181,7 @@ class AppendBuffer { (public) >X >X inline void write(const void * data, uint32_t len) { >X if (likely(len <= _remain)) { // append directly >X- simple_memcpy(current(), data, len); >X+ memcpy(current(), data, len); >X _remain -= len; >X return; >X } >X@@ -198,7 +198,7 @@ class AppendBuffer { 
(public) >X } >X >X inline void write_uint32_be(uint32_t v) { >X- write_uint32_le(bswap(v)); >X+ write_uint32_le(bswap32(v)); >X } >X >X inline void write_uint64_le(uint64_t v) { >X@@ -291,10 +291,10 @@ struct KVBuffer { >X valueLength = vallen; >X >X if (keylen > 0) { >X- simple_memcpy(getKey(), key, keylen); >X+ memcpy(getKey(), key, keylen); >X } >X if (vallen > 0) { >X- simple_memcpy(getValue(), value, vallen); >X+ memcpy(getValue(), value, vallen); >X } >X } >X >X@@ -479,7 +479,7 @@ class FixSizeContainer { (public) >X } >X uint32_t remain = _size - _pos; >X uint32_t length = (maxSize < remain) ? maxSize : remain; >X- simple_memcpy(_buff + _pos, source, length); >X+ memcpy(_buff + _pos, source, length); >X _pos += length; >X return length; >X } >3677062615dd17dd4dbb8bb420905001 >echo x - hadoop3/files/patch-hadoop-common-project-hadoop-common-src-main-java-org-apache-hadoop-util-StringUtils.java >sed 's/^X//' >hadoop3/files/patch-hadoop-common-project-hadoop-common-src-main-java-org-apache-hadoop-util-StringUtils.java << '260ea50f932f1c222bffcc8032e43901' >X--- hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java.orig 2018-07-07 08:16:53 UTC >X+++ hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/StringUtils.java >X@@ -713,7 +713,7 @@ public class StringUtils { >X final String classname = clazz.getSimpleName(); >X LOG.info(createStartupShutdownMessage(classname, hostname, args)); >X >X- if (SystemUtils.IS_OS_UNIX) { >X+ if (true) { >X try { >X SignalLogger.INSTANCE.register(LOG); >X } catch (Throwable t) { >260ea50f932f1c222bffcc8032e43901 >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_Buffers.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_Buffers.cc << 'aef38693472df8f8b62b5ac3b6728b54' >X--- 
hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/Buffers.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/Buffers.cc >X@@ -206,7 +206,7 @@ void AppendBuffer::write_inner(const void * data, uint >X _dest->write(data, len); >X _counter += len; >X } else { >X- simple_memcpy(_buff, data, len); >X+ memcpy(_buff, data, len); >X _remain -= len; >X } >X } >aef38693472df8f8b62b5ac3b6728b54 >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_codec_Lz4Codec.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_codec_Lz4Codec.cc << '2ab81252161ca5c721386e5a5a380af3' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/codec/Lz4Codec.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/codec/Lz4Codec.cc >X@@ -38,8 +38,8 @@ void Lz4CompressStream::compressOneBlock(const void * >X int ret = LZ4_compress((char*)buff, _tempBuffer + 8, length); >X if (ret > 0) { >X compressedLength = ret; >X- ((uint32_t*)_tempBuffer)[0] = bswap(length); >X- ((uint32_t*)_tempBuffer)[1] = bswap((uint32_t)compressedLength); >X+ ((uint32_t*)_tempBuffer)[0] = bswap32(length); >X+ ((uint32_t*)_tempBuffer)[1] = bswap32((uint32_t)compressedLength); >X _stream->write(_tempBuffer, compressedLength + 8); >X _compressedBytesWritten += (compressedLength + 8); >X } else { >2ab81252161ca5c721386e5a5a380af3 >echo x - hadoop3/files/httpfs-env.sh.in >sed 's/^X//' >hadoop3/files/httpfs-env.sh.in << '3318a0ce8f8f41201de0b4b9284db1d4' >X# $FreeBSD$ >X >Xexport HTTPFS_LOG=/var/log/hadoop >Xexport HTTPFS_TEMP=/var/tmp >3318a0ce8f8f41201de0b4b9284db1d4 >echo x 
- hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_lib_TestMemBlockIterator.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_lib_TestMemBlockIterator.cc << '33c9dbaad1ad08f1b994fb2be24af091' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/lib/TestMemBlockIterator.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/lib/TestMemBlockIterator.cc >X@@ -59,7 +59,7 @@ class MemoryBlockFactory { >X kv->keyLength = 4; >X kv->valueLength = 4; >X uint32_t * key = (uint32_t *)kv->getKey(); >X- *key = bswap(index); >X+ *key = bswap32(index); >X } >X return block1; >X } >33c9dbaad1ad08f1b994fb2be24af091 >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_lib_TestPartitionBucket.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_lib_TestPartitionBucket.cc << 'e989e4ece10e61b50e140fa7ab0bf315' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/lib/TestPartitionBucket.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/lib/TestPartitionBucket.cc >X@@ -129,15 +129,15 @@ TEST(PartitionBucket, sort) { >X const uint32_t BIG = 1000; >X >X kv1->keyLength = 4; >X- *((uint32_t *)kv1->getKey()) = bswap(BIG); >X+ *((uint32_t *)kv1->getKey()) = bswap32(BIG); >X kv1->valueLength = KV_SIZE - kv1->headerLength() - kv1->keyLength; >X >X kv2->keyLength = 4; >X- *((uint32_t *)kv2->getKey()) = bswap(SMALL); >X+ *((uint32_t *)kv2->getKey()) = bswap32(SMALL); >X 
kv2->valueLength = KV_SIZE - kv2->headerLength() - kv2->keyLength; >X >X kv3->keyLength = 4; >X- *((uint32_t *)kv3->getKey()) = bswap(MEDIUM); >X+ *((uint32_t *)kv3->getKey()) = bswap32(MEDIUM); >X kv3->valueLength = KV_SIZE - kv3->headerLength() - kv3->keyLength; >X >X bucket->sort(DUALPIVOTSORT); >X@@ -148,13 +148,13 @@ TEST(PartitionBucket, sort) { >X Buffer value; >X iter->next(key, value); >X >X- ASSERT_EQ(SMALL, bswap(*(uint32_t * )key.data())); >X+ ASSERT_EQ(SMALL, bswap32(*(uint32_t * )key.data())); >X >X iter->next(key, value); >X- ASSERT_EQ(MEDIUM, bswap(*(uint32_t * )key.data())); >X+ ASSERT_EQ(MEDIUM, bswap32(*(uint32_t * )key.data())); >X >X iter->next(key, value); >X- ASSERT_EQ(BIG, bswap(*(uint32_t * )key.data())); >X+ ASSERT_EQ(BIG, bswap32(*(uint32_t * )key.data())); >X >X delete iter; >X delete bucket; >X@@ -181,15 +181,15 @@ TEST(PartitionBucket, spill) { >X const uint32_t BIG = 1000; >X >X kv1->keyLength = 4; >X- *((uint32_t *)kv1->getKey()) = bswap(BIG); >X+ *((uint32_t *)kv1->getKey()) = bswap32(BIG); >X kv1->valueLength = KV_SIZE - KVBuffer::headerLength() - kv1->keyLength; >X >X kv2->keyLength = 4; >X- *((uint32_t *)kv2->getKey()) = bswap(SMALL); >X+ *((uint32_t *)kv2->getKey()) = bswap32(SMALL); >X kv2->valueLength = KV_SIZE - KVBuffer::headerLength() - kv2->keyLength; >X >X kv3->keyLength = 4; >X- *((uint32_t *)kv3->getKey()) = bswap(MEDIUM); >X+ *((uint32_t *)kv3->getKey()) = bswap32(MEDIUM); >X kv3->valueLength = KV_SIZE - KVBuffer::headerLength() - kv3->keyLength; >X >X bucket->sort(DUALPIVOTSORT); >X@@ -203,17 +203,17 @@ TEST(PartitionBucket, spill) { >X KVBuffer * first = (KVBuffer *)writer.buff(); >X ASSERT_EQ(4, first->keyLength); >X ASSERT_EQ(KV_SIZE - KVBuffer::headerLength() - 4, first->valueLength); >X- ASSERT_EQ(bswap(SMALL), (*(uint32_t * )(first->getKey()))); >X+ ASSERT_EQ(bswap32(SMALL), (*(uint32_t * )(first->getKey()))); >X >X KVBuffer * second = first->next(); >X ASSERT_EQ(4, second->keyLength); >X ASSERT_EQ(KV_SIZE - 
KVBuffer::headerLength() - 4, second->valueLength); >X- ASSERT_EQ(bswap(MEDIUM), (*(uint32_t * )(second->getKey()))); >X+ ASSERT_EQ(bswap32(MEDIUM), (*(uint32_t * )(second->getKey()))); >X >X KVBuffer * third = second->next(); >X ASSERT_EQ(4, third->keyLength); >X ASSERT_EQ(KV_SIZE - KVBuffer::headerLength() - 4, third->valueLength); >X- ASSERT_EQ(bswap(BIG), (*(uint32_t * )(third->getKey()))); >X+ ASSERT_EQ(bswap32(BIG), (*(uint32_t * )(third->getKey()))); >X >X delete [] buff; >X delete bucket; >e989e4ece10e61b50e140fa7ab0bf315 >echo x - hadoop3/files/patch-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager_src_main_native_container-executor_impl_configuration.c >sed 's/^X//' >hadoop3/files/patch-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager_src_main_native_container-executor_impl_configuration.c << '472f3cec1163a20ea25b928d3d27c532' >X--- hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/configuration.c.orig 2018-10-18 18:38:40 UTC >X+++ hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/configuration.c >X@@ -27,9 +27,11 @@ >X #include <inttypes.h> >X #include <errno.h> >X #include <unistd.h> >X+#include <stdio.h> >X #include <stdlib.h> >X #include <string.h> >X #include <sys/stat.h> >X+#include <sys/types.h> >X >X #define MAX_SIZE 10 >X >472f3cec1163a20ea25b928d3d27c532 >echo x - hadoop3/files/historyserver.in >sed 's/^X//' >hadoop3/files/historyserver.in << 'a2dd5af92c82f8c4592d6370df1eb5bb' >X#!/bin/sh >X# >X# $FreeBSD$ >X# >X# PROVIDE: historyserver >X# REQUIRE: LOGIN >X# KEYWORD: shutdown >X# >X# historyserver_enable (bool): Set to NO by default. >X# Set it to YES to enable historyserver. >X >X. 
/etc/rc.subr >X >Xexport PATH=${PATH}:%%LOCALBASE%%/bin >Xname=historyserver >Xrcvar=historyserver_enable >X >Xload_rc_config "${name}" >X >X: ${historyserver_enable:=NO} >X: ${historyserver_user:=%%MAPRED_USER%%} >X >Xcommand="%%PREFIX%%/sbin/mr-jobhistory-daemon.sh" >Xcommand_args='--config %%ETCDIR%% start historyserver' >X >Xstop_cmd=historyserver_stop >X >Xhistoryserver_stop () { >X su -m ${historyserver_user} -c "${command} --config %%ETCDIR%% stop historyserver" >X} >X >Xrun_rc_command "$1" >a2dd5af92c82f8c4592d6370df1eb5bb >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_handler_BatchHandler.h >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_handler_BatchHandler.h << '6329fda60dcdb2ffa302bc8462fa7ead' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/handler/BatchHandler.h.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/handler/BatchHandler.h >X@@ -108,7 +108,7 @@ class BatchHandler : public Configurable { (protected) >X flushOutput(); >X } >X uint32_t cp = length < remain ? 
length : remain; >X- simple_memcpy(_out.current(), buff, cp); >X+ memcpy(_out.current(), buff, cp); >X buff += cp; >X length -= cp; >X _out.advance(cp); >6329fda60dcdb2ffa302bc8462fa7ead >echo x - hadoop3/files/patch-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager_pom.xml >sed 's/^X//' >hadoop3/files/patch-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager_pom.xml << 'e90bd131f010871c016e1d1b8c0ae809' >X--- hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/pom.xml.orig 2019-07-27 18:33:18 UTC >X+++ hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/pom.xml >X@@ -311,6 +311,27 @@ >X </plugins> >X </build> >X </profile> >X+ <profile> >X+ <id>native-bsd</id> >X+ <activation> >X+ <os> >X+ <family>FreeBSD</family> >X+ </os> >X+ </activation> >X+ <build> >X+ <plugins> >X+ <plugin> >X+ <groupId>org.apache.maven.plugins</groupId> >X+ <artifactId>maven-surefire-plugin</artifactId> >X+ <configuration> >X+ <excludes> >X+ <exclude>org.apache.hadoop.yarn.server.nodemanager.containermanager.linux.**</exclude> >X+ </excludes> >X+ </configuration> >X+ </plugin> >X+ </plugins> >X+ </build> >X+ </profile> >X </profiles> >X >X <build> >e90bd131f010871c016e1d1b8c0ae809 >echo x - hadoop3/files/patch-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager_src_main_java_org_apache_hadoop_yarn_server_nodemanager_DefaultContainerExecutor.java >sed 's/^X//' >hadoop3/files/patch-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager_src_main_java_org_apache_hadoop_yarn_server_nodemanager_DefaultContainerExecutor.java << '89d0449cdf51c59559f63896cfbd3301' >X--- hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/DefaultContainerExecutor.java.orig 2018-10-19 02:30:34 UTC >X+++ 
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/DefaultContainerExecutor.java >X@@ -480,8 +480,8 @@ public class DefaultContainerExecutor ex >X String exitCodeFile = ContainerLaunch.getExitCodeFile( >X pidFile.toString()); >X String tmpFile = exitCodeFile + ".tmp"; >X- pout.println("#!/bin/bash"); >X- pout.println("/bin/bash \"" + sessionScriptPath.toString() + "\""); >X+ pout.println("#!/usr/local/bin/bash"); >X+ pout.println("/usr/local/bin/bash \"" + sessionScriptPath.toString() + "\""); >X pout.println("rc=$?"); >X pout.println("echo $rc > \"" + tmpFile + "\""); >X pout.println("/bin/mv -f \"" + tmpFile + "\" \"" + exitCodeFile + "\""); >X@@ -497,12 +497,12 @@ public class DefaultContainerExecutor ex >X // We need to do a move as writing to a file is not atomic >X // Process reading a file being written to may get garbled data >X // hence write pid to tmp file first followed by a mv >X- pout.println("#!/bin/bash"); >X+ pout.println("#!/usr/local/bin/bash"); >X pout.println(); >X pout.println("echo $$ > " + pidFile.toString() + ".tmp"); >X pout.println("/bin/mv -f " + pidFile.toString() + ".tmp " + pidFile); >X- String exec = Shell.isSetsidAvailable? "exec setsid" : "exec"; >X- pout.printf("%s /bin/bash \"%s\"", exec, launchDst.toUri().getPath()); >X+ String exec = Shell.isSetsidAvailable? 
"exec ssid" : "exec"; >X+ pout.printf("%s /usr/local/bin/bash \"%s\"", exec, launchDst.toUri().getPath()); >X } >X lfs.setPermission(sessionScriptPath, >X ContainerExecutor.TASK_LAUNCH_SCRIPT_PERMISSION); >89d0449cdf51c59559f63896cfbd3301 >echo x - hadoop3/files/patch-hadoop-common-project_hadoop-common_src_main_java_org_apache_hadoop_util_Shell.java >sed 's/^X//' >hadoop3/files/patch-hadoop-common-project_hadoop-common_src_main_java_org_apache_hadoop_util_Shell.java << 'cf85187bb15138a4a23a4c6933d3197d' >X--- hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/Shell.java.orig 2019-04-21 12:19:26 UTC >X+++ hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/Shell.java >X@@ -796,14 +796,14 @@ public abstract class Shell { >X ShellCommandExecutor shexec = null; >X boolean setsidSupported = true; >X try { >X- String[] args = {"setsid", "bash", "-c", "echo $$"}; >X+ String[] args = {"ssid", "bash", "-c", "echo $$"}; >X shexec = new ShellCommandExecutor(args); >X shexec.execute(); >X } catch (IOException ioe) { >X- LOG.debug("setsid is not available on this machine. So not using it."); >X+ LOG.debug("ssid is not available on this machine. So not using it."); >X setsidSupported = false; >X } catch (SecurityException se) { >X- LOG.debug("setsid is not allowed to run by the JVM "+ >X+ LOG.debug("ssid is not allowed to run by the JVM "+ >X "security manager. So not using it."); >X setsidSupported = false; >X } catch (Error err) { >X@@ -818,7 +818,7 @@ public abstract class Shell { >X } >X } finally { // handle the exit code >X if (LOG.isDebugEnabled()) { >X- LOG.debug("setsid exited with exit code " >X+ LOG.debug("ssid exited with exit code " >X + (shexec != null ? 
shexec.getExitCode() : "(null executor)")); >X } >X } >cf85187bb15138a4a23a4c6933d3197d >echo x - hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_NativeObjectFactory.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_src_lib_NativeObjectFactory.cc << '88df2ecaf52d6092d2aad662053f655b' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/NativeObjectFactory.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/NativeObjectFactory.cc >X@@ -299,7 +299,7 @@ int NativeObjectFactory::BytesComparator(const char * >X uint32_t destLength) { >X >X uint32_t minlen = std::min(srcLength, destLength); >X- int64_t ret = fmemcmp(src, dest, minlen); >X+ int64_t ret = memcmp(src, dest, minlen); >X if (ret > 0) { >X return 1; >X } else if (ret < 0) { >X@@ -317,8 +317,8 @@ int NativeObjectFactory::IntComparator(const char * sr >X uint32_t destLength) { >X int result = (*src) - (*dest); >X if (result == 0) { >X- uint32_t from = bswap(*(uint32_t*)src); >X- uint32_t to = bswap(*(uint32_t*)dest); >X+ uint32_t from = bswap32(*(uint32_t*)src); >X+ uint32_t to = bswap32(*(uint32_t*)dest); >X if (from > to) { >X return 1; >X } else if (from == to) { >X@@ -380,8 +380,8 @@ int NativeObjectFactory::FloatComparator(const char * >X THROW_EXCEPTION_EX(IOException, "float comparator, while src/dest lengt is not 4"); >X } >X >X- uint32_t from = bswap(*(uint32_t*)src); >X- uint32_t to = bswap(*(uint32_t*)dest); >X+ uint32_t from = bswap32(*(uint32_t*)src); >X+ uint32_t to = bswap32(*(uint32_t*)dest); >X >X float * srcValue = (float *)(&from); >X float * destValue = (float *)(&to); >88df2ecaf52d6092d2aad662053f655b >echo x - 
hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_TestPrimitives.cc >sed 's/^X//' >hadoop3/files/patch-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-nativetask_src_main_native_test_TestPrimitives.cc << 'e55e87f322a3c886020fabc6ad4d91f5' >X--- hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/TestPrimitives.cc.orig 2018-10-18 18:38:39 UTC >X+++ hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/test/TestPrimitives.cc >X@@ -18,98 +18,7 @@ >X >X #include "test_commons.h" >X >X-TEST(Primitives, fmemcmp) { >X- std::vector<std::string> vs; >X- char buff[14]; >X- vs.push_back(""); >X- for (uint32_t i = 0; i < 5000; i += 7) { >X- snprintf(buff, 14, "%d", i * 31); >X- vs.push_back(buff); >X- snprintf(buff, 10, "%010d", i); >X- vs.push_back(buff); >X- } >X- for (size_t i = 0; i < vs.size(); i++) { >X- for (size_t j = 0; j < vs.size(); j++) { >X- std::string & ls = vs[i]; >X- std::string & rs = vs[j]; >X- size_t m = std::min(ls.length(), rs.length()); >X- int c = memcmp(ls.c_str(), rs.c_str(), m); >X- int t = fmemcmp(ls.c_str(), rs.c_str(), m); >X- if (!((c == 0 && t == 0) || (c > 0 && t > 0) || (c < 0 && t < 0))) { >X- ASSERT_TRUE(false); >X- } >X- } >X- } >X-} >X- >X-static int test_memcmp() { >X- uint8_t buff[2048]; >X- for (uint32_t i = 0; i < 2048; i++) { >X- buff[i] = i & 0xff; >X- } >X- std::random_shuffle(buff, buff + 2048); >X- int r = 0; >X- for (uint32_t i = 0; i < 100000000; i++) { >X- int offset = i % 1000; >X- r += memcmp(buff, buff + 1024, 5); >X- r += memcmp(buff + offset, buff + 1124, 9); >X- r += memcmp(buff + offset, buff + 1224, 10); >X- r += memcmp(buff + offset, buff + 1324, 15); >X- r += memcmp(buff + offset, buff + 1424, 16); >X- r += memcmp(buff + offset, buff + 1524, 17); >X- r += memcmp(buff + offset, buff + 1624, 18); >X- r += memcmp(buff + 
offset, buff + 1724, 19); >X- } >X- return r; >X-} >X- >X-static int test_fmemcmp() { >X- char buff[2048]; >X- for (uint32_t i = 0; i < 2048; i++) { >X- buff[i] = i & 0xff; >X- } >X- std::random_shuffle(buff, buff + 2048); >X- int r = 0; >X- for (uint32_t i = 0; i < 100000000; i++) { >X- int offset = i % 1000; >X- r += fmemcmp(buff, buff + 1024, 5); >X- r += fmemcmp(buff + offset, buff + 1124, 9); >X- r += fmemcmp(buff + offset, buff + 1224, 10); >X- r += fmemcmp(buff + offset, buff + 1324, 15); >X- r += fmemcmp(buff + offset, buff + 1424, 16); >X- r += fmemcmp(buff + offset, buff + 1524, 17); >X- r += fmemcmp(buff + offset, buff + 1624, 18); >X- r += fmemcmp(buff + offset, buff + 1724, 19); >X- } >X- return r; >X-} >X- >X-TEST(Perf, fmemcmp) { >X- Timer t; >X- int a = test_memcmp(); >X- LOG("%s", t.getInterval(" memcmp ").c_str()); >X- t.reset(); >X- int b = test_fmemcmp(); >X- LOG("%s", t.getInterval(" fmemcmp ").c_str()); >X- // prevent compiler optimization >X- TestConfig.setInt("tempvalue", a + b); >X-} >X- >X-static void test_memcpy_perf_len(char * src, char * dest, size_t len, size_t time) { >X- for (size_t i = 0; i < time; i++) { >X- memcpy(src, dest, len); >X- memcpy(dest, src, len); >X- } >X-} >X- >X-static void test_simple_memcpy_perf_len(char * src, char * dest, size_t len, size_t time) { >X- for (size_t i = 0; i < time; i++) { >X- simple_memcpy(src, dest, len); >X- simple_memcpy(dest, src, len); >X- } >X-} >X- >X-TEST(Perf, simple_memcpy_small) { >X+TEST(Perf, memcpy_small) { >X char * src = new char[10240]; >X char * dest = new char[10240]; >X char buff[32]; >X@@ -117,11 +26,10 @@ TEST(Perf, simple_memcpy_small) { >X LOG("------------------------------"); >X snprintf(buff, 32, " memcpy %luB\t", len); >X Timer t; >X- test_memcpy_perf_len(src, dest, len, 1000000); >X- LOG("%s", t.getInterval(buff).c_str()); >X- snprintf(buff, 32, "simple_memcpy %luB\t", len); >X- t.reset(); >X- test_simple_memcpy_perf_len(src, dest, len, 1000000); >X+ for (size_t i = 0; 
i < 1000000; i++) { >X+ memcpy(src, dest, len); >X+ memcpy(dest, src, len); >X+ } >X LOG("%s", t.getInterval(buff).c_str()); >X } >X delete[] src; >X@@ -293,11 +201,6 @@ TEST(Perf, memcpy_batch) { >X memcpy(dest, src, size); >X } >X LOG("%s", t.getSpeedM("memcpy", mb).c_str()); >X- t.reset(); >X- for (size_t i = 0; i < mb; i += size) { >X- simple_memcpy(dest, src, size); >X- } >X- LOG("%s", t.getSpeedM("simple_memcpy", mb).c_str()); >X delete[] src; >X delete[] dest; >X } >e55e87f322a3c886020fabc6ad4d91f5 >echo x - hadoop3/files/patch-hadoop-common-project_hadoop-common_src_main_java_org_apache_hadoop_io_nativeio_SharedFileDescriptorFactory.java >sed 's/^X//' >hadoop3/files/patch-hadoop-common-project_hadoop-common_src_main_java_org_apache_hadoop_io_nativeio_SharedFileDescriptorFactory.java << 'ae02f1427924bb96c4979fd296ee15a7' >X--- hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/nativeio/SharedFileDescriptorFactory.java.orig 2018-03-21 17:57:55 UTC >X+++ hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/nativeio/SharedFileDescriptorFactory.java >X@@ -54,7 +54,7 @@ public class SharedFileDescriptorFactory { >X if (!NativeIO.isAvailable()) { >X return "NativeIO is not available."; >X } >X- if (!SystemUtils.IS_OS_UNIX) { >X+ if (false) { >X return "The OS is not UNIX."; >X } >X return null; >ae02f1427924bb96c4979fd296ee15a7 >echo x - hadoop3/files/zkfc.in >sed 's/^X//' >hadoop3/files/zkfc.in << '0bc196ba531e4c73615551740a55b271' >X#!/bin/sh >X# >X# $FreeBSD$ >X# >X# PROVIDE: zkfc >X# REQUIRE: LOGIN >X# KEYWORD: shutdown >X# >X# zkfc_enable (bool): Set to NO by default. >X# Set it to YES to enable zkfc. >X >X. 
/etc/rc.subr >X >Xexport PATH=${PATH}:%%LOCALBASE%%/bin >Xname=zkfc >Xrcvar=zkfc_enable >X >Xload_rc_config "${name}" >X >X: ${zkfc_enable:=NO} >X: ${zkfc_user:=%%HDFS_USER%%} >X >Xcommand="%%PREFIX%%/sbin/hadoop-daemon.sh" >Xcommand_interpreter_execution="%%JAVA_HOME%%/bin/java" >Xcommand_args='--config %%ETCDIR%% start zkfc' >X >Xstop_cmd=zkfc_stop >Xstatus_precmd=find_pid >X >Xzkfc_stop () { >X su -m ${zkfc_user} -c "${command} --config %%ETCDIR%% stop zkfc" >X} >X >Xfind_pid () { >X rc_pid=$(check_pidfile $pidfile $command_interpreter_execution) >X} >X >Xrun_rc_command "$1" >0bc196ba531e4c73615551740a55b271 >echo x - hadoop3/files/patch-hadoop-hdfs-project_hadoop-hdfs-native-client_src_main_native_libhdfspp_CMakeLists.txt >sed 's/^X//' >hadoop3/files/patch-hadoop-hdfs-project_hadoop-hdfs-native-client_src_main_native_libhdfspp_CMakeLists.txt << '236384ad432dfdeb95138f5ec830a56f' >X--- hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/CMakeLists.txt.orig 2018-10-31 07:05:58 UTC >X+++ hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/CMakeLists.txt >X@@ -40,6 +40,8 @@ SET(CMAKE_PREFIX_PATH "${CMAKE_PREFIX_PATH};${CYRUS_SA >X # Specify PROTOBUF_HOME so that find_package picks up the correct version >X SET(CMAKE_PREFIX_PATH "${CMAKE_PREFIX_PATH};$ENV{PROTOBUF_HOME}") >X >X+include(FindProtobuf) >X+include(FindThreads) >X find_package(Doxygen) >X find_package(OpenSSL REQUIRED) >X find_package(Protobuf REQUIRED) >X@@ -144,11 +146,11 @@ add_definitions(-DASIO_STANDALONE -DASIO_CPP11_DATE_TI >X >X # Disable optimizations if compiling debug >X set(CMAKE_CXX_FLAGS_DEBUG "${CMAKE_CXX_FLAGS_DEBUG} -O0") >X-set(CMAKE_C_FLAGS_DEBUG "${CMAKE_C_FLAGS_DEBUG} -O0") >X+set(CMAKE_C_FLAGS_DEBUG "${CMAKE_C_FLAGS_DEBUG} -O0 ") >X >X if(UNIX) >X-set (CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall -Wextra -pedantic -std=c++11 -g -fPIC -fno-strict-aliasing") >X-set (CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -g -fPIC -fno-strict-aliasing") >X+set 
(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall -Wextra -pedantic -std=c++11 -g -fPIC -fno-strict-aliasing -L/usr/local/protobuf25/lib -I/usr/local/protobuf25/include -Wl,-rpath=/usr/local/protobuf25/lib") >X+set (CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -g -fPIC -fno-strict-aliasing -L/usr/local/protobuf25/lib -I/usr/local/protobuf25/include -Wl,-rpath=/usr/local/protobuf25/lib") >X endif() >X >X if (CMAKE_CXX_COMPILER_ID STREQUAL "Clang") >X@@ -231,7 +233,6 @@ include_directories( SYSTEM >X ${PROTOBUF_INCLUDE_DIRS} >X ) >X >X- >X add_subdirectory(third_party/gmock-1.7.0) >X add_subdirectory(third_party/uriparser2) >X add_subdirectory(lib) >X@@ -257,20 +258,20 @@ if (HADOOP_BUILD) >X hadoop_add_dual_library(hdfspp ${EMPTY_FILE_CC} ${LIBHDFSPP_ALL_OBJECTS}) >X hadoop_target_link_dual_libraries(hdfspp >X ${LIB_DL} >X- ${PROTOBUF_LIBRARY} >X+ ${PROTOBUF_LIBRARIES} >X ${OPENSSL_LIBRARIES} >X ${SASL_LIBRARIES} >X- ${CMAKE_THREAD_LIBS_INIT} >X+ Threads::Threads >X ) >X set_target_properties(hdfspp PROPERTIES SOVERSION ${LIBHDFSPP_VERSION}) >X else (HADOOP_BUILD) >X add_library(hdfspp_static STATIC ${EMPTY_FILE_CC} ${LIBHDFSPP_ALL_OBJECTS}) >X target_link_libraries(hdfspp_static >X ${LIB_DL} >X- ${PROTOBUF_LIBRARY} >X+ ${PROTOBUF_LIBRARIES} >X ${OPENSSL_LIBRARIES} >X ${SASL_LIBRARIES} >X- ${CMAKE_THREAD_LIBS_INIT} >X+ Threads::Threads >X ) >X if(BUILD_SHARED_HDFSPP) >X add_library(hdfspp SHARED ${EMPTY_FILE_CC} ${LIBHDFSPP_ALL_OBJECTS}) >236384ad432dfdeb95138f5ec830a56f >echo x - hadoop3/files/webappproxyserver.in >sed 's/^X//' >hadoop3/files/webappproxyserver.in << '7fd8c9c8319f3ea0ac6ed902521eaafe' >X#!/bin/sh >X# >X# $FreeBSD$ >X# >X# PROVIDE: webappproxyserver >X# REQUIRE: LOGIN >X# KEYWORD: shutdown >X# >X# webappproxyserver_enable (bool): Set to NO by default. >X# Set it to YES to enable webappproxyserver. >X >X. 
/etc/rc.subr >X >Xexport PATH=${PATH}:%%LOCALBASE%%/bin >Xname=webappproxyserver >Xrcvar=webappproxyserver_enable >X >Xload_rc_config "${name}" >X >X: ${webappproxyserver_enable:=NO} >X: ${webappproxyserver_user:=%%MAPRED_USER%%} >X >Xcommand="%%PREFIX%%/sbin/yarn-daemon.sh" >Xcommand_args='--config %%ETCDIR%% start proxyserver' >X >Xstop_cmd=webappproxyserver_stop >X >Xwebappproxyserver_stop () { >X	su -m ${webappproxyserver_user} -c "${command} --config %%ETCDIR%% stop proxyserver" >X} >X >Xrun_rc_command "$1" >7fd8c9c8319f3ea0ac6ed902521eaafe >echo x - hadoop3/Makefile >sed 's/^X//' >hadoop3/Makefile << '2ffa086fbc1c64775833e7239b4785ed' >X# Created by: Johannes Meixner <johannes@perceivon.net> >X# $FreeBSD$ >X >X# Please do not submit untested updates. Be sure to start hadoop in >X# distributed mode and to run a few map/reduce jobs. Be sure there are no >X# exceptions in any of its log files. This version was tested under load and >X# no problems were encountered so far. Thanks. >XPORTNAME=	hadoop >XPORTVERSION=	3.2.0 >XCATEGORIES=	devel java >XMASTER_SITES=	APACHE/${PORTNAME}/common/hadoop-${PORTVERSION} \ >X		http://xmj.me/freebsd/:maven \ >X		http://archive.apache.org/dist/tomcat/tomcat-6/v${TOMCAT_VERSION}/bin/:tomcat >X		# please mirror the binary hosted on xmj.me in the local-distfiles space >X		#LOCAL/demon/:maven \ # and change this site >X		#LOCAL/demon/:jetty >XPKGNAMESUFFIX=	3 >XDISTNAME=	${PORTNAME}-${PORTVERSION}-src >XDISTFILES=	${DISTNAME}${EXTRACT_SUFX} apache-tomcat-${TOMCAT_VERSION}.tar.gz:tomcat FreeBSD-${PORTNAME}3-${PORTVERSION}-maven-repository.tgz:maven >XDIST_SUBDIR=	hadoop >XEXTRACT_ONLY=	${DISTNAME}${EXTRACT_SUFX} FreeBSD-${PORTNAME}3-${PORTVERSION}-maven-repository.tgz >X >XMAINTAINER=	demon@FreeBSD.org >XCOMMENT=	Apache Map/Reduce framework >X >XLICENSE=	APACHE20 >X >XBROKEN_SSL=	openssl-devel >XBROKEN_SSL_REASON_openssl-devel=	incomplete definition of type 'struct evp_cipher_ctx_st' >X >XBUILD_DEPENDS=	
${LOCALBASE}/share/java/maven33/bin/mvn:devel/maven33 \ >X cmake:devel/cmake \ >X bash:shells/bash \ >X ${LOCALBASE}/protobuf25/bin/protoc:devel/protobuf25 \ >X ant:devel/apache-ant >XLIB_DEPENDS= libzstd.so:archivers/zstd \ >X libsnappy.so:archivers/snappy \ >X libsasl2.so:security/cyrus-sasl2 \ >X libprotobuf.so:devel/protobuf25 >XRUN_DEPENDS= bash:shells/bash \ >X ssid:sysutils/ssid >X >XIGNORE_FreeBSD_11= getline definition of C++ incompatible with C code >X >XCONFLICTS_INSTALL= hadoop-1* yarn >X >XUSES= cpe shebangfix compiler:c++11-lib pkgconfig ssl >XCPE_VENDOR= apache >XUSE_JAVA= yes >XJAVA_VERSION= 1.7+ >XUSE_LDCONFIG= yes >XSHEBANG_FILES= hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/sbin/httpfs.sh hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/conf/httpfs-env.sh hadoop-common-project/hadoop-kms/src/main/sbin/kms.sh hadoop-common-project/hadoop-kms/src/main/conf/kms-env.sh hadoop-tools/hadoop-sls/src/main/bin/rumen2sls.sh hadoop-tools/hadoop-sls/src/main/bin/slsrun.sh >XMAKE_ENV+= JAVA_HOME=${JAVA_HOME} HADOOP_PROTOC_PATH=${LOCALBASE}/protobuf25/bin/protoc >XMAKE_ARGS+= CXXFLAGS="${CXXFLAGS} -fPIC" >X >XOPTIONS_DEFINE= EXAMPLES >X >XTOMCAT_VERSION= 6.0.53 >XHADOOP_DIST= ${WRKSRC}/hadoop-dist/target/hadoop-${PORTVERSION} >X >XHADOOP_LOGDIR= /var/log/hadoop >XHADOOP_RUNDIR= /var/run/hadoop >X >XHDFS_USER= hdfs >XMAPRED_USER= mapred >XHADOOP_GROUP= hadoop >XUSERS= ${HDFS_USER} ${MAPRED_USER} >XGROUPS= ${HADOOP_GROUP} >X >XSUB_FILES= hadoop-layout.sh httpfs-env.sh kms-env.sh >XUSE_RC_SUBR= historyserver nodemanager resourcemanager webappproxyserver datanode namenode secondarynamenode journalnode zkfc >X >XPLIST_SUB= PORTVERSION="${PORTVERSION}" \ >X HADOOP_LOGDIR="${HADOOP_LOGDIR}" \ >X HADOOP_RUNDIR="${HADOOP_RUNDIR}" \ >X HDFS_USER="${HDFS_USER}" \ >X MAPRED_USER="${MAPRED_USER}" \ >X HADOOP_GROUP="${HADOOP_GROUP}" >XSUB_LIST= HDFS_USER="${HDFS_USER}" \ >X MAPRED_USER="${MAPRED_USER}" \ >X HADOOP_GROUP="${HADOOP_GROUP}" \ >X JAVA_HOME="${JAVA_HOME}" \ >X 
HADOOP_LOGDIR="${HADOOP_LOGDIR}" \ >X		HADOOP_RUNDIR="${HADOOP_RUNDIR}" >X >Xpost-patch: >X	${REINPLACE_CMD} -e "s#/bin/bash#${LOCALBASE}/bin/bash#" ${WRKSRC}/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/DefaultContainerExecutor.java ${WRKSRC}/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/MRJobConfig.java ${WRKSRC}/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/java/org/apache/hadoop/yarn/server/nodemanager/containermanager/launcher/ContainerLaunch.java ${WRKSRC}/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/Shell.java ${WRKSRC}/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh >X	${RM} ${WRKSRC}/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/src/main/native/src/lib/primitives.h >X >Xdo-build: >X	${MKDIR} ${WRKSRC}/hadoop-hdfs-project/hadoop-hdfs-httpfs/downloads >X	#${CP} ${DISTDIR}/${DIST_SUBDIR}/apache-tomcat-${TOMCAT_VERSION}.tar.gz ${WRKSRC}/hadoop-hdfs-project/hadoop-hdfs-httpfs/downloads/ >X	${MKDIR} ${WRKSRC}/hadoop-common-project/hadoop-kms/downloads >X	${CP} ${DISTDIR}/${DIST_SUBDIR}/apache-tomcat-${TOMCAT_VERSION}.tar.gz ${WRKSRC}/hadoop-common-project/hadoop-kms/downloads/ >X	# XXX: Remove skipping clean, javadoc; add docs to -P >X	cd ${WRKSRC} && ${SETENV} ${MAKE_ENV} ${LOCALBASE}/share/java/maven33/bin/mvn -Dmaven.javadoc.skip=true -Dmaven.clean.skip=true -Dmaven.repo.local=${WRKDIR}/m2 package -Pdist,native -DskipTests -Drequire.snappy -Dsnappy.prefix=${LOCALBASE} -Drequire.openssl -Drequire.zstd -Dzstd.prefix=${LOCALBASE} >X >Xpost-build: >X	${RM} ${HADOOP_DIST}/etc/hadoop/*.cmd >X# TODO xmj: reinvestigate this >X# # With jetty-6.1.26 tasktracker's threads hung with the following error: >X# # 
org.mortbay.io.nio.SelectorManager$SelectSet@abdcc1c JVM BUG(s) - injecting delay 59 times >X# # See https://issues.apache.org/jira/browse/MAPREDUCE-2386 >X# .for dir in share/hadoop/common/lib share/hadoop/hdfs/lib share/hadoop/yarn/lib share/hadoop/tools/lib >X# ${RM} ${HADOOP_DIST}/${dir}/jetty-util-6.1.26.jar ${HADOOP_DIST}/${dir}/jetty-6.1.26.jar >X# ${CP} ${WRKDIR}/jetty-6.1.14/lib/jetty-6.1.14.jar ${WRKDIR}/jetty-6.1.14/lib/jetty-util-6.1.14.jar ${HADOOP_DIST}/${dir}/ >X# .endfor >X# .for dir in share/hadoop/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib >X# ${RM} ${HADOOP_DIST}/${dir}/jetty-util-6.1.26.jar >X# ${CP} ${WRKDIR}/jetty-6.1.14/lib/jetty-util-6.1.14.jar ${HADOOP_DIST}/${dir}/ >X# .endfor >X >Xdo-install: >X cd ${HADOOP_DIST}/bin && ${INSTALL_SCRIPT} hadoop hdfs mapred yarn ${STAGEDIR}${PREFIX}/bin/ >X cd ${HADOOP_DIST} && ${COPYTREE_BIN} "libexec sbin" ${STAGEDIR}${PREFIX}/ "! -name *.cmd" >X cd ${HADOOP_DIST}/include && ${INSTALL_DATA} *h ${STAGEDIR}${PREFIX}/include/ >X cd ${HADOOP_DIST}/lib/native && ${INSTALL_DATA} *.a ${STAGEDIR}${PREFIX}/lib/ >X cd ${HADOOP_DIST}/lib/native && ${INSTALL_DATA} libhadoop.so.1.0.0 ${STAGEDIR}${PREFIX}/lib/libhadoop.so.1.0.0 >X ${LN} -sf libhadoop.so.1.0.0 ${STAGEDIR}${PREFIX}/lib/libhadoop.so >X cd ${HADOOP_DIST}/lib/native && ${INSTALL_DATA} libnativetask.so.1.0.0 ${STAGEDIR}${PREFIX}/lib/libnativetask.so.1.0.0 >X ${LN} -sf libnativetask.so.1.0.0 ${STAGEDIR}${PREFIX}/lib/libnativetask.so >X cd ${WRKSRC}/hadoop-hdfs-project/hadoop-hdfs-native-client/target/target/usr/local/lib/ && ${INSTALL_DATA} libhdfs.so.0.0.0 ${STAGEDIR}${PREFIX}/lib/libhdfs.so.0.0.0 && ${LN} -sf libhdfs.so.0.0.0 ${STAGEDIR}${PREFIX}/lib/libhdfs.so >X cd ${WRKSRC}/hadoop-hdfs-project/hadoop-hdfs-native-client/target/main/native/libhdfspp/ && ${INSTALL_DATA} libhdfspp.so.0.1.0 ${STAGEDIR}${PREFIX}/lib/libhdfspp.so.0.1.0 && ${LN} -sf libhdfspp.so.0.1.0 ${STAGEDIR}${PREFIX}/lib/libhdfspp.so >X cd 
${HADOOP_DIST}/share/hadoop && ${COPYTREE_SHARE} "*" ${STAGEDIR}${DATADIR}/ "! -name *-sources.jar -and ! -name sources" >X	#${CHMOD} a+x ${STAGEDIR}${DATADIR}/kms/tomcat/bin/*.sh ${STAGEDIR}${DATADIR}/httpfs/tomcat/bin/*.sh >X	${MKDIR} ${STAGEDIR}${EXAMPLESDIR}/conf >X	cd ${HADOOP_DIST}/etc/hadoop && ${COPYTREE_SHARE} "*" ${STAGEDIR}${EXAMPLESDIR}/conf/ >X	${INSTALL_DATA} ${WRKSRC}/hadoop-hdfs-project/hadoop-hdfs/target/classes/hdfs-default.xml ${WRKSRC}/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/classes/httpfs-default.xml ${WRKSRC}/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/target/classes/yarn-default.xml ${WRKSRC}/hadoop-common-project/hadoop-common/target/classes/core-default.xml ${WRKSRC}/hadoop-tools/hadoop-distcp/target/classes/distcp-default.xml ${WRKSRC}/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/classes/mapred-default.xml ${STAGEDIR}/${EXAMPLESDIR}/ >X	${INSTALL_DATA} ${WRKDIR}/hadoop-layout.sh ${STAGEDIR}${PREFIX}/libexec/ >X	${MKDIR} ${STAGEDIR}${ETCDIR} >X	${INSTALL_DATA} ${WRKDIR}/httpfs-env.sh ${STAGEDIR}${ETCDIR} >X	${INSTALL_DATA} ${WRKDIR}/kms-env.sh ${STAGEDIR}${ETCDIR} >X	${INSTALL_DATA} ${HADOOP_DIST}/etc/hadoop/core-site.xml ${STAGEDIR}${ETCDIR} >X	${INSTALL_DATA} ${HADOOP_DIST}/etc/hadoop/log4j.properties ${STAGEDIR}${ETCDIR} >X	${MKDIR} ${STAGEDIR}${HADOOP_LOGDIR} >X	${MKDIR} ${STAGEDIR}${HADOOP_RUNDIR} >X >X.include <bsd.port.mk> >2ffa086fbc1c64775833e7239b4785ed >echo x - hadoop3/distinfo >sed 's/^X//' >hadoop3/distinfo << 'd9341435e7dbb85b635c430fcc8351fa' >XTIMESTAMP = 1555851034 >XSHA256 (hadoop/hadoop-3.2.0-src.tar.gz) = c30d448d3712b518e892efdc189e7b3f81c4ce4b6532ebb981515f016f735568 >XSIZE (hadoop/hadoop-3.2.0-src.tar.gz) = 30751465 >XSHA256 (hadoop/apache-tomcat-6.0.53.tar.gz) = 35249a4b40f41fb5f602f5602142d59faaa96dc1567df807d108d4d2b942e2f0 >XSIZE (hadoop/apache-tomcat-6.0.53.tar.gz) = 7110610 >XSHA256 (hadoop/FreeBSD-hadoop3-3.2.0-maven-repository.tgz) = 8e94bf2e031297f31ffa625c02fafeb9f13650a94b68137b5587b65ba0700a30 >XSIZE (hadoop/FreeBSD-hadoop3-3.2.0-maven-repository.tgz) = 289859701 >d9341435e7dbb85b635c430fcc8351fa >echo x - hadoop3/pkg-plist >sed 's/^X//' >hadoop3/pkg-plist << '2c0c13c964ec79c7d10ac17ff4d6682e' >Xbin/%%HADOOP_GROUP%% >Xbin/%%HDFS_USER%% >Xbin/%%MAPRED_USER%% >Xbin/yarn >X%%ETCDIR%%/core-site.xml >X%%ETCDIR%%/httpfs-env.sh >X%%ETCDIR%%/kms-env.sh >X%%ETCDIR%%/log4j.properties >Xinclude/Pipes.hh >Xinclude/SerialUtils.hh >Xinclude/StringUtils.hh >Xinclude/TemplateFactory.hh >Xinclude/%%HDFS_USER%%.h >Xlib/lib%%HADOOP_GROUP%%.a >Xlib/lib%%HADOOP_GROUP%%.so >Xlib/lib%%HADOOP_GROUP%%.so.1.0.0 >Xlib/lib%%HADOOP_GROUP%%pipes.a >Xlib/lib%%HADOOP_GROUP%%utils.a >Xlib/lib%%HDFS_USER%%.so >Xlib/lib%%HDFS_USER%%.so.0.0.0 >Xlib/lib%%HDFS_USER%%pp.so >Xlib/lib%%HDFS_USER%%pp.so.0.1.0 >Xlib/libnativetask.a >Xlib/libnativetask.so >Xlib/libnativetask.so.1.0.0 >Xlibexec/%%HADOOP_GROUP%%-config.sh >Xlibexec/%%HADOOP_GROUP%%-functions.sh >Xlibexec/%%HADOOP_GROUP%%-layout.sh >Xlibexec/%%HADOOP_GROUP%%-layout.sh.example >Xlibexec/%%HDFS_USER%%-config.sh >Xlibexec/%%MAPRED_USER%%-config.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-aliyun.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-archive-logs.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-archives.sh 
>Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-aws.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-azure-datalake.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-azure.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-distcp.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-extras.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-gridmix.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-%%HDFS_USER%%.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-httpfs.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-kafka.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-kms.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-%%MAPRED_USER%%uce.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-openstack.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-rumen.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-s3guard.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-streaming.sh >Xlibexec/shellprofile.d/%%HADOOP_GROUP%%-yarn.sh >Xlibexec/tools/%%HADOOP_GROUP%%-archive-logs.sh >Xlibexec/tools/%%HADOOP_GROUP%%-archives.sh >Xlibexec/tools/%%HADOOP_GROUP%%-aws.sh >Xlibexec/tools/%%HADOOP_GROUP%%-distcp.sh >Xlibexec/tools/%%HADOOP_GROUP%%-extras.sh >Xlibexec/tools/%%HADOOP_GROUP%%-gridmix.sh >Xlibexec/tools/%%HADOOP_GROUP%%-resourceestimator.sh >Xlibexec/tools/%%HADOOP_GROUP%%-rumen.sh >Xlibexec/tools/%%HADOOP_GROUP%%-sls.sh >Xlibexec/tools/%%HADOOP_GROUP%%-streaming.sh >Xlibexec/yarn-config.sh >Xsbin/FederationStateStore/MySQL/FederationStateStoreDatabase.sql >Xsbin/FederationStateStore/MySQL/FederationStateStoreStoredProcs.sql >Xsbin/FederationStateStore/MySQL/FederationStateStoreTables.sql >Xsbin/FederationStateStore/MySQL/FederationStateStoreUser.sql >Xsbin/FederationStateStore/MySQL/dropDatabase.sql >Xsbin/FederationStateStore/MySQL/dropStoreProcedures.sql >Xsbin/FederationStateStore/MySQL/dropTables.sql >Xsbin/FederationStateStore/MySQL/dropUser.sql >Xsbin/FederationStateStore/SQLServer/FederationStateStoreStoreProcs.sql >Xsbin/FederationStateStore/SQLServer/FederationStateStoreTables.sql >Xsbin/distribute-exclude.sh 
>Xsbin/%%HADOOP_GROUP%%-daemon.sh >Xsbin/%%HADOOP_GROUP%%-daemons.sh >Xsbin/httpfs.sh >Xsbin/kms.sh >Xsbin/mr-jobhistory-daemon.sh >Xsbin/refresh-namenodes.sh >Xsbin/start-all.sh >Xsbin/start-balancer.sh >Xsbin/start-dfs.sh >Xsbin/start-secure-dns.sh >Xsbin/start-yarn.sh >Xsbin/stop-all.sh >Xsbin/stop-balancer.sh >Xsbin/stop-dfs.sh >Xsbin/stop-secure-dns.sh >Xsbin/stop-yarn.sh >Xsbin/workers.sh >Xsbin/yarn-daemon.sh >Xsbin/yarn-daemons.sh >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/capacity-scheduler.xml >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/configuration.xsl >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/container-executor.cfg >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/core-site.xml >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/%%HADOOP_GROUP%%-env.sh >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/%%HADOOP_GROUP%%-metrics2.properties >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/%%HADOOP_GROUP%%-policy.xml >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/%%HADOOP_GROUP%%-user-functions.sh.example >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/%%HDFS_USER%%-site.xml >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/httpfs-env.sh >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/httpfs-log4j.properties >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/httpfs-signature.secret >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/httpfs-site.xml >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/kms-acls.xml >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/kms-env.sh >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/kms-log4j.properties >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/kms-site.xml >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/log4j.properties >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/%%MAPRED_USER%%-env.sh >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/%%MAPRED_USER%%-queues.xml.template >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/%%MAPRED_USER%%-site.xml >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/shellprofile.d/example.sh >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/ssl-client.xml.example >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/ssl-server.xml.example >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/user_ec_policies.xml.template 
>X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/workers >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/yarn-env.sh >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/yarn-site.xml >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/conf/yarnservice-log4j.properties >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/core-default.xml >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/distcp-default.xml >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/%%HDFS_USER%%-default.xml >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/httpfs-default.xml >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/%%MAPRED_USER%%-default.xml >X%%PORTEXAMPLES%%%%EXAMPLESDIR%%/yarn-default.xml >X%%DATADIR%%/client/%%HADOOP_GROUP%%-client-api-%%PORTVERSION%%.jar >X%%DATADIR%%/client/%%HADOOP_GROUP%%-client-minicluster-%%PORTVERSION%%.jar >X%%DATADIR%%/client/%%HADOOP_GROUP%%-client-runtime-%%PORTVERSION%%.jar >X%%DATADIR%%/common/%%HADOOP_GROUP%%-common-%%PORTVERSION%%-tests.jar >X%%DATADIR%%/common/%%HADOOP_GROUP%%-common-%%PORTVERSION%%.jar >X%%DATADIR%%/common/%%HADOOP_GROUP%%-kms-%%PORTVERSION%%.jar >X%%DATADIR%%/common/%%HADOOP_GROUP%%-nfs-%%PORTVERSION%%.jar >X%%DATADIR%%/common/jdiff/Apache_Hadoop_Common_2.6.0.xml >X%%DATADIR%%/common/jdiff/Apache_Hadoop_Common_2.7.2.xml >X%%DATADIR%%/common/jdiff/Apache_Hadoop_Common_2.8.0.xml >X%%DATADIR%%/common/jdiff/Apache_Hadoop_Common_2.8.2.xml >X%%DATADIR%%/common/jdiff/Apache_Hadoop_Common_2.8.3.xml >X%%DATADIR%%/common/jdiff/Null.java >X%%DATADIR%%/common/jdiff/%%HADOOP_GROUP%%-core_0.20.0.xml >X%%DATADIR%%/common/jdiff/%%HADOOP_GROUP%%-core_0.21.0.xml >X%%DATADIR%%/common/jdiff/%%HADOOP_GROUP%%-core_0.22.0.xml >X%%DATADIR%%/common/jdiff/%%HADOOP_GROUP%%_0.17.0.xml >X%%DATADIR%%/common/jdiff/%%HADOOP_GROUP%%_0.18.1.xml >X%%DATADIR%%/common/jdiff/%%HADOOP_GROUP%%_0.18.2.xml >X%%DATADIR%%/common/jdiff/%%HADOOP_GROUP%%_0.18.3.xml >X%%DATADIR%%/common/jdiff/%%HADOOP_GROUP%%_0.19.0.xml >X%%DATADIR%%/common/jdiff/%%HADOOP_GROUP%%_0.19.1.xml >X%%DATADIR%%/common/jdiff/%%HADOOP_GROUP%%_0.19.2.xml >X%%DATADIR%%/common/jdiff/%%HADOOP_GROUP%%_0.20.0.xml 
>X%%DATADIR%%/common/jdiff/%%HADOOP_GROUP%%_0.20.1.xml >X%%DATADIR%%/common/jdiff/%%HADOOP_GROUP%%_0.20.2.xml >X%%DATADIR%%/common/lib/accessors-smart-1.2.jar >X%%DATADIR%%/common/lib/asm-5.0.4.jar >X%%DATADIR%%/common/lib/audience-annotations-0.5.0.jar >X%%DATADIR%%/common/lib/avro-1.7.7.jar >X%%DATADIR%%/common/lib/commons-beanutils-1.9.3.jar >X%%DATADIR%%/common/lib/commons-cli-1.2.jar >X%%DATADIR%%/common/lib/commons-codec-1.11.jar >X%%DATADIR%%/common/lib/commons-collections-3.2.2.jar >X%%DATADIR%%/common/lib/commons-compress-1.4.1.jar >X%%DATADIR%%/common/lib/commons-configuration2-2.1.1.jar >X%%DATADIR%%/common/lib/commons-io-2.5.jar >X%%DATADIR%%/common/lib/commons-lang3-3.7.jar >X%%DATADIR%%/common/lib/commons-logging-1.1.3.jar >X%%DATADIR%%/common/lib/commons-math3-3.1.1.jar >X%%DATADIR%%/common/lib/commons-net-3.6.jar >X%%DATADIR%%/common/lib/commons-text-1.4.jar >X%%DATADIR%%/common/lib/curator-client-2.12.0.jar >X%%DATADIR%%/common/lib/curator-framework-2.12.0.jar >X%%DATADIR%%/common/lib/curator-recipes-2.12.0.jar >X%%DATADIR%%/common/lib/dnsjava-2.1.7.jar >X%%DATADIR%%/common/lib/gson-2.2.4.jar >X%%DATADIR%%/common/lib/guava-11.0.2.jar >X%%DATADIR%%/common/lib/%%HADOOP_GROUP%%-annotations-%%PORTVERSION%%.jar >X%%DATADIR%%/common/lib/%%HADOOP_GROUP%%-auth-%%PORTVERSION%%.jar >X%%DATADIR%%/common/lib/htrace-core4-4.1.0-incubating.jar >X%%DATADIR%%/common/lib/httpclient-4.5.2.jar >X%%DATADIR%%/common/lib/httpcore-4.4.4.jar >X%%DATADIR%%/common/lib/jackson-annotations-2.9.5.jar >X%%DATADIR%%/common/lib/jackson-core-2.9.5.jar >X%%DATADIR%%/common/lib/jackson-core-asl-1.9.13.jar >X%%DATADIR%%/common/lib/jackson-databind-2.9.5.jar >X%%DATADIR%%/common/lib/jackson-jaxrs-1.9.13.jar >X%%DATADIR%%/common/lib/jackson-mapper-asl-1.9.13.jar >X%%DATADIR%%/common/lib/jackson-xc-1.9.13.jar >X%%DATADIR%%/common/lib/javax.servlet-api-3.1.0.jar >X%%DATADIR%%/common/lib/jaxb-api-2.2.11.jar >X%%DATADIR%%/common/lib/jaxb-impl-2.2.3-1.jar 
>X%%DATADIR%%/common/lib/jcip-annotations-1.0-1.jar
>X%%DATADIR%%/common/lib/jersey-core-1.19.jar
>X%%DATADIR%%/common/lib/jersey-json-1.19.jar
>X%%DATADIR%%/common/lib/jersey-server-1.19.jar
>X%%DATADIR%%/common/lib/jersey-servlet-1.19.jar
>X%%DATADIR%%/common/lib/jettison-1.1.jar
>X%%DATADIR%%/common/lib/jetty-http-9.3.24.v20180605.jar
>X%%DATADIR%%/common/lib/jetty-io-9.3.24.v20180605.jar
>X%%DATADIR%%/common/lib/jetty-security-9.3.24.v20180605.jar
>X%%DATADIR%%/common/lib/jetty-server-9.3.24.v20180605.jar
>X%%DATADIR%%/common/lib/jetty-servlet-9.3.24.v20180605.jar
>X%%DATADIR%%/common/lib/jetty-util-9.3.24.v20180605.jar
>X%%DATADIR%%/common/lib/jetty-webapp-9.3.24.v20180605.jar
>X%%DATADIR%%/common/lib/jetty-xml-9.3.24.v20180605.jar
>X%%DATADIR%%/common/lib/jsch-0.1.54.jar
>X%%DATADIR%%/common/lib/json-smart-2.3.jar
>X%%DATADIR%%/common/lib/jsp-api-2.1.jar
>X%%DATADIR%%/common/lib/jsr305-3.0.0.jar
>X%%DATADIR%%/common/lib/jsr311-api-1.1.1.jar
>X%%DATADIR%%/common/lib/jul-to-slf4j-1.7.25.jar
>X%%DATADIR%%/common/lib/kerb-admin-1.0.1.jar
>X%%DATADIR%%/common/lib/kerb-client-1.0.1.jar
>X%%DATADIR%%/common/lib/kerb-common-1.0.1.jar
>X%%DATADIR%%/common/lib/kerb-core-1.0.1.jar
>X%%DATADIR%%/common/lib/kerb-crypto-1.0.1.jar
>X%%DATADIR%%/common/lib/kerb-identity-1.0.1.jar
>X%%DATADIR%%/common/lib/kerb-server-1.0.1.jar
>X%%DATADIR%%/common/lib/kerb-simplekdc-1.0.1.jar
>X%%DATADIR%%/common/lib/kerb-util-1.0.1.jar
>X%%DATADIR%%/common/lib/kerby-asn1-1.0.1.jar
>X%%DATADIR%%/common/lib/kerby-config-1.0.1.jar
>X%%DATADIR%%/common/lib/kerby-pkix-1.0.1.jar
>X%%DATADIR%%/common/lib/kerby-util-1.0.1.jar
>X%%DATADIR%%/common/lib/kerby-xdr-1.0.1.jar
>X%%DATADIR%%/common/lib/log4j-1.2.17.jar
>X%%DATADIR%%/common/lib/metrics-core-3.2.4.jar
>X%%DATADIR%%/common/lib/netty-3.10.5.Final.jar
>X%%DATADIR%%/common/lib/nimbus-jose-jwt-4.41.1.jar
>X%%DATADIR%%/common/lib/paranamer-2.3.jar
>X%%DATADIR%%/common/lib/protobuf-java-2.5.0.jar
>X%%DATADIR%%/common/lib/re2j-1.1.jar
>X%%DATADIR%%/common/lib/slf4j-api-1.7.25.jar
>X%%DATADIR%%/common/lib/slf4j-log4j12-1.7.25.jar
>X%%DATADIR%%/common/lib/snappy-java-1.0.5.jar
>X%%DATADIR%%/common/lib/stax2-api-3.1.4.jar
>X%%DATADIR%%/common/lib/token-provider-1.0.1.jar
>X%%DATADIR%%/common/lib/woodstox-core-5.0.3.jar
>X%%DATADIR%%/common/lib/xz-1.0.jar
>X%%DATADIR%%/common/lib/zookeeper-3.4.13.jar
>X%%DATADIR%%/common/webapps/static/%%HADOOP_GROUP%%.css.gz
>X%%DATADIR%%/%%HDFS_USER%%/%%HADOOP_GROUP%%-%%HDFS_USER%%-%%PORTVERSION%%-tests.jar
>X%%DATADIR%%/%%HDFS_USER%%/%%HADOOP_GROUP%%-%%HDFS_USER%%-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%HDFS_USER%%/%%HADOOP_GROUP%%-%%HDFS_USER%%-client-%%PORTVERSION%%-tests.jar
>X%%DATADIR%%/%%HDFS_USER%%/%%HADOOP_GROUP%%-%%HDFS_USER%%-client-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%HDFS_USER%%/%%HADOOP_GROUP%%-%%HDFS_USER%%-httpfs-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%HDFS_USER%%/%%HADOOP_GROUP%%-%%HDFS_USER%%-native-client-%%PORTVERSION%%-tests.jar
>X%%DATADIR%%/%%HDFS_USER%%/%%HADOOP_GROUP%%-%%HDFS_USER%%-native-client-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%HDFS_USER%%/%%HADOOP_GROUP%%-%%HDFS_USER%%-nfs-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%HDFS_USER%%/%%HADOOP_GROUP%%-%%HDFS_USER%%-rbf-%%PORTVERSION%%-tests.jar
>X%%DATADIR%%/%%HDFS_USER%%/%%HADOOP_GROUP%%-%%HDFS_USER%%-rbf-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/Apache_Hadoop_HDFS_2.6.0.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/Apache_Hadoop_HDFS_2.7.2.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/Apache_Hadoop_HDFS_2.8.0.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/Apache_Hadoop_HDFS_2.8.2.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/Apache_Hadoop_HDFS_2.8.3.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/Apache_Hadoop_HDFS_2.9.1.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/Apache_Hadoop_HDFS_3.0.0-alpha2.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/Apache_Hadoop_HDFS_3.0.0-alpha3.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/Apache_Hadoop_HDFS_3.0.0-alpha4.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/Apache_Hadoop_HDFS_3.0.0.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/Apache_Hadoop_HDFS_3.0.1.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/Apache_Hadoop_HDFS_3.0.2.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/Apache_Hadoop_HDFS_3.0.3.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/Apache_Hadoop_HDFS_3.1.0.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/Apache_Hadoop_HDFS_3.1.1.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/Null.java
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/%%HADOOP_GROUP%%-%%HDFS_USER%%_0.20.0.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/%%HADOOP_GROUP%%-%%HDFS_USER%%_0.21.0.xml
>X%%DATADIR%%/%%HDFS_USER%%/jdiff/%%HADOOP_GROUP%%-%%HDFS_USER%%_0.22.0.xml
>X%%DATADIR%%/%%HDFS_USER%%/lib/accessors-smart-1.2.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/asm-5.0.4.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/audience-annotations-0.5.0.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/avro-1.7.7.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/commons-beanutils-1.9.3.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/commons-cli-1.2.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/commons-codec-1.11.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/commons-collections-3.2.2.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/commons-compress-1.4.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/commons-configuration2-2.1.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/commons-daemon-1.0.13.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/commons-io-2.5.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/commons-lang3-3.7.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/commons-logging-1.1.3.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/commons-math3-3.1.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/commons-net-3.6.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/commons-text-1.4.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/curator-client-2.12.0.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/curator-framework-2.12.0.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/curator-recipes-2.12.0.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/dnsjava-2.1.7.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/gson-2.2.4.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/guava-11.0.2.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/%%HADOOP_GROUP%%-annotations-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/%%HADOOP_GROUP%%-auth-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/htrace-core4-4.1.0-incubating.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/httpclient-4.5.2.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/httpcore-4.4.4.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jackson-annotations-2.9.5.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jackson-core-2.9.5.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jackson-core-asl-1.9.13.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jackson-databind-2.9.5.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jackson-jaxrs-1.9.13.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jackson-mapper-asl-1.9.13.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jackson-xc-1.9.13.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/javax.servlet-api-3.1.0.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jaxb-api-2.2.11.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jaxb-impl-2.2.3-1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jcip-annotations-1.0-1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jersey-core-1.19.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jersey-json-1.19.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jersey-server-1.19.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jersey-servlet-1.19.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jettison-1.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jetty-http-9.3.24.v20180605.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jetty-io-9.3.24.v20180605.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jetty-security-9.3.24.v20180605.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jetty-server-9.3.24.v20180605.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jetty-servlet-9.3.24.v20180605.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jetty-util-9.3.24.v20180605.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jetty-util-ajax-9.3.24.v20180605.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jetty-webapp-9.3.24.v20180605.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jetty-xml-9.3.24.v20180605.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jsch-0.1.54.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/json-simple-1.1.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/json-smart-2.3.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jsr305-3.0.0.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/jsr311-api-1.1.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/kerb-admin-1.0.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/kerb-client-1.0.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/kerb-common-1.0.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/kerb-core-1.0.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/kerb-crypto-1.0.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/kerb-identity-1.0.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/kerb-server-1.0.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/kerb-simplekdc-1.0.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/kerb-util-1.0.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/kerby-asn1-1.0.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/kerby-config-1.0.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/kerby-pkix-1.0.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/kerby-util-1.0.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/kerby-xdr-1.0.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/leveldbjni-all-1.8.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/log4j-1.2.17.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/netty-3.10.5.Final.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/netty-all-4.0.52.Final.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/nimbus-jose-jwt-4.41.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/okhttp-2.7.5.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/okio-1.6.0.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/paranamer-2.3.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/protobuf-java-2.5.0.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/re2j-1.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/snappy-java-1.0.5.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/stax2-api-3.1.4.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/token-provider-1.0.1.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/woodstox-core-5.0.3.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/xz-1.0.jar
>X%%DATADIR%%/%%HDFS_USER%%/lib/zookeeper-3.4.13.jar
>X%%DATADIR%%/%%HDFS_USER%%/webapps/datanode/WEB-INF/web.xml
>X%%DATADIR%%/%%HDFS_USER%%/webapps/datanode/datanode.html
>X%%DATADIR%%/%%HDFS_USER%%/webapps/datanode/dn.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/datanode/index.html
>X%%DATADIR%%/%%HDFS_USER%%/webapps/datanode/robots.txt
>X%%DATADIR%%/%%HDFS_USER%%/webapps/%%HDFS_USER%%/WEB-INF/web.xml
>X%%DATADIR%%/%%HDFS_USER%%/webapps/%%HDFS_USER%%/dfshealth.html
>X%%DATADIR%%/%%HDFS_USER%%/webapps/%%HDFS_USER%%/dfshealth.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/%%HDFS_USER%%/explorer.html
>X%%DATADIR%%/%%HDFS_USER%%/webapps/%%HDFS_USER%%/explorer.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/%%HDFS_USER%%/index.html
>X%%DATADIR%%/%%HDFS_USER%%/webapps/%%HDFS_USER%%/robots.txt
>X%%DATADIR%%/%%HDFS_USER%%/webapps/journal/WEB-INF/web.xml
>X%%DATADIR%%/%%HDFS_USER%%/webapps/journal/index.html
>X%%DATADIR%%/%%HDFS_USER%%/webapps/journal/robots.txt
>X%%DATADIR%%/%%HDFS_USER%%/webapps/nfs3/WEB-INF/web.xml
>X%%DATADIR%%/%%HDFS_USER%%/webapps/router/WEB-INF/web.xml
>X%%DATADIR%%/%%HDFS_USER%%/webapps/router/federationhealth.html
>X%%DATADIR%%/%%HDFS_USER%%/webapps/router/federationhealth.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/router/index.html
>X%%DATADIR%%/%%HDFS_USER%%/webapps/router/robots.txt
>X%%DATADIR%%/%%HDFS_USER%%/webapps/secondary/WEB-INF/web.xml
>X%%DATADIR%%/%%HDFS_USER%%/webapps/secondary/index.html
>X%%DATADIR%%/%%HDFS_USER%%/webapps/secondary/robots.txt
>X%%DATADIR%%/%%HDFS_USER%%/webapps/secondary/snn.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/secondary/status.html
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/css/bootstrap-editable.css
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/css/bootstrap-editable.css.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/css/bootstrap-theme.css
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/css/bootstrap-theme.css.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/css/bootstrap-theme.css.map
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/css/bootstrap-theme.min.css
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/css/bootstrap-theme.min.css.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/css/bootstrap-theme.min.css.map
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/css/bootstrap.css
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/css/bootstrap.css.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/css/bootstrap.css.map
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/css/bootstrap.min.css
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/css/bootstrap.min.css.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/css/bootstrap.min.css.map
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/fonts/glyphicons-halflings-regular.eot
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/fonts/glyphicons-halflings-regular.svg
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/fonts/glyphicons-halflings-regular.ttf
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/fonts/glyphicons-halflings-regular.woff
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/fonts/glyphicons-halflings-regular.woff2
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/js/bootstrap-editable.min.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/js/bootstrap-editable.min.js.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/js/bootstrap.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/js/bootstrap.js.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/js/bootstrap.min.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/js/bootstrap.min.js.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/js/npm.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/bootstrap-3.3.7/js/npm.js.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/d3-v4.1.1.min.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/d3-v4.1.1.min.js.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/dataTables.bootstrap.css
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/dataTables.bootstrap.css.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/dataTables.bootstrap.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/dataTables.bootstrap.js.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/dfs-dust.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/dfs-dust.js.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/dust-full-2.0.0.min.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/dust-full-2.0.0.min.js.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/dust-helpers-1.1.1.min.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/dust-helpers-1.1.1.min.js.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/%%HADOOP_GROUP%%.css
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/%%HADOOP_GROUP%%.css.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/jquery-3.3.1.min.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/jquery-3.3.1.min.js.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/jquery.dataTables.min.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/jquery.dataTables.min.js.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/json-bignum.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/json-bignum.js.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/moment.min.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/moment.min.js.gz
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/rbf.css
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/rest-csrf.js
>X%%DATADIR%%/%%HDFS_USER%%/webapps/static/rest-csrf.js.gz
>X%%DATADIR%%/%%MAPRED_USER%%uce/%%HADOOP_GROUP%%-%%MAPRED_USER%%uce-client-app-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%MAPRED_USER%%uce/%%HADOOP_GROUP%%-%%MAPRED_USER%%uce-client-common-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%MAPRED_USER%%uce/%%HADOOP_GROUP%%-%%MAPRED_USER%%uce-client-core-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%MAPRED_USER%%uce/%%HADOOP_GROUP%%-%%MAPRED_USER%%uce-client-hs-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%MAPRED_USER%%uce/%%HADOOP_GROUP%%-%%MAPRED_USER%%uce-client-hs-plugins-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%MAPRED_USER%%uce/%%HADOOP_GROUP%%-%%MAPRED_USER%%uce-client-jobclient-%%PORTVERSION%%-tests.jar
>X%%DATADIR%%/%%MAPRED_USER%%uce/%%HADOOP_GROUP%%-%%MAPRED_USER%%uce-client-jobclient-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%MAPRED_USER%%uce/%%HADOOP_GROUP%%-%%MAPRED_USER%%uce-client-nativetask-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%MAPRED_USER%%uce/%%HADOOP_GROUP%%-%%MAPRED_USER%%uce-client-shuffle-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%MAPRED_USER%%uce/%%HADOOP_GROUP%%-%%MAPRED_USER%%uce-client-uploader-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%MAPRED_USER%%uce/%%HADOOP_GROUP%%-%%MAPRED_USER%%uce-examples-%%PORTVERSION%%.jar
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_Common_2.6.0.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_Common_2.7.2.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_Common_2.8.0.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_Common_2.8.2.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_Common_2.8.3.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_Common_3.1.0.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_Core_2.6.0.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_Core_2.7.2.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_Core_2.8.0.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_Core_2.8.2.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_Core_2.8.3.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_Core_3.1.0.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_JobClient_2.6.0.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_JobClient_2.7.2.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_JobClient_2.8.0.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_JobClient_2.8.2.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_JobClient_2.8.3.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Apache_Hadoop_MapReduce_JobClient_3.1.0.xml
>X%%DATADIR%%/%%MAPRED_USER%%uce/jdiff/Null.java
>X%%DATADIR%%/%%MAPRED_USER%%uce/lib-examples/hsqldb-2.3.4.jar
>X%%DATADIR%%/%%MAPRED_USER%%uce/lib/hamcrest-core-1.3.jar
>X%%DATADIR%%/%%MAPRED_USER%%uce/lib/junit-4.11.jar
>X%%DATADIR%%/tools/lib/aliyun-sdk-oss-2.8.3.jar
>X%%DATADIR%%/tools/lib/aws-java-sdk-bundle-1.11.375.jar
>X%%DATADIR%%/tools/lib/azure-data-lake-store-sdk-2.2.9.jar
>X%%DATADIR%%/tools/lib/azure-keyvault-core-1.0.0.jar
>X%%DATADIR%%/tools/lib/azure-storage-7.0.0.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-aliyun-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-archive-logs-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-archives-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-aws-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-azure-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-azure-datalake-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-datajoin-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-distcp-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-extras-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-fs2img-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-gridmix-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-kafka-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-openstack-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-resourceestimator-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-rumen-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-sls-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/%%HADOOP_GROUP%%-streaming-%%PORTVERSION%%.jar
>X%%DATADIR%%/tools/lib/jdom-1.1.jar
>X%%DATADIR%%/tools/lib/kafka-clients-0.8.2.1.jar
>X%%DATADIR%%/tools/lib/lz4-1.2.0.jar
>X%%DATADIR%%/tools/lib/ojalgo-43.0.jar
>X%%DATADIR%%/tools/lib/wildfly-openssl-1.0.4.Final.jar
>X%%DATADIR%%/tools/resourceestimator/bin/estimator.cmd
>X%%DATADIR%%/tools/resourceestimator/bin/estimator.sh
>X%%DATADIR%%/tools/resourceestimator/bin/start-estimator.cmd
>X%%DATADIR%%/tools/resourceestimator/bin/start-estimator.sh
>X%%DATADIR%%/tools/resourceestimator/bin/stop-estimator.cmd
>X%%DATADIR%%/tools/resourceestimator/bin/stop-estimator.sh
>X%%DATADIR%%/tools/resourceestimator/conf/resourceestimator-config.xml
>X%%DATADIR%%/tools/resourceestimator/data/resourceEstimatorService.txt
>X%%DATADIR%%/tools/sls/bin/rumen2sls.sh
>X%%DATADIR%%/tools/sls/bin/slsrun.sh
>X%%DATADIR%%/tools/sls/html/css/bootstrap-responsive.min.css
>X%%DATADIR%%/tools/sls/html/css/bootstrap.min.css
>X%%DATADIR%%/tools/sls/html/js/thirdparty/bootstrap.min.js
>X%%DATADIR%%/tools/sls/html/js/thirdparty/d3-LICENSE
>X%%DATADIR%%/tools/sls/html/js/thirdparty/d3.v3.js
>X%%DATADIR%%/tools/sls/html/js/thirdparty/jquery.js
>X%%DATADIR%%/tools/sls/html/showSimulationTrace.html
>X%%DATADIR%%/tools/sls/html/simulate.html.template
>X%%DATADIR%%/tools/sls/html/simulate.info.html.template
>X%%DATADIR%%/tools/sls/html/track.html.template
>X%%DATADIR%%/tools/sls/sample-conf/capacity-scheduler.xml
>X%%DATADIR%%/tools/sls/sample-conf/fair-scheduler.xml
>X%%DATADIR%%/tools/sls/sample-conf/log4j.properties
>X%%DATADIR%%/tools/sls/sample-conf/sls-runner.xml
>X%%DATADIR%%/tools/sls/sample-conf/yarn-site.xml
>X%%DATADIR%%/tools/sls/sample-data/2jobs2min-rumen-jh.json
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-api-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-applications-distributedshell-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-applications-unmanaged-am-launcher-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-client-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-common-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-registry-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-server-applicationhistoryservice-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-server-common-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-server-nodemanager-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-server-resourcemanager-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-server-router-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-server-sharedcachemanager-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-server-tests-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-server-timeline-pluginstorage-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-server-web-proxy-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-services-api-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-services-core-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/%%HADOOP_GROUP%%-yarn-submarine-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/lib/HikariCP-java7-2.4.12.jar
>X%%DATADIR%%/yarn/lib/aopalliance-1.0.jar
>X%%DATADIR%%/yarn/lib/ehcache-3.3.1.jar
>X%%DATADIR%%/yarn/lib/fst-2.50.jar
>X%%DATADIR%%/yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.jar
>X%%DATADIR%%/yarn/lib/guice-4.0.jar
>X%%DATADIR%%/yarn/lib/guice-servlet-4.0.jar
>X%%DATADIR%%/yarn/lib/jackson-jaxrs-base-2.9.5.jar
>X%%DATADIR%%/yarn/lib/jackson-jaxrs-json-provider-2.9.5.jar
>X%%DATADIR%%/yarn/lib/jackson-module-jaxb-annotations-2.9.5.jar
>X%%DATADIR%%/yarn/lib/java-util-1.9.0.jar
>X%%DATADIR%%/yarn/lib/javax.inject-1.jar
>X%%DATADIR%%/yarn/lib/jersey-client-1.19.jar
>X%%DATADIR%%/yarn/lib/jersey-guice-1.19.jar
>X%%DATADIR%%/yarn/lib/json-io-2.5.1.jar
>X%%DATADIR%%/yarn/lib/metrics-core-3.2.4.jar
>X%%DATADIR%%/yarn/lib/mssql-jdbc-6.2.1.jre7.jar
>X%%DATADIR%%/yarn/lib/objenesis-1.0.jar
>X%%DATADIR%%/yarn/lib/snakeyaml-1.16.jar
>X%%DATADIR%%/yarn/lib/swagger-annotations-1.5.4.jar
>X%%DATADIR%%/yarn/test/%%HADOOP_GROUP%%-yarn-server-tests-%%PORTVERSION%%-tests.jar
>X%%DATADIR%%/yarn/timelineservice/%%HADOOP_GROUP%%-yarn-server-timelineservice-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/timelineservice/%%HADOOP_GROUP%%-yarn-server-timelineservice-hbase-client-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/timelineservice/%%HADOOP_GROUP%%-yarn-server-timelineservice-hbase-common-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/timelineservice/%%HADOOP_GROUP%%-yarn-server-timelineservice-hbase-coprocessor-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/timelineservice/lib/commons-csv-1.0.jar
>X%%DATADIR%%/yarn/timelineservice/lib/commons-lang-2.6.jar
>X%%DATADIR%%/yarn/timelineservice/lib/hbase-annotations-1.2.6.jar
>X%%DATADIR%%/yarn/timelineservice/lib/hbase-client-1.2.6.jar
>X%%DATADIR%%/yarn/timelineservice/lib/hbase-common-1.2.6.jar
>X%%DATADIR%%/yarn/timelineservice/lib/hbase-protocol-1.2.6.jar
>X%%DATADIR%%/yarn/timelineservice/lib/htrace-core-3.1.0-incubating.jar
>X%%DATADIR%%/yarn/timelineservice/lib/jcodings-1.0.13.jar
>X%%DATADIR%%/yarn/timelineservice/lib/joni-2.1.2.jar
>X%%DATADIR%%/yarn/timelineservice/lib/metrics-core-2.2.0.jar
>X%%DATADIR%%/yarn/timelineservice/test/%%HADOOP_GROUP%%-yarn-server-timelineservice-hbase-tests-%%PORTVERSION%%.jar
>X%%DATADIR%%/yarn/yarn-service-examples/httpd-no-dns/httpd-no-dns.json
>X%%DATADIR%%/yarn/yarn-service-examples/httpd-no-dns/httpd-proxy-no-dns.conf
>X%%DATADIR%%/yarn/yarn-service-examples/httpd/httpd-proxy.conf
>X%%DATADIR%%/yarn/yarn-service-examples/httpd/httpd.json
>X%%DATADIR%%/yarn/yarn-service-examples/sleeper/sleeper.json
>X@dir(,%%HADOOP_GROUP%%,0775) %%HADOOP_LOGDIR%%
>X@dir(,%%HADOOP_GROUP%%,0775) %%HADOOP_RUNDIR%%
>2c0c13c964ec79c7d10ac17ff4d6682e
>exit
>