Bug 231048 - [new port] devel/hadoop3: Apache Map/Reduce Framework v3.2
Summary: [new port] devel/hadoop3: Apache Map/Reduce Framework v3.2
Status: New
Alias: None
Product: Ports & Packages
Classification: Unclassified
Component: Individual Port(s)
Version: Latest
Hardware: Any
OS: Any
Importance: --- Affects Only Me
Assignee: freebsd-ports-bugs (Nobody)
URL:
Keywords:
Depends on:
Blocks: 237481
Reported: 2018-08-31 05:17 UTC by Johannes Jost Meixner
Modified: 2023-08-23 05:41 UTC
CC List: 7 users

See Also:


Attachments
hadoop3 shar (50.56 KB, text/plain), 2018-08-31 05:17 UTC, Johannes Jost Meixner
Hadoop 3.2 shell archive (121.41 KB, text/plain), 2019-04-21 15:34 UTC, Johannes Jost Meixner
CRF on hadoop 3.2 (107.73 KB, text/plain), 2019-04-23 16:45 UTC, Johannes Jost Meixner
CRF for hadoop 3.2 (130.53 KB, text/plain), 2019-07-27 20:20 UTC, Johannes Jost Meixner
hadoop3v2.shar (107.62 KB, application/x-shellscript), 2022-09-18 21:23 UTC, Martin Filla
hadoop3v2.shar (107.65 KB, application/x-shellscript), 2022-09-18 21:31 UTC, Martin Filla

Description Johannes Jost Meixner freebsd_committer freebsd_triage 2018-08-31 05:17:56 UTC
Created attachment 196734 [details]
hadoop3 shar

The attached SHAR adds a preliminary devel/hadoop3 port to the ports tree.

What isn't done yet:
* porting over setsid/ssid patches from hadoop2
* rc.d scripts need a lot of work
* etc/ samples need a lot of work
* testing actual map/reduce through YARN & friends
* lots of polishing, testing

What's done:
* testing that namenode + datanode can play nice together
* the package builds... to some degree

What would be a great idea before committing:

* renaming hadoop to hadoop1
* renaming hadoop3 to hadoop
* ensuring dependencies match

I am hoping to get through most of these items within the next month or two, and would appreciate any help, patches, or feedback.
Comment 1 Johannes Jost Meixner freebsd_committer freebsd_triage 2018-08-31 05:22:50 UTC
Two more things:

* the work/m2 directory needs to be hosted somewhere else, and the :maven group in DISTFILES should be adjusted accordingly
* once that's done, the maven call in do-build should have `--offline` appended (rough sketch below)
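For illustration, one way this is often handled is to publish the pre-seeded local Maven repository as an extra distfile in its own group and build against it offline. The MASTER_SITES URL, the archive name, and the mvn arguments below are placeholders, not the port's actual values:

# hypothetical Makefile fragment
MASTER_SITES+=	https://example.org/hadoop3-distcache/:maven
DISTFILES+=	hadoop3-m2-cache.tar.gz:maven

do-build:
	cd ${WRKSRC} && ${SETENV} ${MAKE_ENV} mvn package --offline \
		-Dmaven.repo.local=${WRKDIR}/m2 -DskipTests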
Comment 2 Kurt Jaeger freebsd_committer freebsd_triage 2018-08-31 17:46:04 UTC
I wasn't aware that it was possible to create a shar file that does not
create the required directories?

[...]
x - hadoop3/files/zkfc.in
/tmp/had: cannot create hadoop3/files/zkfc.in: No such file or directory

Hmm. Any idea ?
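For reference, shar(1) only records the operands it is handed, so directories have to be listed explicitly when the archive is generated. A minimal sketch, assuming the port skeleton lives under /usr/ports/devel:

# regenerate the archive with mkdir entries for every directory (illustrative)
cd /usr/ports/devel && shar `find hadoop3` > /tmp/hadoop3.shar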
Comment 3 Kurt Jaeger freebsd_committer freebsd_triage 2018-08-31 18:08:39 UTC
Ok, I can cope with mkdir in small numbers 8-)

testbuild@work, after I tried to silence portlint by butchering the Makefile 8-}
Comment 4 Kurt Jaeger freebsd_committer freebsd_triage 2018-09-01 08:00:07 UTC
testbuild failed on 11.1a, see

http://people.freebsd.org/~pi/logs/devel__hadoop3-111.txt
Comment 5 Kurt Jaeger freebsd_committer freebsd_triage 2018-09-01 11:54:47 UTC
Fails with the same problem on cur:

http://people.freebsd.org/~pi/logs/devel__hadoop3-cur.txt
Comment 6 Kurt Jaeger freebsd_committer freebsd_triage 2018-09-03 19:04:07 UTC
Port is still being worked on, ETA end of October.
Comment 7 Johannes Jost Meixner freebsd_committer freebsd_triage 2019-04-21 15:34:20 UTC
Created attachment 203866 [details]
Hadoop 3.2 shell archive

Bump to Hadoop 3.2 

- builds fine (for me, in poudriere testport)
- after adding the configuration from [1], namenode and datanode can be started (see the smoke-test sketch below)
- I've carried over the setsid->ssid patches from devel/hadoop2
- I've had to import and extend https://jira.apache.org/jira/browse/MAPREDUCE-6417

[1] https://hadoop.apache.org/docs/r3.2.0/hadoop-project-dist/hadoop-common/SingleCluster.html
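A quick smoke test of that setup after installing the package might look like the lines below; the rc.d script names (namenode, datanode) are an assumption, carried over from the devel/hadoop2 layout:

# enable and start the daemons, then ask HDFS for a cluster report
# (run as root; dfsadmin is usually run as the hdfs user once [1] is configured)
sysrc namenode_enable=YES datanode_enable=YES
service namenode start
service datanode start
hdfs dfsadmin -report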
Comment 8 Kurt Jaeger freebsd_committer freebsd_triage 2019-04-23 05:35:40 UTC
portlint has some comments. One of them is that patch filenames longer
than 100 characters are FATAL. I have no idea how to handle this.

testbuilds@work.
Comment 9 Kurt Jaeger freebsd_committer freebsd_triage 2019-04-23 15:46:27 UTC
Build fails on 11.2amd64, but as it's a new port, I guess that can be tolerated.

https://people.freebsd.org/~pi/logs/devel__hadoop3-112-1555997211.txt

Build fails on current-i386, but I guess that's also expected.

I'll ask portmgr about the overlong filenames.
Comment 10 Tobias Kortkamp freebsd_committer freebsd_triage 2019-04-23 15:59:08 UTC
(In reply to Johannes Jost Meixner from comment #7)
> - I've carried over the setsid->ssid patches from devel/hadoop2

sysutils/ssid installs as setsid too since ports r482969, so this seems
unnecessary and some of the patches could be dropped.
Comment 11 Johannes Jost Meixner freebsd_committer freebsd_triage 2019-04-23 16:45:10 UTC
Created attachment 203936 [details]
CRF on hadoop 3.2

- remove setsid/ssid patches
- fix @dir in pkg-plist / user:group info
- remove bogus .orig files from pkg-plist

Once I get a working arcanist setup on the buildbox I will close this PR and move everything to Phabricator. Might make reviewing / commenting things a lot easier - especially given that there are still a few renames and new ports forthcoming.

Thanks tobik@ and lwhsu@ for the feedback!
Comment 12 Johannes Jost Meixner freebsd_committer freebsd_triage 2019-04-23 16:46:58 UTC
(In reply to Kurt Jaeger from comment #9)
I'm guessing the build failure is due to different libc versions.

I removed the USE_GCC line so it would build with Clang. Might have to add some `USES+= compiler` magic!
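That knob would presumably be a single USES line; which language level the native libhdfs/libhdfspp code actually needs is a guess:

# hypothetical; adjust the requested compiler feature as needed
USES+=	compiler:c++11-lang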
Comment 13 Kurt Jaeger freebsd_committer freebsd_triage 2019-04-23 17:30:59 UTC
testbuilds@work
Comment 14 Kurt Jaeger freebsd_committer freebsd_triage 2019-04-28 19:44:41 UTC
builds on current, test-build on 12 @work
Comment 15 Kurt Jaeger freebsd_committer freebsd_triage 2019-04-28 19:52:56 UTC
Builds on 12.0, fails to build on 11.2:

https://people.freebsd.org/~pi/logs/devel__hadoop3-112.txt

Is it supposed to build on 11.2 or is that an expected failure?
Comment 16 Johannes Jost Meixner freebsd_committer freebsd_triage 2019-05-13 02:31:21 UTC
(In reply to Kurt Jaeger from comment #15)
So far I've really only tested it on 13.0-CURRENT, as that's what my buildbox runs.
Comment 17 Yuri Victorovich freebsd_committer freebsd_triage 2019-06-12 16:03:53 UTC
On 12amd64 there's this plist error:
> ===> Checking for items in STAGEDIR missing from pkg-plist
> Error: Orphaned: @dir %%HADOOP_LOGDIR%%
> ===> Checking for items in pkg-plist which are not in STAGEDIR
> Error: Missing: @dir %%HADOOP_LOGDIR%%
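When check-plist flags the same @dir entry as both orphaned and missing, the two sides often disagree on the substituted path or on the owner/group/mode. A sketch of keeping them aligned, where the owner, group, and mode are assumptions:

# Makefile (illustrative)
PLIST_SUB+=	HADOOP_LOGDIR=${HADOOP_LOGDIR}

post-install:
	${MKDIR} ${STAGEDIR}${HADOOP_LOGDIR}

# pkg-plist (illustrative)
@dir(hdfs,hadoop,750) %%HADOOP_LOGDIR%%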
Comment 18 Johannes Jost Meixner freebsd_committer freebsd_triage 2019-07-27 20:20:32 UTC
Created attachment 206104 [details]
CRF for hadoop 3.2

* improved DEPENDS: added LIB_DEPENDS on protobuf25
* IGNORE on FreeBSD 11: I can't figure out how to make C++'s getline definition (istream) play nice with C's getline definition (stdio.h) (rough sketch below)
* Build tested w/ poudriere testport on 12-STABLE and 13-CURRENT
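Assuming the per-release spelling of the knob is used, the guard could be as simple as:

# hypothetical; mark only 11.x as unsupported because of the getline clash
IGNORE_FreeBSD_11=	does not build: stdio.h getline(3) conflicts with std::getline from <istream>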
Comment 19 Yuri Victorovich freebsd_committer freebsd_triage 2019-07-27 22:10:09 UTC
(In reply to Johannes Jost Meixner from comment #18)

It fails for me:
[WARNING] cd /usr/ports/devel/hadoop3/work/hadoop-3.2.0-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/main/native/libhdfs && /usr/bin/cc -DLIBHDFS_DLL_EXPORT -I/usr/ports/devel/hadoop3/work/hadoop-3.2.0-src/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfs/include -I/usr/ports/devel/hadoop3/work/hadoop-3.2.0-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target/native/javah -I/usr/ports/devel/hadoop3/work/hadoop-3.2.0-src/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfs -I/usr/ports/devel/hadoop3/work/hadoop-3.2.0-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target -I/usr/local/openjdk8/include -I/usr/local/openjdk8/include/freebsd -I/usr/ports/devel/hadoop3/work/hadoop-3.2.0-src/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfs/main/native -I/usr/ports/devel/hadoop3/work/hadoop-3.2.0-src/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfs/main/native/libhdfs -I/usr/ports/devel/hadoop3/work/hadoop-3.2.0-src/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfs/os/posix -O2 -pipe -fno-omit-frame-pointer  -fstack-protector-strong -fno-strict-aliasing -g -O2 -Wall -pthread -D_FILE_OFFSET_BITS=64 -fvisibility=hidden -o CMakeFiles/test_libhdfs_threaded_hdfs_static.dir/os/posix/thread.c.o   -c /usr/ports/devel/hadoop3/work/hadoop-3.2.0-src/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfs/os/posix/thread.c
[WARNING] --- main/native/libhdfspp/lib/proto/CMakeFiles/protoc-gen-hrpc.dir/all ---
[WARNING] ld: error: undefined symbol: google::protobuf::compiler::CodeGenerator::GenerateAll(std::__1::vector<google::protobuf::FileDescriptor const*, std::__1::allocator<google::protobuf::FileDescriptor const*> > const&, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, google::protobuf::compiler::GeneratorContext*, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >*) const
[WARNING] >>> referenced by protoc_gen_hrpc.cc
[WARNING] >>>               CMakeFiles/protoc-gen-hrpc.dir/protoc_gen_hrpc.cc.o:(vtable for StubGenerator)
[WARNING] c++: error: linker command failed with exit code 1 (use -v to see invocation)
[WARNING] *** [main/native/libhdfspp/lib/proto/protoc-gen-hrpc] Error code 1
[WARNING] 
[WARNING] make[4]: stopped in /usr/ports/devel/hadoop3/work/hadoop-3.2.0-src/hadoop-hdfs-project/hadoop-hdfs-native-client/target
[WARNING] 1 error
[WARNING]
Comment 20 Johannes Jost Meixner freebsd_committer freebsd_triage 2019-07-28 09:37:33 UTC
(In reply to Yuri Victorovich from comment #19)
Which OS release is this, which linker are you using (ld.lld or ld.bfd), and is this in poudriere?
Comment 21 Yuri Victorovich freebsd_committer freebsd_triage 2019-07-28 09:50:28 UTC
(In reply to Johannes Jost Meixner from comment #20)

FreeBSD 12.0-STABLE r347548

LLD 8.0.0 (FreeBSD 356365-1200007) (compatible with GNU linkers)

No poudriere.
Comment 22 Johannes Jost Meixner freebsd_committer freebsd_triage 2019-07-28 09:58:14 UTC
I tested with 12-stable in poudriere, worked fine...

Ideas:
* Do you have the latest version of protobuf25 installed? In 2.5.0_5, antoine@ fixed it up yesterday and changed the LDCONFIG entry so it gets cached automatically. 

* I've seen this error countless times with rebuilds myself; did you have a clean tree (no work/ dir) before running `make`?

You could try these two things and build again; I used `make clean package` many, many times to get there.
Comment 23 Yuri Victorovich freebsd_committer freebsd_triage 2019-07-28 10:33:36 UTC
(In reply to Johannes Jost Meixner from comment #22)

I just upgraded to protobuf25-2.5.0_5 and my build tree is clean, but it still fails.
Comment 24 Yuri Victorovich freebsd_committer freebsd_triage 2019-07-28 10:51:36 UTC
It also failed in poudriere in check-plist: https://people.freebsd.org/~yuri/hadoop3-3.2.0.log
Comment 25 Johannes Jost Meixner freebsd_committer freebsd_triage 2019-07-28 14:22:26 UTC
(In reply to Yuri Victorovich from comment #24)
Your log contains 581 "Orphaned:" entries.
My pkg-plist is 581 lines long. 

Are you sure you grabbed the latest copy? ;-)
Comment 26 Johannes Jost Meixner freebsd_committer freebsd_triage 2019-08-06 18:58:01 UTC
(In reply to Yuri Victorovich from comment #24)
Were you able to test this with a fully-extracted SHAR including pkg-plist?

Thanks,
Johannes
Comment 27 Mårten Lindblad 2021-02-08 21:28:57 UTC
Same issue here with protobuf:
[WARNING] --- main/native/libhdfspp/lib/proto/CMakeFiles/protoc-gen-hrpc.dir/all ---
[WARNING] /usr/local/bin/ld: CMakeFiles/protoc-gen-hrpc.dir/protoc_gen_hrpc.cc.o: in function `StubGenerator::EmitMethod(google::protobuf::MethodDescriptor const*, google::protobuf::io::Printer*) const':
[WARNING] /root/hej/hadoop3/work/hadoop-3.2.0-src/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/proto/protoc_gen_hrpc.cc:83: undefined reference to `google::protobuf::io::Printer::Print(char const*, char const*, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, char const*, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)'
[WARNING] /usr/local/bin/ld: CMakeFiles/protoc-gen-hrpc.dir/protoc_gen_hrpc.cc.o: in function `StubGenerator::EmitService(google::protobuf::ServiceDescriptor const*, google::protobuf::io::Printer*) const':
[WARNING] /root/hej/hadoop3/work/hadoop-3.2.0-src/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/lib/proto/protoc_gen_hrpc.cc:64: undefined reference to `google::protobuf::io::Printer::Print(char const*, char const*, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)'
[WARNING] collect2: error: ld returned 1 exit status

protobuf version installed via pkg:
Name           : protobuf25
Version        : 2.5.0_5
Comment 28 Mårten Lindblad 2021-02-08 21:34:25 UTC
The version in trunk (3.4-SNAPSHOT) targets an upgrade of protobuf (https://issues.apache.org/jira/browse/HADOOP-13363), which might help with all the protobuf issues.
Comment 29 Mårten Lindblad 2021-02-16 07:54:14 UTC
(In reply to Mårten Lindblad from comment #28)
It worked in a new jail, so I think the errors come from having both protobuf and protobuf25 installed.
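If that theory is right, the conflict is easy to check for, and a pristine build environment sidesteps it entirely; the jail name below is a placeholder:

# list any protobuf ports installed side by side
pkg info -x '^protobuf'
# or test-build in a clean jail
poudriere testport -j 13amd64 -o devel/hadoop3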
Comment 30 Rene Ladan freebsd_committer freebsd_triage 2022-03-31 20:44:23 UTC
Building this with poudriere bulk -t in a 13.0-amd64 jail now.
Comment 31 Rene Ladan freebsd_committer freebsd_triage 2022-03-31 20:44:52 UTC
[00:00:07] Warning: (devel/hadoop3): Error: devel/hadoop3 depends on nonexistent origin 'devel/maven33' (port EXPIRED); Please contact maintainer of the port to fix this.
[00:00:07] Warning: (devel/hadoop3): Error: devel/hadoop3 depends on nonexistent origin 'sysutils/ssid' (moved to sysutils/setsid); Please contact maintainer of the port to fix this.
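Repointing those two dependencies would presumably look something like the lines below; whether devel/maven is an acceptable stand-in for the expired devel/maven33 is an assumption:

# hypothetical replacements in the port Makefile
BUILD_DEPENDS=	mvn:devel/maven
RUN_DEPENDS=	setsid:sysutils/setsid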
Comment 32 Martin Filla 2022-09-18 21:23:29 UTC
Created attachment 236675 [details]
hadoop3v2.shar

Hi,
this is my version of the Hadoop 3.2.0 port.
It fixes the problem with the old Maven version and was tested with poudriere without issues:

=======================<phase: package        >============================
===>  Building package for hadoop3-3.2.0
===========================================================================
=>> Recording filesystem state for preinst... done
=======================<phase: install        >============================
===>  Installing for hadoop3-3.2.0
===>  Checking if hadoop3 is already installed
===>   Registering installation for hadoop3-3.2.0
[freebsd13x64-default] Installing hadoop3-3.2.0...
===> Creating groups.
Creating group 'hadoop' with gid '955'.
===> Creating users
Creating user 'hdfs' with uid '955'.
Creating user 'mapred' with uid '947'.
===> SECURITY REPORT: 
      This port has installed the following files which may act as network
      servers and may therefore pose a remote security risk to the system.
/usr/local/lib/libhadoop.a(DomainSocket.c.o)
/usr/local/lib/libhadoop.so.1.0.0

      If there are vulnerabilities in these programs there may be a security
      risk to the system. FreeBSD makes no guarantee about the security of
      ports included in the Ports Collection. Please type 'make deinstall'
      to deinstall the port if this is a concern.

      For more information, and contact details about the security
      status of this software, see the following webpage: 
http://hadoop.apache.org/
===========================================================================
=>> Checking shared library dependencies
 0x0000000000000001 NEEDED               Shared library: [libc++.so.1]
 0x0000000000000001 NEEDED               Shared library: [libc.so.7]
 0x0000000000000001 NEEDED               Shared library: [libcrypto.so.111]
 0x0000000000000001 NEEDED               Shared library: [libcxxrt.so.1]
 0x0000000000000001 NEEDED               Shared library: [libdl.so.1]
 0x0000000000000001 NEEDED               Shared library: [libexecinfo.so.1]
 0x0000000000000001 NEEDED               Shared library: [libgcc_s.so.1]
 0x0000000000000001 NEEDED               Shared library: [libjvm.so]
 0x0000000000000001 NEEDED               Shared library: [libm.so.5]
 0x0000000000000001 NEEDED               Shared library: [libprotobuf.so.8]
 0x0000000000000001 NEEDED               Shared library: [librt.so.1]
 0x0000000000000001 NEEDED               Shared library: [libsasl2.so.3]
 0x0000000000000001 NEEDED               Shared library: [libsnappy.so.1]
 0x0000000000000001 NEEDED               Shared library: [libssl.so.111]
 0x0000000000000001 NEEDED               Shared library: [libthr.so.3]
 0x0000000000000001 NEEDED               Shared library: [libz.so.6]
=======================<phase: deinstall      >============================
===>  Deinstalling for hadoop3
===>   Deinstalling hadoop3-3.2.0
Updating database digests format: .......... done
Checking integrity... done (0 conflicting)
Deinstallation has been requested for the following 1 packages (of 0 packages in the universe):

Installed packages to be REMOVED:
	hadoop3: 3.2.0

Number of packages to be removed: 1

The operation will free 380 MiB.
[freebsd13x64-default] [1/1] Deinstalling hadoop3-3.2.0...
[freebsd13x64-default] [1/1] Deleting files for hadoop3-3.2.0: .......... done
==> You should manually remove the "hdfs" user. 
==> You should manually remove the "mapred" user. 
==> You should manually remove the "hadoop" group 
===========================================================================
=>> Checking for extra files and directories
[00:21:14] Installing from package
[freebsd13x64-default] Installing hadoop3-3.2.0...
===> Creating groups.
Using existing group 'hadoop'.
===> Creating users
Using existing user 'hdfs'.
Using existing user 'mapred'.
[freebsd13x64-default] Extracting hadoop3-3.2.0: .......... done
[00:21:34] Cleaning up
===>  Cleaning for hadoop3-3.2.0
[00:21:43] Deinstalling package
Updating database digests format: . done
Checking integrity... done (0 conflicting)
Deinstallation has been requested for the following 1 packages (of 0 packages in the universe):

Installed packages to be REMOVED:
	hadoop3: 3.2.0

Number of packages to be removed: 1

The operation will free 380 MiB.
[freebsd13x64-default] [1/1] Deinstalling hadoop3-3.2.0...
[freebsd13x64-default] [1/1] Deleting files for hadoop3-3.2.0: .......... done
==> You should manually remove the "hdfs" user. 
==> You should manually remove the "mapred" user. 
==> You should manually remove the "hadoop" group 
build of devel/hadoop | hadoop3-3.2.0 ended at Sun Sep 18 23:10:47 CEST 2022
build time: 00:21:11
[00:21:44] Logs: /usr/local/poudriere/data/logs/bulk/freebsd13x64-default/2022-09-18_22h49m03s
[00:21:44] Cleaning up
[00:21:44] Unmounting file systems
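The leftover accounts mentioned at deinstall time above can be removed by hand, for example:

# run as root after deinstalling the package
pw userdel hdfs
pw userdel mapred
pw groupdel hadoop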
Comment 33 Martin Filla 2022-09-18 21:31:59 UTC
Created attachment 236676 [details]
hadoop3v2.shar
Comment 34 Martin Filla 2022-09-20 17:02:30 UTC
(In reply to Martin Filla from comment #33)
Review: https://reviews.freebsd.org/D36639
Comment 35 Daniel Engberg freebsd_committer freebsd_triage 2022-09-21 05:23:47 UTC
This needs to be upgraded to at least 3.2.4, as 3.2.0 is already unsupported upstream.

I would, however, suggest updating it to 3.3.4 and submitting patches upstream, as this is going to bitrot quickly in-tree due to the number of patches.
Comment 36 Martin Filla 2022-09-21 19:29:02 UTC
(In reply to Daniel Engberg from comment #35)
Hi,
I tried updating to 3.2.4 and I ran into an issue with the protobuf package:


[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  9.057 s
[INFO] Finished at: 2022-09-21T21:18:33+02:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.xolstice.maven.plugins:protobuf-maven-plugin:0.5.1:compile (src-compile-protoc) on project hadoop-common: Missing:
[ERROR] ----------
[ERROR] 1) com.google.protobuf:protoc:exe:freebsd-x86_64:2.5.0
[ERROR] 
[ERROR]   Try downloading the file manually from the project website.
[ERROR] 
[ERROR]   Then, install it using the command: 
[ERROR]       mvn install:install-file -DgroupId=com.google.protobuf -DartifactId=protoc -Dversion=2.5.0 -Dclassifier=freebsd-x86_64 -Dpackaging=exe -Dfile=/path/to/file
[ERROR] 
[ERROR]   Alternatively, if you host your own repository you can deploy the file there: 
[ERROR]       mvn deploy:deploy-file -DgroupId=com.google.protobuf -DartifactId=protoc -Dversion=2.5.0 -Dclassifier=freebsd-x86_64 -Dpackaging=exe -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]
[ERROR] 
[ERROR]   Path to dependency: 
[ERROR]   	1) org.apache.hadoop:hadoop-common:jar:3.2.4
[ERROR]   	2) com.google.protobuf:protoc:exe:freebsd-x86_64:2.5.0
[ERROR] 
[ERROR] ----------
[ERROR] 1 required artifact is missing.
[ERROR] 
[ERROR] for artifact: 
[ERROR]   org.apache.hadoop:hadoop-common:jar:3.2.4
[ERROR] 
[ERROR] from the specified remote repositories:
[ERROR]   apache.snapshots.https (https://repository.apache.org/content/repositories/snapshots, releases=true, snapshots=true),
[ERROR]   repository.jboss.org (https://repository.jboss.org/nexus/content/groups/public/, releases=true, snapshots=false),
[ERROR]   central (https://repo.maven.apache.org/maven2, releases=true, snapshots=false)
[ERROR] 
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :hadoop-common
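As the error message itself suggests, one possible workaround is to register the ports-built protoc 2.5 binary as that Maven artifact by hand before building; the path to the protoc executable installed by devel/protobuf25 is left as a placeholder because it depends on how that port lays out its files:

# illustrative; substitute the real path to the protobuf25 protoc binary
mvn install:install-file -DgroupId=com.google.protobuf -DartifactId=protoc \
    -Dversion=2.5.0 -Dclassifier=freebsd-x86_64 -Dpackaging=exe \
    -Dfile=/path/to/protoc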
Comment 37 Mark Linimon freebsd_committer freebsd_triage 2023-08-23 05:41:18 UTC
Canonicalize assignment.