I use FreeBSD 10.1-BETA1 #0 r271797 (amd64). When compiling Spark (http://spark.apache.org) source code with openjdk7, it dumps core.

How to reproduce:

1. Install the latest openjdk7 from ports.
2. Install maven (3.0.5) from ports.
3. Get the Spark sources from git:

       git clone -b branch-1.1 https://github.com/apache/spark.git

4. Make sure you do not have a custom MAVEN_OPTS environment variable set.
5. Build:

       cd spark
       mvn clean package -Dhadoop.version=2.4.1 -Pyarn -Phadoop-2.4 -Phadoop-provided -DskipTests

Shortly you will get the following output:

[INFO] compiler plugin: BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)
[INFO] Compiling 109 Scala sources and 3 Java sources to /place/vartmp/spark/core/target/scala-2.10/test-classes...
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x0000000800da9984, pid=87156, tid=34384930816
#
# JRE version: OpenJDK Runtime Environment (7.0-b17) (build 1.7.0_65-b17)
# Java VM: OpenJDK 64-Bit Server VM (24.65-b04 mixed mode bsd-amd64 compressed oops)
# Problematic frame:
# C  [libc.so.7+0x145984]  unsigned long+0x44
#
# Core dump written. Default location: /cores/core or core.87156
#
# An error report file with more information is saved as:
# /place/vartmp/spark/hs_err_pid87156.log
#
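If it helps, here is a sketch of how I can inspect the core; the path to the java binary is an assumption (it is wherever the ports openjdk7 installs it, /usr/local/openjdk7/bin/java on my machine):

    # Load the core against the JVM binary that produced it (paths assumed).
    gdb /usr/local/openjdk7/bin/java /cores/core
    # At the (gdb) prompt, `bt` prints the native stack; the problematic
    # frame above resolves into libc.so.7.

The hs_err_pid87156.log mentioned above is also available on request.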
For Java 6 and 7, please try increasing PermGen, i.e.:

    env MAVEN_OPTS="-XX:MaxPermSize=128m" mvn clean package \
        -Dhadoop.version=2.4.1 -Pyarn -Phadoop-2.4 -Phadoop-provided -DskipTests
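If you do not want to pass that on every invocation, a sketch of one way to make it persistent; this relies on the Unix mvn launcher sourcing ~/.mavenrc, which I believe holds for the ports Maven 3.0.5:

    # Append the setting so every mvn run for this user picks it up.
    echo 'MAVEN_OPTS="-XX:MaxPermSize=128m"' >> ~/.mavenrc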
(In reply to Jung-uk Kim from comment #1)
> For Java 6 and 7, please try increasing PermGen, i.e.:
>
>     env MAVEN_OPTS="-XX:MaxPermSize=128m" mvn clean package \
>         -Dhadoop.version=2.4.1 -Pyarn -Phadoop-2.4 -Phadoop-provided -DskipTests

I know that helps, but isn't a core dump unexpected behavior without these options? The documentation excerpt recommending these options says that without them you should get an error, not a core dump.
(In reply to Dmitry Sivachenko from comment #2)
> (In reply to Jung-uk Kim from comment #1)
> > For Java 6 and 7, please try increasing PermGen, i.e.:
> >
> >     env MAVEN_OPTS="-XX:MaxPermSize=128m" mvn clean package \
> >         -Dhadoop.version=2.4.1 -Pyarn -Phadoop-2.4 -Phadoop-provided -DskipTests
>
> I know that helps, but isn't a core dump unexpected behavior without these
> options?
>
> The documentation excerpt recommending these options says that without
> them you should get an error, not a core dump.

Correct. In fact, I got the errors (not core dumps) with Java 6 and 7, which is why I recommended the workaround. Basically, I was not able to reproduce the problem on head.
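For comparison, on a working JVM you can provoke the documented failure on purpose by shrinking PermGen instead of growing it; a minimal sketch (the 16m value is just an illustration, not a tested threshold):

    # On a healthy JVM, exhausting PermGen during the Scala compile should
    # end with "java.lang.OutOfMemoryError: PermGen space", not SIGSEGV.
    env MAVEN_OPTS="-XX:MaxPermSize=16m" mvn clean package \
        -Dhadoop.version=2.4.1 -Pyarn -Phadoop-2.4 -Phadoop-provided -DskipTests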
Hmm... with a fresh git clone I always get a core dump on stable/10...
Well, I recompiled openjdk from ports again and now it does not dump core but instead prints the expected error. Before that I had the very same version of openjdk installed (openjdk-7.65.17,1), but it was compiled under an older snapshot of stable/10. Should I recompile openjdk after each OS update?
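For the record, a sketch of the rebuild that fixed it for me, assuming the ports tree is at /usr/ports:

    # Rebuild and reinstall openjdk7 against the current stable/10 world.
    cd /usr/ports/java/openjdk7
    make deinstall reinstall clean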
You generally shouldn't need to recompile openjdk after each OS update. It looks like there was an associated problem in this case, but that is rare. I'm going to close this, though, since it doesn't seem to be a continuing problem and there is little that can be done on the openjdk side to prevent it.