FreeBSD Bugzilla – Attachment 147452 Details for
Bug 193706
[new port] devel/apache-spark: high performance distributed computing system
spark.shar (text/plain), 12.43 KB, created by Radim Kolar on 2014-09-18 16:39:34 UTC
# This is a shell archive.  Save it in a file, remove anything before
# this line, and then unpack it by entering "sh file".  Note, it may
# create directories; files and directories will be owned by you and
# have default permissions.
#
# This archive contains:
#
#	spark
#	spark/Makefile
#	spark/pkg-descr
#	spark/pkg-plist
#	spark/distinfo
#	spark/files
#	spark/files/spark-master.in
#	spark/files/spark-worker.in
#
echo c - spark
mkdir -p spark > /dev/null 2>&1
echo x - spark/Makefile
sed 's/^X//' >spark/Makefile << 'a3d4a3aecd089b8d80746c432208ad00'
X# Created by: Radim Kolar <hsn@sendmail.cz>
X# $FreeBSD$
X
XPORTNAME=	spark
XPORTVERSION=	1.0.2
XCATEGORIES=	devel java
XMASTER_SITES=	APACHE
XMASTER_SITE_SUBDIR=	${PORTNAME}/${PORTNAME}-${PORTVERSION}
XEXTRACT_SUFX=	.tgz
X
XMAINTAINER=	hsn@sendmail.cz
XCOMMENT=	Fast big data processing engine
X
XLICENSE=	APACHE20
X
XBUILD_DEPENDS+=	sbt:${PORTSDIR}/devel/sbt bash:${PORTSDIR}/shells/bash \
X		mvn:${PORTSDIR}/devel/maven3
X
XRUN_DEPENDS+=	bash:${PORTSDIR}/shells/bash
X
XUSES=		python:2
XUSE_RC_SUBR=	spark-master spark-worker
XHADOOP_VERSION=	2.4.1
XUSE_JAVA=	yes
XJAVA_VERSION=	1.7
XJAVA_RUN=	yes
XNO_ARCH=	yes
X
XBIN_LINKS=	spark-submit spark-class spark-shell pyspark spark-example
X
Xpost-patch:
X.for f in ${BIN_LINKS} run-example
X	if [ -f ${WRKSRC}/bin/${f} ]; then \
X		${REINPLACE_CMD} -e 's|$$(cd `dirname $$0`/..; pwd)|$$(cd $$(dirname $$(readlink -f $$0))/..; pwd)|' ${WRKSRC}/bin/${f} ; \
X	fi
X.endfor
X
Xdo-build:
X	cd ${WRKSRC} && SPARK_HADOOP_VERSION=${HADOOP_VERSION} SPARK_YARN=true ./sbt/sbt -mem 1024 assembly
X
Xdo-install:
X	${MKDIR} ${STAGEDIR}${DATADIR}/bin
X.for f in compute-classpath.sh load-spark-env.sh pyspark \
X	spark-class spark-shell spark-submit
X	${INSTALL_SCRIPT} ${WRKSRC}/bin/${f} ${STAGEDIR}${DATADIR}/bin
X.endfor
X	${INSTALL_SCRIPT} ${WRKSRC}/bin/run-example ${STAGEDIR}${DATADIR}/bin/spark-example
X	${MKDIR} ${STAGEDIR}${DATADIR}/conf
X	${INSTALL_DATA} ${WRKSRC}/conf/*.template ${STAGEDIR}${DATADIR}/conf
X	${INSTALL_SCRIPT} ${WRKSRC}/conf/spark-env.sh.template ${STAGEDIR}${DATADIR}/conf
X	${INSTALL_DATA} ${WRKSRC}/conf/slaves ${STAGEDIR}${DATADIR}/conf/slaves.sample
X	${MKDIR} ${STAGEDIR}${DATADIR}/lib
X	${INSTALL_DATA} ${WRKSRC}/lib_managed/jars/datanucleus*.jar ${STAGEDIR}${DATADIR}/lib
X	${INSTALL_DATA} ${WRKSRC}/assembly/target/scala-2.10/*.jar ${STAGEDIR}${DATADIR}/lib
X	${INSTALL_DATA} ${WRKSRC}/examples/target/scala-2.10/*.jar ${STAGEDIR}${DATADIR}/lib
X	${MKDIR} ${STAGEDIR}${DATADIR}/sbin
X	${INSTALL_SCRIPT} ${WRKSRC}/sbin/* ${STAGEDIR}${DATADIR}/sbin
X	${MKDIR} ${STAGEDIR}${DATADIR}/python
X	cd ${WRKSRC}/python && ${COPYTREE_SHARE} . ${STAGEDIR}${DATADIR}/python
X	${INSTALL_SCRIPT} ${WRKSRC}/python/run-tests ${STAGEDIR}${DATADIR}/python
X.for f in CHANGES.txt LICENSE NOTICE README.md
X	${INSTALL_DATA} ${WRKSRC}/${f} ${STAGEDIR}${DATADIR}
X.endfor
X	${INSTALL_DATA} ${WRKSRC}/assembly/src/deb/RELEASE ${STAGEDIR}${DATADIR}
X.for f in ${BIN_LINKS}
X	${LN} -sf ${DATADIR}/bin/${f} ${STAGEDIR}${PREFIX}/bin/${f}
X.endfor
X
X.include <bsd.port.mk>
a3d4a3aecd089b8d80746c432208ad00
echo x - spark/pkg-descr
sed 's/^X//' >spark/pkg-descr << 'e5e87b7be7dec59035073c07328664c0'
XApache Spark is a fast and general engine for large-scale data processing.
X
XSpark runs programs up to 100x faster than Hadoop MapReduce in memory,
Xor 10x faster on disk.
X
XSpark has an advanced DAG execution engine that supports cyclic data
Xflow and in-memory computing.
X
XYou can write applications quickly in Java, Scala or Python.
X
XSpark powers a stack of high-level tools including Spark SQL, MLlib
Xfor machine learning, GraphX, and Spark Streaming. You can combine these
Xframeworks seamlessly in the same application.
X
XIf you have a Hadoop 2 cluster, you can run Spark without any installation
Xneeded. Otherwise, Spark is easy to run standalone or on EC2 or Mesos.
XIt can read from HDFS, HBase, Cassandra, and any Hadoop data source.
X
XWWW: http://spark.apache.org/
e5e87b7be7dec59035073c07328664c0
echo x - spark/pkg-plist
sed 's/^X//' >spark/pkg-plist << 'ec8cb471685918b9ecc905200bf3f199'
Xbin/spark-submit
Xbin/spark-class
Xbin/spark-shell
Xbin/pyspark
Xbin/spark-example
X%%DATADIR%%/CHANGES.txt
X%%DATADIR%%/LICENSE
X%%DATADIR%%/NOTICE
X%%DATADIR%%/README.md
X%%DATADIR%%/RELEASE
X%%DATADIR%%/bin/compute-classpath.sh
X%%DATADIR%%/bin/load-spark-env.sh
X%%DATADIR%%/bin/pyspark
X%%DATADIR%%/bin/spark-example
X%%DATADIR%%/bin/spark-class
X%%DATADIR%%/bin/spark-shell
X%%DATADIR%%/bin/spark-submit
X%%DATADIR%%/conf/fairscheduler.xml.template
X%%DATADIR%%/conf/log4j.properties.template
X%%DATADIR%%/conf/metrics.properties.template
X@sample %%DATADIR%%/conf/slaves.sample
X%%DATADIR%%/conf/spark-defaults.conf.template
X%%DATADIR%%/conf/spark-env.sh.template
X%%DATADIR%%/lib/datanucleus-api-jdo-3.2.1.jar
X%%DATADIR%%/lib/datanucleus-core-3.2.2.jar
X%%DATADIR%%/lib/datanucleus-rdbms-3.2.1.jar
X%%DATADIR%%/lib/spark-assembly-1.0.2-hadoop2.4.1.jar
X%%DATADIR%%/lib/spark-examples-1.0.2-hadoop2.4.1.jar
X%%DATADIR%%/python/.gitignore
X%%DATADIR%%/python/epydoc.conf
X%%DATADIR%%/python/lib/PY4J_LICENSE.txt
X%%DATADIR%%/python/lib/py4j-0.8.1-src.zip
X%%DATADIR%%/python/pyspark/__init__.py
X%%DATADIR%%/python/pyspark/accumulators.py
X%%DATADIR%%/python/pyspark/broadcast.py
X%%DATADIR%%/python/pyspark/cloudpickle.py
X%%DATADIR%%/python/pyspark/conf.py
X%%DATADIR%%/python/pyspark/context.py
X%%DATADIR%%/python/pyspark/daemon.py
X%%DATADIR%%/python/pyspark/files.py
X%%DATADIR%%/python/pyspark/java_gateway.py
X%%DATADIR%%/python/pyspark/join.py
X%%DATADIR%%/python/pyspark/mllib/__init__.py
X%%DATADIR%%/python/pyspark/mllib/_common.py
X%%DATADIR%%/python/pyspark/mllib/classification.py
X%%DATADIR%%/python/pyspark/mllib/clustering.py
X%%DATADIR%%/python/pyspark/mllib/linalg.py
X%%DATADIR%%/python/pyspark/mllib/recommendation.py
X%%DATADIR%%/python/pyspark/mllib/regression.py
X%%DATADIR%%/python/pyspark/mllib/tests.py
X%%DATADIR%%/python/pyspark/mllib/util.py
X%%DATADIR%%/python/pyspark/rdd.py
X%%DATADIR%%/python/pyspark/rddsampler.py
X%%DATADIR%%/python/pyspark/resultiterable.py
X%%DATADIR%%/python/pyspark/serializers.py
X%%DATADIR%%/python/pyspark/shell.py
X%%DATADIR%%/python/pyspark/sql.py
X%%DATADIR%%/python/pyspark/statcounter.py
X%%DATADIR%%/python/pyspark/storagelevel.py
X%%DATADIR%%/python/pyspark/tests.py
X%%DATADIR%%/python/pyspark/worker.py
X%%DATADIR%%/python/run-tests
X%%DATADIR%%/python/test_support/hello.txt
X%%DATADIR%%/python/test_support/userlib-0.1-py2.7.egg
X%%DATADIR%%/python/test_support/userlibrary.py
X%%DATADIR%%/sbin/slaves.sh
X%%DATADIR%%/sbin/spark-config.sh
X%%DATADIR%%/sbin/spark-daemon.sh
X%%DATADIR%%/sbin/spark-daemons.sh
X%%DATADIR%%/sbin/spark-executor
X%%DATADIR%%/sbin/start-all.sh
X%%DATADIR%%/sbin/start-history-server.sh
X%%DATADIR%%/sbin/start-master.sh
X%%DATADIR%%/sbin/start-slave.sh
X%%DATADIR%%/sbin/start-slaves.sh
X%%DATADIR%%/sbin/stop-all.sh
X%%DATADIR%%/sbin/stop-history-server.sh
X%%DATADIR%%/sbin/stop-master.sh
X%%DATADIR%%/sbin/stop-slaves.sh
X@dirrmtry %%DATADIR%%/bin
X@dirrmtry %%DATADIR%%/conf
X@dirrmtry %%DATADIR%%/lib
X@dirrmtry %%DATADIR%%/python/lib
X@dirrmtry %%DATADIR%%/python/pyspark/mllib
X@dirrmtry %%DATADIR%%/python/pyspark
X@dirrmtry %%DATADIR%%/python/test_support
X@dirrmtry %%DATADIR%%/python
X@dirrmtry %%DATADIR%%/sbin
X@dirrmtry %%DATADIR%%
ec8cb471685918b9ecc905200bf3f199
echo x - spark/distinfo
sed 's/^X//' >spark/distinfo << '586e348b9ae23d37d050410e0572ded3'
XSHA256 (spark-1.0.2.tgz) = 1e49ec151d8bf1808c84beca84007dc2a6c5eb17d588e4d813a8c9eea95b41d1
XSIZE (spark-1.0.2.tgz) = 9060947
586e348b9ae23d37d050410e0572ded3
echo c - spark/files
mkdir -p spark/files > /dev/null 2>&1
echo x - spark/files/spark-master.in
sed 's/^X//' >spark/files/spark-master.in << '8b6fbd1d16a6c3896e0da3b30529d9c6'
X#!/bin/sh
X#
X# Copyright (c) 2014, Radim Kolar
X# All rights reserved.
X#
X# Redistribution and use in source and binary forms, with or without
X# modification, are permitted provided that the following conditions are met:
X#
X# * Redistributions of source code must retain the above copyright notice,
X#   this list of conditions and the following disclaimer.
X# * Redistributions in binary form must reproduce the above copyright
X#   notice, this list of conditions and the following disclaimer in the
X#   documentation and/or other materials provided with the distribution.
X#
X# THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY
X# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
X# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
X# DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY
X# DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
X# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
X# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
X# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
X# LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
X# OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH
X# DAMAGE.
X
X# $FreeBSD$
X#
X# PROVIDE: spark_master
X# REQUIRE: LOGIN
X# KEYWORD: shutdown
X#
X# Spark resource manager for standalone cluster
X
X. /etc/rc.subr
X
Xname=spark_master
Xrcvar=spark_master_enable
Xload_rc_config $name
X
X: ${spark_master_enable:=NO}
X: ${spark_master_port:=7077}
X: ${spark_master_ui_port:=8080}
X
Xexport SPARK_PID_DIR=/var/run
Xexport SPARK_LOG_DIR=/var/log/spark
Xexport SPARK_MASTER_PORT=${spark_master_port}
Xexport SPARK_MASTER_WEBUI_PORT=${spark_master_ui_port}
Xexport SPARK_IDENT_STRING=root
X
Xpidfile=/var/run/spark-root-org.apache.spark.deploy.master.Master-1.pid
Xstart_cmd=%%DATADIR%%/sbin/start-master.sh
Xstop_cmd=%%DATADIR%%/sbin/stop-master.sh
X
Xexport PATH=$PATH:%%LOCALBASE%%/bin
Xexport JAVA_VENDOR=openjdk
Xexport JAVA_VERSION=1.7
X
Xrun_rc_command "$1"
8b6fbd1d16a6c3896e0da3b30529d9c6
echo x - spark/files/spark-worker.in
sed 's/^X//' >spark/files/spark-worker.in << 'fa3e86b85a25554860531ee24459637c'
X#!/bin/sh
X#
X# Copyright (c) 2014, Radim Kolar
X# All rights reserved.
X#
X# Redistribution and use in source and binary forms, with or without
X# modification, are permitted provided that the following conditions are met:
X#
X# * Redistributions of source code must retain the above copyright notice,
X#   this list of conditions and the following disclaimer.
X# * Redistributions in binary form must reproduce the above copyright
X#   notice, this list of conditions and the following disclaimer in the
X#   documentation and/or other materials provided with the distribution.
X#
X# THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY
X# EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
X# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
X# DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY
X# DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
X# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
X# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
X# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
X# LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
X# OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH
X# DAMAGE.
X
X# $FreeBSD$
X#
X# PROVIDE: spark_worker
X# REQUIRE: LOGIN
X# KEYWORD: shutdown
X#
X# Spark worker on client node
X# optional config: spark_worker_cores, spark_worker_memory
X
X. /etc/rc.subr
X
Xname=spark_worker
Xrcvar=spark_worker_enable
Xload_rc_config $name
X
X: ${spark_worker_enable:=NO}
X: ${spark_worker_master:="spark://`hostname`:7077"}
X: ${spark_worker_ui_port:=8081}
X: ${spark_worker_wrkdir:="/tmp/spark/worker"}
X
Xexport SPARK_PID_DIR=/var/run
Xexport SPARK_LOG_DIR=/var/log/spark
X
Xpidfile=/var/run/spark-root-org.apache.spark.deploy.worker.Worker-1.pid
Xstart_cmd=worker_start
Xstop_cmd="%%DATADIR%%/sbin/spark-daemon.sh stop org.apache.spark.deploy.worker.Worker 1"
X
Xexport PATH=$PATH:%%LOCALBASE%%/bin
Xexport JAVA_VENDOR=openjdk
Xexport JAVA_VERSION=1.7
X
Xworker_start ()
X{
X	cmd="/usr/local/share/spark/sbin/start-slave.sh"
X	args="-d ${spark_worker_wrkdir} --webui-port ${spark_worker_ui_port}"
X	if [ "$spark_worker_cores" != "" ]; then
X		args="$args -c $spark_worker_cores"
X	fi
X
X	if [ "$spark_worker_memory" != "" ]; then
X		args="$args -m $spark_worker_memory"
X	fi
X
X	$cmd 1 $args ${spark_worker_master}
X}
X
Xrun_rc_command "$1"
fa3e86b85a25554860531ee24459637c
exit