Java - Can't run Spark
A few days ago I extracted Spark on my machine (Ubuntu) and made a test run; everything seemed fine. Today, after what I think was a change to my Java paths, Spark won't start.
Instead I get the following error message:
    user@user:~/software/spark-1.1.0-bin-hadoop2.4$ ./bin/pyspark
    Python 2.7.8 (default, Oct 20 2014, 15:05:19)
    [GCC 4.9.1] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    /home/user/software/spark-1.1.0-bin-hadoop2.4/bin/spark-class: line 180: /usr/lib/jvm/java-7-sun/bin/bin/java: No such file or directory
    Traceback (most recent call last):
      File "/home/user/software/spark-1.1.0-bin-hadoop2.4/python/pyspark/shell.py", line 44, in <module>
        sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
      File "/home/user/software/spark-1.1.0-bin-hadoop2.4/python/pyspark/context.py", line 104, in __init__
        SparkContext._ensure_initialized(self, gateway=gateway)
      File "/home/user/software/spark-1.1.0-bin-hadoop2.4/python/pyspark/context.py", line 211, in _ensure_initialized
        SparkContext._gateway = gateway or launch_gateway()
      File "/home/user/software/spark-1.1.0-bin-hadoop2.4/python/pyspark/java_gateway.py", line 71, in launch_gateway
        raise Exception(error_msg)
    Exception: Launching GatewayServer failed with exit code 127!
    Warning: Expected GatewayServer to output a port, found no output.
Running Java programs from Eclipse still works, though.
EDIT:

    which java: /usr/bin/java
    javac -version: javac 1.7.0_65
    echo $JAVA_HOME: /usr/lib/jvm/java-7-sun/bin
Your error message includes the path /usr/lib/jvm/java-7-sun/bin/bin/java. Notice the duplicated bin fragment: bin must not be part of JAVA_HOME, so set JAVA_HOME to /usr/lib/jvm/java-7-sun/ instead.
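A minimal sketch of what goes wrong and how the fix works (the launcher line is paraphrased from the error output above, not quoted verbatim from the spark-class script):

```shell
# Spark's bin/spark-class launches the JVM via roughly "$JAVA_HOME/bin/java",
# so a JAVA_HOME that already ends in /bin produces the broken
# .../bin/bin/java path from the error (exit code 127 = command not found).
JAVA_HOME=/usr/lib/jvm/java-7-sun/bin   # the broken value from the question
echo "$JAVA_HOME/bin/java"              # /usr/lib/jvm/java-7-sun/bin/bin/java

# Fix: strip the trailing /bin so JAVA_HOME points at the JDK root.
JAVA_HOME="${JAVA_HOME%/bin}"
export JAVA_HOME
echo "$JAVA_HOME/bin/java"              # /usr/lib/jvm/java-7-sun/bin/java
```

Put the corrected export in your shell profile (or in Spark's conf/spark-env.sh) so it survives new sessions.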