Chinaunix

Subject: Failed to start Spark

Author: risepp    Time: 2015-10-16 10:45
Subject: Failed to start Spark
I'm new to Spark. I downloaded the spark-1.5.0-bin-hadoop2.6 binary distribution, and when I start pyspark it fails with the following error:
15/10/15 19:41:31 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
Traceback (most recent call last):
  File "/tmp/spark-1.5.0-bin-hadoop2.6/python/pyspark/shell.py", line 43, in <module>
    sc = SparkContext(pyFiles=add_files)
  File "/tmp/spark-1.5.0-bin-hadoop2.6/python/pyspark/context.py", line 113, in __init__
    conf, jsc, profiler_cls)
  File "/tmp/spark-1.5.0-bin-hadoop2.6/python/pyspark/context.py", line 174, in _do_init
    self._accumulatorServer = accumulators._start_update_server()
  File "/tmp/spark-1.5.0-bin-hadoop2.6/python/pyspark/accumulators.py", line 259, in _start_update_server
    server = AccumulatorServer(("localhost", 0), _UpdateRequestHandler)
  File "/usr/lib64/python2.6/SocketServer.py", line 412, in __init__
    self.server_bind()
  File "/usr/lib64/python2.6/SocketServer.py", line 423, in server_bind
    self.socket.bind(self.server_address)
  File "<string>", line 1, in bind
socket.gaierror: [Errno -2] Name or service not known

Could someone take a look and tell me what's going on? The machine has JDK 1.7 and Python 2.6 installed, and the OS is Red Hat 6.2.
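
For reference, the failure can be reproduced without Spark at all. PySpark's accumulator update server (pyspark/accumulators.py) binds a plain TCP server socket to ("localhost", 0), so a minimal sketch of that same call, assuming only the standard-library socket module and the poster's Python 2.6, fails the same way when "localhost" has no mapping:

import socket

# Mirror the bind performed by PySpark's accumulator server:
# a TCP socket bound to ("localhost", 0), i.e. an ephemeral port.
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# Raises socket.gaierror: [Errno -2] Name or service not known
# when the name "localhost" cannot be resolved on this host.
s.bind(("localhost", 0))
print s.getsockname()   # on a healthy host: ('127.0.0.1', <some port>)
s.close()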
Author: wenhq    Time: 2015-10-29 19:16
Probably a configuration problem...
"Name or service not known"...
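
A quick way to confirm that diagnosis, sketched in the same Python 2.6 the poster is running, is to ask the resolver directly; this is roughly the lookup the failing bind() performs internally:

import socket

try:
    # Resolve the bare name; port None just asks for address records.
    print socket.getaddrinfo("localhost", None)
except socket.gaierror, e:
    print "localhost does not resolve:", e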
Author: risepp    Time: 2015-11-04 23:29
Yes. I added the mapping for localhost back into /etc/hosts and it works now.
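
For anyone hitting the same error: the fix is making sure the name localhost maps to the loopback address. On RHEL 6 the stock /etc/hosts entries look roughly like the following (the exact alias list varies by install):

127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6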