[Spark] spark run-example finishes but produces no result

Posted on 2015-06-18 17:06
After running /usr/local/spark/bin/run-example org.apache.spark.examples.SparkPi local, no result is returned:

$ /usr/local/spark/bin/run-example org.apache.spark.examples.SparkPi local
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/06/18 16:39:04 INFO SparkContext: Running Spark version 1.4.0
15/06/18 16:39:05 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/06/18 16:39:05 INFO SecurityManager: Changing view acls to: hadoop
15/06/18 16:39:05 INFO SecurityManager: Changing modify acls to: hadoop
15/06/18 16:39:05 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
15/06/18 16:39:05 INFO Slf4jLogger: Slf4jLogger started
15/06/18 16:39:05 INFO Remoting: Starting remoting
15/06/18 16:39:06 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.7.244:36288]
15/06/18 16:39:06 INFO Utils: Successfully started service 'sparkDriver' on port 36288.
15/06/18 16:39:06 INFO SparkEnv: Registering MapOutputTracker
15/06/18 16:39:06 INFO SparkEnv: Registering BlockManagerMaster
15/06/18 16:39:06 INFO DiskBlockManager: Created local directory at /tmp/spark-c7749c63-d53e-4bd8-a377-ddadbfdeeed3/blockmgr-851c4a8c-8f89-4c22-b81f-9fe095e941d8
15/06/18 16:39:06 INFO MemoryStore: MemoryStore started with capacity 265.1 MB
15/06/18 16:39:06 INFO HttpFileServer: HTTP File server directory is /tmp/spark-c7749c63-d53e-4bd8-a377-ddadbfdeeed3/httpd-073fa9e8-2646-4fc9-a8d7-9dcb1511c029
15/06/18 16:39:06 INFO HttpServer: Starting HTTP Server
15/06/18 16:39:06 INFO Utils: Successfully started service 'HTTP file server' on port 35185.
15/06/18 16:39:06 INFO SparkEnv: Registering OutputCommitCoordinator
15/06/18 16:39:06 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/06/18 16:39:06 INFO SparkUI: Started SparkUI at http://192.168.7.244:4040
15/06/18 16:39:06 INFO SparkContext: Added JAR file:/usr/local/spark/lib/spark-examples-1.4.0-hadoop2.6.0.jar at http://192.168.7.244:35185/jars/ ... 4.0-hadoop2.6.0.jar with timestamp 1434616746886
15/06/18 16:39:06 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@192.168.7.244:7077/user/Master...
15/06/18 16:39:07 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20150618163907-0005
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor added: app-20150618163907-0005/0 on worker-20150618145843-192.168.7.247-54347 (192.168.7.247:54347) with 2 cores
15/06/18 16:39:07 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150618163907-0005/0 on hostPort 192.168.7.247:54347 with 2 cores, 512.0 MB RAM
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor added: app-20150618163907-0005/1 on worker-20150618145843-192.168.7.237-55205 (192.168.7.237:55205) with 2 cores
15/06/18 16:39:07 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150618163907-0005/1 on hostPort 192.168.7.237:55205 with 2 cores, 512.0 MB RAM
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor added: app-20150618163907-0005/2 on worker-20150618145840-192.168.7.246-40048 (192.168.7.246:40048) with 2 cores
15/06/18 16:39:07 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150618163907-0005/2 on hostPort 192.168.7.246:40048 with 2 cores, 512.0 MB RAM
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor added: app-20150618163907-0005/3 on worker-20150618145840-192.168.7.232-52496 (192.168.7.232:52496) with 2 cores
15/06/18 16:39:07 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150618163907-0005/3 on hostPort 192.168.7.232:52496 with 2 cores, 512.0 MB RAM
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor updated: app-20150618163907-0005/2 is now LOADING
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor updated: app-20150618163907-0005/3 is now LOADING
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor updated: app-20150618163907-0005/0 is now LOADING
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor updated: app-20150618163907-0005/1 is now LOADING
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor updated: app-20150618163907-0005/0 is now RUNNING
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor updated: app-20150618163907-0005/1 is now RUNNING
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor updated: app-20150618163907-0005/2 is now RUNNING
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor updated: app-20150618163907-0005/3 is now RUNNING
15/06/18 16:39:07 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44921.
15/06/18 16:39:07 INFO NettyBlockTransferService: Server created on 44921
15/06/18 16:39:07 INFO BlockManagerMaster: Trying to register BlockManager
15/06/18 16:39:07 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.7.244:44921 with 265.1 MB RAM, BlockManagerId(driver, 192.168.7.244, 44921)
15/06/18 16:39:07 INFO BlockManagerMaster: Registered BlockManager
15/06/18 16:39:07 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
Exception in thread "main" java.lang.NumberFormatException: For input string: "local"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
        at java.lang.Integer.parseInt(Integer.java:580)
        at java.lang.Integer.parseInt(Integer.java:615)
        at scala.collection.immutable.StringLike$class.toInt(StringLike.scala:229)
        at scala.collection.immutable.StringOps.toInt(StringOps.scala:31)
        at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:29)
        at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/06/18 16:39:07 INFO SparkContext: Invoking stop() from shutdown hook
15/06/18 16:39:07 INFO SparkUI: Stopped Spark web UI at http://192.168.7.244:4040
15/06/18 16:39:07 INFO DAGScheduler: Stopping DAGScheduler
15/06/18 16:39:07 INFO SparkDeploySchedulerBackend: Shutting down all executors
15/06/18 16:39:07 INFO SparkDeploySchedulerBackend: Asking each executor to shut down
15/06/18 16:39:07 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
15/06/18 16:39:07 INFO Utils: path = /tmp/spark-c7749c63-d53e-4bd8-a377-ddadbfdeeed3/blockmgr-851c4a8c-8f89-4c22-b81f-9fe095e941d8, already present as root for deletion.
15/06/18 16:39:07 INFO MemoryStore: MemoryStore cleared
15/06/18 16:39:07 INFO BlockManager: BlockManager stopped
15/06/18 16:39:07 INFO BlockManagerMaster: BlockManagerMaster stopped
15/06/18 16:39:07 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
15/06/18 16:39:07 INFO SparkContext: Successfully stopped SparkContext
15/06/18 16:39:07 INFO Utils: Shutdown hook called
15/06/18 16:39:07 INFO Utils: Deleting directory /tmp/spark-c7749c63-d53e-4bd8-a377-ddadbfdeeed3

Shouldn't it end by printing something like "Pi is roughly 3.1444"?
I'm just getting started with Spark and don't understand what is happening here.
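The stack trace points at SparkPi.scala:29, where the example parses its first command-line argument as an integer number of slices. In Spark 1.x, run-example no longer takes a master URL as a positional argument (the master is set separately, e.g. via --master), so the string "local" reaches Integer.parseInt and fails. A minimal sketch of that failure mode, with a hypothetical parseSlices helper standing in for the example's argument handling (not Spark's actual code):

```java
public class SliceArgDemo {
    // Mimics SparkPi's argument handling: the first argument, if present,
    // is parsed as the number of slices; otherwise a default of 2 is used.
    static int parseSlices(String[] args) {
        return args.length > 0 ? Integer.parseInt(args[0]) : 2;
    }

    public static void main(String[] args) {
        try {
            parseSlices(new String[] {"local"}); // what the original command effectively did
        } catch (NumberFormatException e) {
            // Same failure as in the log: For input string: "local"
            System.out.println("NumberFormatException: " + e.getMessage());
        }
        // A numeric argument parses fine.
        System.out.println("slices=" + parseSlices(new String[] {"10"}));
    }
}
```

So dropping the `local` argument, or replacing it with a slice count (for example `bin/run-example SparkPi 10`), should let the example run to completion and print the "Pi is roughly ..." line; to run against a local master, set it separately (e.g. `--master local`) rather than passing it to the example.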