
Spark 2.0.1 & 2.1.1 fail on Hadoop-3.0.0-alpha2


yncxcw
Hi all,

I just upgraded my Hadoop from 2.7.3 to 3.0.0-alpha2. My Spark version is 2.0.1. It works well on Hadoop 2.7.3, but on 3.0.0-alpha2 the driver log shows this error:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/extend1/yarn-temp/nm-local-dir/usercache/admin/filecache/10/__spark_libs__3353962889587701453.zip/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/admin/hadoop-2.7.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/05/09 09:05:57 INFO util.SignalUtils: Registered signal handler for TERM
17/05/09 09:05:57 INFO util.SignalUtils: Registered signal handler for HUP
17/05/09 09:05:57 INFO util.SignalUtils: Registered signal handler for INT
17/05/09 09:05:57 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.io.IOException: Exception reading /extend1/yarn-temp/nm-local-dir/usercache/admin/appcache/application_1494291953232_0001/container_1494291953232_0001_02_000001/container_tokens
at org.apache.hadoop.security.Credentials.readTokenStorageFile(Credentials.java:198)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:816)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:760)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:633)
at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:65)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:764)
at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:787)
at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.io.IOException: Unknown version 1 in token storage.
at org.apache.hadoop.security.Credentials.readTokenStorageStream(Credentials.java:216)
at org.apache.hadoop.security.Credentials.readTokenStorageFile(Credentials.java:195)
... 7 more
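
For what it's worth, the version check that throws here lives in org.apache.hadoop.security.Credentials (see readTokenStorageStream in the trace above), so a likely suspect is that the Hadoop 2.x client classes shipped inside __spark_libs__ are trying to read a container_tokens file that the Hadoop 3.0.0-alpha2 NodeManager wrote in its newer token-storage format. Below is a minimal sketch to test that theory, assuming you copy the container_tokens file somewhere readable and run the class once against the 2.7.x jars and once against the 3.0.0-alpha2 jars (the class name and argument handling are just placeholders):

import java.io.DataInputStream;
import java.io.FileInputStream;

import org.apache.hadoop.security.Credentials;

// Hypothetical debugging helper: reads a YARN container_tokens file the same
// way the Spark ApplicationMaster ends up doing via Credentials.
public class ReadContainerTokens {
    public static void main(String[] args) throws Exception {
        String path = args[0]; // e.g. a copy of the container_tokens file above
        Credentials creds = new Credentials();
        try (DataInputStream in = new DataInputStream(new FileInputStream(path))) {
            // With Hadoop 2.x jars on the classpath this is expected to throw
            // "Unknown version 1 in token storage." for a file written by a
            // 3.0.0-alpha2 NodeManager; with 3.x jars it should parse cleanly.
            creds.readTokenStorageStream(in);
        }
        System.out.println("Read " + creds.numberOfTokens() + " token(s)");
    }
}

If the read fails with the 2.7.x jars but succeeds with the 3.0.0-alpha2 jars, that would point at the client-side Hadoop version Spark bundles rather than at your data or cluster configuration.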


PS: I have tried many times and also re-generated the data, but the error is still there. Has anyone seen this error before?


Wei