I am running my Spark application with spark-submit in yarn-cluster mode, but it exits with the following exception (full log obtained with yarn logs).
What could be the problem?
LogType:stderr
Log Upload Time:Fri Jan 12 11:29:24 +0100 2018
LogLength:1835
Log Contents:
Exception in thread "main" java.lang.UnsupportedClassVersionError: it/polito/bigdata/spark/example/SparkDriver : Unsupported major.minor version 52.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:546)
    at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:335)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:197)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:680)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:69)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:68)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
    at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:68)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:678)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)

LogType:stdout
Log Upload Time:Fri Jan 12 11:29:24 +0100 2018
LogLength:0
Log Contents:
Thanks in advance.
Answer by Jay Kumar SenSharma · Jan 12 at 10:42 AM
The following error indicates that you are running an old JDK: class file major version 52.0 corresponds to Java 8, so your JAR was compiled with JDK 1.8 but the JVM running it on the cluster is older (JDK 1.7 supports at most version 51.0).
Exception in thread "main" java.lang.UnsupportedClassVersionError: it/polito/bigdata/spark/example/SparkDriver : Unsupported major.minor version 52.0
Please check which JDK your JAVA_HOME variable is pointing to. It is better to set JAVA_HOME explicitly to JDK 1.8, as JDK 1.7 reached End of Life long ago.
From HDP 2.6.3 onwards, JDK 1.8 is mandatory: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.3/bk_support-matrices/content/ch_matrices-hdp.html
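For example, you can check which JDK will be picked up on the node and point both the ApplicationMaster and the executors at a JDK 1.8 installation when you submit. This is only a sketch: the JDK path and the JAR name below are placeholders, adjust them to your cluster.

# Check which JDK the shell (and hence spark-submit) will pick up
echo $JAVA_HOME
java -version

# Submit with an explicit JDK 1.8 for the driver (ApplicationMaster) and the executors
export JAVA_HOME=/usr/jdk64/jdk1.8.0_112        # placeholder path
spark-submit \
  --master yarn --deploy-mode cluster \
  --conf spark.yarn.appMasterEnv.JAVA_HOME=/usr/jdk64/jdk1.8.0_112 \
  --conf spark.executorEnv.JAVA_HOME=/usr/jdk64/jdk1.8.0_112 \
  --class it.polito.bigdata.spark.example.SparkDriver \
  your-application.jar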
Answer by Cristiano Cavo · Jan 12 at 10:51 AM
I checked the Java version on the server:
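For example (the version string below is hypothetical; the key point is that a runtime older than 1.8 cannot load version 52.0 class files):

java -version
# java version "1.7.0_xx"   (hypothetical) - a pre-Java-8 JVM cannot load classes compiled for Java 8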
And then I changed the Java version used to generate the JAR in my IDE:
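In the IDE this is the project's compiler compliance/target level; the command-line equivalent with a Maven build (assuming Maven, your build tool may differ) is to compile the JAR for the Java version that is actually installed on the server:

# Target Java 7 bytecode (major version 51.0) so the cluster JVM can load the classes
mvn clean package -Dmaven.compiler.source=1.7 -Dmaven.compiler.target=1.7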
I launched the new JAR without any errors.
It's ok now, thank you!