Environment variable is configured, but the error persists: Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:467)
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:438)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:515)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1712)
at org.apache.hadoop.security.SecurityUtil.setConfigurationInternal(SecurityUtil.java:99)
at org.apache.hadoop.security.SecurityUtil.<clinit>(SecurityUtil.java:88)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:312)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:300)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:575)
at org.apache.spark.util.Utils$.$anonfun$getCurrentUserName$1(Utils.scala:2561)
at scala.Option.getOrElse(Option.scala:201)
at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2561)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:316)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:157)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:170)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:64)
at com.example.demo.componet.spark.sparkConnect.<init>(sparkConnect.java:15)
at com.example.demo.componet.spark.sparkConnect.main(sparkConnect.java:34)
18:11:09.346 [main] WARN org.apache.hadoop.util.Shell - Did not find winutils.exe: {}
java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.fileNotFoundException(Shell.java:547)
at org.apache.hadoop.util.Shell.getHadoopHomeDir(Shell.java:568)
at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:591)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:688)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1712)
at org.apache.hadoop.security.SecurityUtil.setConfigurationInternal(SecurityUtil.java:99)
at org.apache.hadoop.security.SecurityUtil.<clinit>(SecurityUtil.java:88)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:312)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:300)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:575)
at org.apache.spark.util.Utils$.$anonfun$getCurrentUserName$1(Utils.scala:2561)
at scala.Option.getOrElse(Option.scala:201)
at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2561)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:316)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:157)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:170)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:64)
at com.example.demo.componet.spark.sparkConnect.<init>(sparkConnect.java:15)
at com.example.demo.componet.spark.sparkConnect.main(sparkConnect.java:34)
Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:467)
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:438)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:515)
... 16 common frames omitted
18:11:09.348 [main] DEBUG org.apache.hadoop.util.Shell - Failed to find winutils.exe
Java can call Spark through the org.apache.spark.api.java interface that Spark provides, but this works only in a plain project. For example, the earlier article "Configuring Hadoop on Windows and connecting to local and remote Spark from IDEA" showed how to use Spark in an ordinary Maven project. When the same project is ported to Spring Boot, however, it stops working and keeps reporting the error in the title:
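The stack trace above fails inside Shell.<clinit>, i.e. Hadoop resolves the home directory once, in a static initializer, checking the hadoop.home.dir system property first and then falling back to the HADOOP_HOME environment variable. A minimal sketch of that lookup order (a hypothetical helper, not the actual Hadoop source):

```java
// Hypothetical helper mirroring the lookup order Hadoop's Shell class
// performs in its static initializer (not the actual Hadoop source).
public class HadoopHomeLookup {
    static String resolveHadoopHome() {
        // The hadoop.home.dir system property takes precedence...
        String home = System.getProperty("hadoop.home.dir");
        if (home == null) {
            // ...falling back to the HADOOP_HOME environment variable.
            home = System.getenv("HADOOP_HOME");
        }
        // null here is what produces the FileNotFoundException in the log.
        return home;
    }

    public static void main(String[] args) {
        System.setProperty("hadoop.home.dir", "D:\\hadoop");
        System.out.println(resolveHadoopHome()); // prints D:\hadoop
    }
}
```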
[org.apache.hadoop.util.Shell] - Failed to detect a valid hadoop home directory
java.io.IOException : HADOOP_HOME or hadoop.home.dir are not set.
The environment variable may well be configured, yet the path is still not found; one common reason is that a JVM (or an IDE) launched before the variable was set never sees the new value. As a workaround, set the path manually in code:
System.setProperty("hadoop.home.dir","D:\\SoftWares\\Apache\\spark-3.3.1-bin-hadoop3");
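The catch is ordering: as the Shell.<clinit> frames in the trace show, the value is read only once, when the class is first loaded. In a Spring Boot application, where framework code may touch Spark or Hadoop classes early, the property should therefore be set as the very first statement of main. The class-initialization behaviour can be sketched with plain JDK code (ShellStandIn is a hypothetical stand-in for org.apache.hadoop.util.Shell):

```java
// Demonstrates why the property must be set BEFORE the class that reads
// it is first loaded: a static initializer runs exactly once.
public class InitOrderDemo {
    // Hypothetical stand-in for org.apache.hadoop.util.Shell.
    static class ShellStandIn {
        static final String HOME = System.getProperty("hadoop.home.dir");
    }

    public static void main(String[] args) {
        // Set the property first...
        System.setProperty("hadoop.home.dir", "D:\\hadoop");
        // ...so the static initializer sees it on first class load.
        System.out.println(ShellStandIn.HOME); // prints D:\hadoop
        // Reversing the two statements would leave HOME null for the
        // lifetime of the JVM, no matter what the property is set to later.
    }
}
```

For the Spring Boot case this means placing the System.setProperty call before SpringApplication.run(...), not inside a bean that may only be constructed after Spark has already been initialized.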
. . .