

java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.

Posted on 2/26/2019 1:37:10 PM
java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
        at org.apache.hadoop.util.Shell.fileNotFoundException(Shell.java:528) ~[hadoop-common-2.8.4.jar:na]
        at org.apache.hadoop.util.Shell.getHadoopHomeDir(Shell.java:549) ~[hadoop-common-2.8.4.jar:na]
        at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:572) ~[hadoop-common-2.8.4.jar:na]
        at org.apache.hadoop.util.Shell.<clinit>(Shell.java:669) ~[hadoop-common-2.8.4.jar:na]
        at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79) [hadoop-common-2.8.4.jar:na]
        at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1555) [hadoop-common-2.8.4.jar:na]
        at org.apache.hadoop.hbase.HBaseConfiguration.checkDefaultsVersion(HBaseConfiguration.java:66) [hbase-common-2.0.0.jar:2.0.0]
        at org.apache.hadoop.hbase.HBaseConfiguration.addHbaseResources(HBaseConfiguration.java:80) [hbase-common-2.0.0.jar:2.0.0]
        at org.apache.hadoop.hbase.HBaseConfiguration.create(HBaseConfiguration.java:94) [hbase-common-2.0.0.jar:2.0.0]
        at org.apache.phoenix.query.ConfigurationFactory$ConfigurationFactoryImpl$1.call(ConfigurationFactory.java:49) [phoenix-core-5.0.0-HBase-2.0.jar:5.0.0-HBase-2.0]
        at org.apache.phoenix.query.ConfigurationFactory$ConfigurationFactoryImpl$1.call(ConfigurationFactory.java:46) [phoenix-core-5.0.0-HBase-2.0.jar:5.0.0-HBase-2.0]
        at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76) [phoenix-core-5.0.0-HBase-2.0.jar:5.0.0-HBase-2.0]
        at org.apache.phoenix.util.PhoenixContextExecutor.callWithoutPropagation(PhoenixContextExecutor.java:91) [phoenix-core-5.0.0-HBase-2.0.jar:5.0.0-HBase-2.0]
        at org.apache.phoenix.query.ConfigurationFactory$ConfigurationFactoryImpl.getConfiguration(ConfigurationFactory.java:46) [phoenix-core-5.0.0-HBase-2.0.jar:5.0.0-HBase-2.0]
        at org.apache.phoenix.jdbc.PhoenixDriver.initializeConnectionCache(PhoenixDriver.java:151) [phoenix-core-5.0.0-HBase-2.0.jar:5.0.0-HBase-2.0]
        at org.apache.phoenix.jdbc.PhoenixDriver.<init>(PhoenixDriver.java:143) [phoenix-core-5.0.0-HBase-2.0.jar:5.0.0-HBase-2.0]


The server Hadoop version is as follows:

[root@master ~]# hadoop version
Hadoop 2.8.3
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r b3fe56402d908019d99af1f1f4fc65cb1d1436a2
Compiled by jdu on 2017-12-05T03:43Z
Compiled with protoc 2.5.0
From source with checksum 9ff4856d824e983fa510d3f843e3f19d
This command was run using /home/dzkj/apache/hadoop-2.8.3/share/hadoop/common/hadoop-common-2.8.3.jar


I had always assumed that my local IDEA project simply calls the remote Hadoop cluster, so I would not need to install Hadoop on my local Windows machine. So when I saw HADOOP_HOME in the error, I was confused: do I still need to install Hadoop locally?

Answer: You don't need to install Hadoop locally, but you do need to set the %HADOOP_HOME% environment variable (or the hadoop.home.dir system property that the exception message also names) and provide winutils.exe.
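If you cannot (or prefer not to) change environment variables, a minimal in-code sketch of the hadoop.home.dir alternative looks like the following. The path C:\hadoop-2.8.3 is only an example; use wherever you unpack winutils.

// A minimal sketch, assuming winutils was unpacked to C:\hadoop-2.8.3.
// The property must be set before any org.apache.hadoop class loads,
// because Shell reads it in a static initializer (see the <clinit>
// frames in the stack trace above).
public class HadoopHomeWorkaround {
    public static void main(String[] args) {
        System.setProperty("hadoop.home.dir", "C:\\hadoop-2.8.3");
        // ... then create the Phoenix/HBase connection as usual
    }
}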

Solution:

Download the winutils build that matches your server's Hadoop version.

About winutils: these are Windows binaries for Hadoop releases, built directly from the same git commit used to create the official ASF releases. They are checked out and built on a Windows VM dedicated purely to testing Hadoop/YARN applications on Windows. It is not a day-to-day machine, so it is isolated from drive-by/email security attacks.


Link: https://github.com/steveloughran/winutils

Since my Hadoop version is 2.8.3, the download address is: https://github.com/steveloughran ... er/hadoop-2.8.3/bin

Attachment: winutils.2.8.3.bin.zip (1.88 MB)

See also: how to clone or download a single folder of a GitHub repository: https://www.itsvse.com/thread-7086-1-1.html (Source: Architect_Programmer)
Set the environment variable %HADOOP_HOME% to point to the directory above the bin directory that contains winutils.exe.
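For example, assuming the downloaded bin folder is placed under C:\hadoop-2.8.3 (an example path), the expected layout is:

C:\hadoop-2.8.3
    └── bin
        ├── winutils.exe
        ├── hadoop.dll
        └── ... (the other files from the repository's bin folder)

%HADOOP_HOME% should then be set to C:\hadoop-2.8.3, not to the bin directory itself.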



Close IDEA, reopen the project, start it again, and the exception disappears.
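Once the project starts cleanly, a minimal JDBC smoke test can confirm the Phoenix connection end to end. This is a sketch under the assumption that the cluster's ZooKeeper quorum is reachable at master:2181; substitute your own quorum:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class PhoenixSmokeTest {
    public static void main(String[] args) throws Exception {
        // "master:2181" is an assumed ZooKeeper quorum; replace it
        // with the quorum of your own cluster.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:master:2181");
             ResultSet rs = conn.getMetaData().getTables(null, null, "%", null)) {
            // If HADOOP_HOME / hadoop.home.dir is resolved correctly,
            // driver initialization no longer throws the
            // FileNotFoundException, and this lists the Phoenix tables.
            while (rs.next()) {
                System.out.println(rs.getString("TABLE_NAME"));
            }
        }
    }
}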

Reference link: https://wiki.apache.org/hadoop/WindowsProblems



