Recently our company needed to use HBase to store a large amount of data, and our manager gave us a primer on the relevant concepts.
Environment:

System: CentOS Linux release 7.5.1804 (Core)

Java version (output of `java -version`):
java version "1.8.0_191"
Java(TM) SE Runtime Environment (build 1.8.0_191-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.191-b12, mixed mode)

Hadoop version (output of `hadoop version`):
Hadoop 2.8.3
Subversion (URL omitted) -r b3fe56402d908019d99af1f1f4fc65cb1d1436a2
Compiled by jdu on 2017-12-05T03:43Z
Compiled with protoc 2.5.0
From source with checksum 9ff4856d824e983fa510d3f843e3f19d
This command was run using /home/itsvse/apache/hadoop-2.8.3/share/hadoop/common/hadoop-common-2.8.3.jar

HBase version (output of `hbase version`):
2.1.1, rb60a92d6864ef27295027f5961cb46f9162d7637, Fri Oct 26 19:27:03 PDT 2018
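The version strings above can be reproduced with the standard version commands. This is a minimal sketch, assuming the JDK, Hadoop, and HBase launchers are on the PATH (otherwise invoke them from their respective bin directories, e.g. /home/itsvse/apache/hbase-2.1.1/bin/hbase):

```shell
# Print the installed Java version (note: java writes this to stderr)
java -version

# Print the Hadoop build information (version, revision, checksum)
hadoop version

# Print the HBase version string
hbase version
```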
hbase shell command:
The HBase shell can be started with the following commands (first locating the installation directory with find):
[root@master ~]# find / -name "hbase"
/home/itsvse/apache/hbase-2.1.1/docs/testapidocs/org/apache/hadoop/hbase
/home/itsvse/apache/hbase-2.1.1/docs/testapidocs/org/apache/hbase
/home/itsvse/apache/hbase-2.1.1/docs/testapidocs/src-html/org/apache/hadoop/hbase
/home/itsvse/apache/hbase-2.1.1/docs/testapidocs/src-html/org/apache/hbase
/home/itsvse/apache/hbase-2.1.1/docs/apidocs/org/apache/hadoop/hbase
/home/itsvse/apache/hbase-2.1.1/docs/apidocs/org/apache/hbase
/home/itsvse/apache/hbase-2.1.1/docs/apidocs/src-html/org/apache/hadoop/hbase
/home/itsvse/apache/hbase-2.1.1/docs/apidocs/src-html/org/apache/hbase
/home/itsvse/apache/hbase-2.1.1/bin/hbase
/home/itsvse/apache/hbase-2.1.1/lib/ruby/hbase
[root@master ~]# cd /home/itsvse/apache/hbase-2.1.1/bin/
[root@master bin]# ./hbase shell
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/itsvse/apache/hadoop-2.8.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/itsvse/apache/hbase-2.1.1/lib/client-facing-thirdparty/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
HBase Shell
Use "help" to get list of supported commands.
Use "exit" to quit this interactive shell.
For Reference, please visit: http://hbase.apache.org/2.0/book.html#shell
Version 2.1.1, rb60a92d6864ef27295027f5961cb46f9162d7637, Fri Oct 26 19:27:03 PDT 2018
Took 0.0050 seconds
hbase(main):001:0> version
2.1.1, rb60a92d6864ef27295027f5961cb46f9162d7637, Fri Oct 26 19:27:03 PDT 2018
Took 0.0006 seconds
hbase(main):002:0>
Introduction
After Hadoop is installed and started, run the jps command to check whether the expected processes are running (not tested here).
[hadoop@master ~]$jps
The master node should show: NameNode, SecondaryNameNode, and ResourceManager (JobTracker in Hadoop 1.x).
The slave1 node should show: DataNode and NodeManager (TaskTracker in Hadoop 1.x).
The slave2 node should show: DataNode and NodeManager (TaskTracker in Hadoop 1.x).
Each HBase version supports only certain Hadoop versions; consult the compatibility matrix in the HBase reference guide (S = supported, X = not supported, NT = not tested).
Some basic HBase shell commands are listed below:

| Operation | Command expression |
| --- | --- |
| List existing tables | list |
| Create a table | create 'table name', 'column family 1', 'column family 2', ..., 'column family N' |
| Add a record | put 'table name', 'row key', 'column family:qualifier', 'value' |
| View a record | get 'table name', 'row key' |
| Count the records in a table | count 'table name' |
| Delete a record | delete 'table name', 'row key', 'column family:qualifier' |
| Delete a table | The table must be disabled first: step 1 is disable 'table name', step 2 is drop 'table name' |
| View all records | scan 'table name' |
| View all data in one column family | scan 'table name', {COLUMNS => 'column family'} |
| Update a record | Simply put the new value again; the latest write overwrites the old one |
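The commands in the table can be strung together into a short interactive session. The sketch below is illustrative only: it assumes a running HBase cluster, and the table name `test_table`, column family `cf`, and row key `row1` are made-up names for the example.

```shell
# Inside the hbase shell (start it with ./hbase shell from the bin directory):

create 'test_table', 'cf'                       # create a table with one column family
put 'test_table', 'row1', 'cf:name', 'itsvse'   # insert one cell
get 'test_table', 'row1'                        # read the row back
count 'test_table'                              # count the rows in the table
scan 'test_table'                               # list every row
delete 'test_table', 'row1', 'cf:name'          # delete a single cell
disable 'test_table'                            # a table must be disabled before dropping
drop 'test_table'                               # remove the table
list                                            # confirm it is gone
```

Note the two-step delete of a table: drop refuses to run on an enabled table, which guards against removing a table that is still serving reads and writes.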
Understand
HDFS (Hadoop Distributed File System) provides the distributed storage layer, HBase is a data storage project built on top of Hadoop, and Hive is used for data analysis.
(End)