Teacher Ma Yumin's Blog


QQ:65242847

Hive tutorial: Using SQL for the first time

Show databases

show databases;

Note: every statement must end with a semicolon (;).

Show tables

show tables;

Create a table

create table t_user (id string,username string,password string);

Notes:

  • user cannot be used as a table name

  • column types are similar to Java types

Hadoop web management page

Visit: http://hadoop1:9870/explorer.html#/

Default storage path: /user/hive/warehouse/<table name>
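As a rough illustration of that path rule (assuming the default warehouse root /user/hive/warehouse and the default database; tables in other databases live under a <db>.db subdirectory), the HDFS directory for a table can be derived from its name:

```python
# Sketch: derive the HDFS directory where Hive stores a managed table.
# Assumes the default warehouse root; the helper name is hypothetical.
WAREHOUSE_ROOT = "/user/hive/warehouse"

def table_path(table, db="default"):
    """Return the HDFS warehouse directory for a managed Hive table."""
    if db == "default":
        return f"{WAREHOUSE_ROOT}/{table}"
    # non-default databases get their own <db>.db directory
    return f"{WAREHOUSE_ROOT}/{db}.db/{table}"

print(table_path("t_user"))  # /user/hive/warehouse/t_user
```

This is why the t_user table created above appears at /user/hive/warehouse/t_user in the file browser.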

insert

insert into t_user(id,username,password) values (1,'lilei','123456');

Execution output:

Query ID = root_20210310114036_ff58bed7-aefd-4513-a3c5-799df25a29b5
Total jobs = 3
Launching Job 1 out of 3
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
Starting Job = job_1615348585747_0001, Tracking URL = http://localhost:8088/proxy/application_1615348585747_0001/
Kill Command = /devtools/hadoop-3.0.3/bin/mapred job  -kill job_1615348585747_0001
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1
2021-03-10 11:56:43,723 Stage-1 map = 0%,  reduce = 0%
2021-03-10 11:56:51,091 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 1.81 sec
2021-03-10 11:56:57,310 Stage-1 map = 100%,  reduce = 100%, Cumulative CPU 2.84 sec
MapReduce Total cumulative CPU time: 2 seconds 840 msec
Ended Job = job_1615348585747_0001
Stage-4 is selected by condition resolver.
Stage-3 is filtered out by condition resolver.
Stage-5 is filtered out by condition resolver.
Moving data to directory hdfs://0.0.0.0:9000/user/hive/warehouse/t_user/.hive-staging_hive_2021-03-10_11-40-36_802_7176574248944213326-1/-ext-10000
Loading data to table default.t_user
MapReduce Jobs Launched:
Stage-Stage-1: Map: 1  Reduce: 1   Cumulative CPU: 2.84 sec   HDFS Read: 16121 HDFS Write: 286 SUCCESS
Total MapReduce CPU Time Spent: 2 seconds 840 msec
OK
Time taken: 983.177 seconds

select query

select * from t_user;

Hadoop web management page

Visit: http://localhost:9870

You can download the file; its contents are exactly the inserted data.

Download the data and view it

The character between the fields is not a Chinese character but the control character with octal code \001, which text editors cannot display normally. In other words, the column delimiter is \001.
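A minimal sketch of what that means: in Python string notation, \001 is '\x01', so splitting a line of the downloaded file on that character recovers the columns of the row inserted above.

```python
# Sketch: parse one line of the file Hive wrote for t_user.
# Hive's default column delimiter is the non-printable character \001
# (octal), written '\x01' in Python.
line = "1\x01lilei\x01123456\n"  # what the downloaded file contains

fields = line.rstrip("\n").split("\x01")  # split columns on \001
print(fields)  # ['1', 'lilei', '123456']
```

This also explains why the file looks garbled in a text editor: the delimiter has no printable representation.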

Hive tutorial: delimiters


Original source: https://malaoshi.top/show_1IXjZdDBV1W.html