Data Warehouse Layering: the ODS Layer (Cross-Border E-commerce, Domestic Transit Warehouse)



I. Introduction to the ODS Layer

1. Keep the data in its original form, with no modifications; the ODS layer doubles as a backup of the raw data.
2. Compress the data with LZO to reduce disk usage; 100 GB of raw data typically compresses to under 10 GB.
3. Create partitioned tables to avoid later full-table scans; partitioned tables are used heavily in production.
4. Create external tables; in production, apart from temporary tables for one's own use (which may be internal), the vast majority of tables are external.
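The internal-versus-external distinction in point 4 comes down to drop semantics. A minimal sketch (the table name here is hypothetical): dropping an external table removes only the metastore entry, whereas dropping an internal (managed) table would also delete the files under its location.

```sql
-- Hypothetical demo table; EXTERNAL means Hive does not own the files
CREATE EXTERNAL TABLE tmp_drop_demo (`line` string)
LOCATION '/warehouse/gmall/ods/tmp_drop_demo';

-- Removes only the table definition; the HDFS files under
-- /warehouse/gmall/ods/tmp_drop_demo survive and can be re-attached later
DROP TABLE tmp_drop_demo;
```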

II. User Behavior Data

1. Start-up log table ods_start_log

-- Create the start-up log table ods_start_log:
-- a partitioned table whose input is LZO, whose output is text,
-- and whose raw lines can be parsed as JSON
DROP TABLE IF EXISTS ods_start_log;
CREATE EXTERNAL TABLE ods_start_log (`line` string) 
PARTITIONED BY (`dt` string)
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
LOCATION '/warehouse/gmall/ods/ods_start_log';

-- Load data
load data inpath '/origin_data/gmall/log/topic_start/2020-03-10' into table gmall.ods_start_log partition(dt='2020-03-10');

# Create an index for the LZO-compressed files
hadoop jar /opt/module/hadoop-2.7.2/share/hadoop/common/hadoop-lzo-0.4.20.jar com.hadoop.compression.lzo.DistributedLzoIndexer /warehouse/gmall/ods/ods_start_log/dt=2020-03-10
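Once a partition is loaded, the raw JSON stored in `line` can be parsed directly in queries. A hedged sketch (`mid` is an illustrative field name, not guaranteed to exist in the actual log schema):

```sql
-- Illustrative only: extract one field from the raw JSON line
SELECT get_json_object(line, '$.mid') AS mid
FROM gmall.ods_start_log
WHERE dt = '2020-03-10'
LIMIT 10;
```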

2. Event log table ods_event_log

-- Create the event log table ods_event_log:
-- a partitioned table whose input is LZO, whose output is text,
-- and whose raw lines can be parsed as JSON
DROP TABLE IF EXISTS ods_event_log;
CREATE EXTERNAL TABLE ods_event_log(`line` string) 
PARTITIONED BY (`dt` string) 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
LOCATION '/warehouse/gmall/ods/ods_event_log';

-- Load data
load data inpath '/origin_data/gmall/log/topic_event/2020-03-10' into table gmall.ods_event_log partition(dt='2020-03-10'); 

# Create an index for the LZO-compressed files
hadoop jar /opt/module/hadoop-2.7.2/share/hadoop/common/hadoop-lzo-0.4.20.jar com.hadoop.compression.lzo.DistributedLzoIndexer /warehouse/gmall/ods/ods_event_log/dt=2020-03-10

3. ODS-layer user behavior data load script hdfs_to_ods_log.sh

#!/bin/bash 

# Variables defined up front for easy modification
APP=gmall 
hive=/opt/module/hive/bin/hive 

# If a date argument was given, use it; otherwise default to yesterday
if [ -n "$1" ] ;then 
do_date=$1 
else 
do_date=`date -d "-1 day" +%F` 
fi 

echo "=== log date: $do_date ==="

sql="
load data inpath '/origin_data/gmall/log/topic_start/$do_date' overwrite into table ${APP}.ods_start_log partition(dt='$do_date');
load data inpath '/origin_data/gmall/log/topic_event/$do_date' overwrite into table ${APP}.ods_event_log partition(dt='$do_date');
"
$hive -e "$sql" 

hadoop jar /opt/module/hadoop-2.7.2/share/hadoop/common/hadoop-lzo-0.4.20.jar com.hadoop.compression.lzo.DistributedLzoIndexer /warehouse/gmall/ods/ods_start_log/dt=$do_date
hadoop jar /opt/module/hadoop-2.7.2/share/hadoop/common/hadoop-lzo-0.4.20.jar com.hadoop.compression.lzo.DistributedLzoIndexer /warehouse/gmall/ods/ods_event_log/dt=$do_date 
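The argument-or-yesterday idiom the script uses can be isolated and checked on its own. A small sketch assuming GNU `date` (the function name is illustrative, not part of the original script):

```shell
#!/bin/bash
# Return the first argument if given, otherwise yesterday's date (GNU date)
get_do_date() {
  if [ -n "$1" ]; then
    echo "$1"
  else
    date -d "-1 day" +%F
  fi
}

get_do_date 2020-03-10   # prints 2020-03-10
get_do_date              # prints yesterday's date in YYYY-MM-DD form
```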

4. Why create an index for LZO data?
By default, LZO files are not splittable; after an index is created they become splittable. When such a file is used as map input it can then be divided among multiple map tasks, whereas without the index the whole file must be read by a single map task.

III. Business Data

1. Order table ods_order_info (incremental and updated)

drop table if exists ods_order_info;
create external table ods_order_info ( 
	`id` string COMMENT 'order number', 
	`final_total_amount` decimal(10,2) COMMENT 'order amount', 
	`order_status` string COMMENT 'order status', 
	`user_id` string COMMENT 'user id', 
	`out_trade_no` string COMMENT 'payment transaction number', 
	`create_time` string COMMENT 'creation time', 
	`operate_time` string COMMENT 'operation time', 
	`province_id` string COMMENT 'province id', 
	`benefit_reduce_amount` decimal(10,2) COMMENT 'discount amount', 
	`original_total_amount` decimal(10,2) COMMENT 'original amount', 
	`feight_fee` decimal(10,2) COMMENT 'shipping fee' 
) COMMENT 'order table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_order_info/';

2. Order detail table ods_order_detail (incremental)

drop table if exists ods_order_detail; 
create external table ods_order_detail( 
	`id` string COMMENT 'id', 
	`order_id` string COMMENT 'order number', 
	`user_id` string COMMENT 'user id', 
	`sku_id` string COMMENT 'product id', 
	`sku_name` string COMMENT 'product name', 
	`order_price` decimal(10,2) COMMENT 'product price', 
	`sku_num` bigint COMMENT 'product quantity', 
	`create_time` string COMMENT 'creation time' 
) COMMENT 'order detail table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_order_detail/';

3. SKU product table ods_sku_info (full)

drop table if exists ods_sku_info; 
create external table ods_sku_info( 
	`id` string COMMENT 'sku id', 
	`spu_id` string COMMENT 'spu id', 
	`price` decimal(10,2) COMMENT 'price', 
	`sku_name` string COMMENT 'product name', 
	`sku_desc` string COMMENT 'product description', 
	`weight` string COMMENT 'weight', 
	`tm_id` string COMMENT 'brand id', 
	`category3_id` string COMMENT 'category id', 
	`create_time` string COMMENT 'creation time' 
) COMMENT 'SKU product table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
location '/warehouse/gmall/ods/ods_sku_info/';

4. User table ods_user_info (incremental and updated)

drop table if exists ods_user_info; 
create external table ods_user_info( 
	`id` string COMMENT 'user id', 
	`name` string COMMENT 'name', 
	`birthday` string COMMENT 'birthday', 
	`gender` string COMMENT 'gender', 
	`email` string COMMENT 'email', 
	`user_level` string COMMENT 'user level', 
	`create_time` string COMMENT 'creation time', 
	`operate_time` string COMMENT 'operation time' 
) COMMENT 'user table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_user_info/';

5. Product level-1 category table ods_base_category1 (full)

drop table if exists ods_base_category1; 
create external table ods_base_category1( 
	`id` string COMMENT 'id', 
	`name` string COMMENT 'name' 
) COMMENT 'product level-1 category table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_base_category1/';

6. Product level-2 category table ods_base_category2 (full)

drop table if exists ods_base_category2; 
create external table ods_base_category2( 
	`id` string COMMENT 'id', 
	`name` string COMMENT 'name', 
	`category1_id` string COMMENT 'level-1 category id' 
) COMMENT 'product level-2 category table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_base_category2/';

7. Product level-3 category table ods_base_category3 (full)

drop table if exists ods_base_category3; 
create external table ods_base_category3( 
	`id` string COMMENT 'id', 
	`name` string COMMENT 'name', 
	`category2_id` string COMMENT 'level-2 category id'
) COMMENT 'product level-3 category table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_base_category3/';

8. Payment flow table ods_payment_info (incremental)

drop table if exists ods_payment_info; 
create external table ods_payment_info( 
	`id` bigint COMMENT 'id', 
	`out_trade_no` string COMMENT 'external trade number', 
	`order_id` string COMMENT 'order number', 
	`user_id` string COMMENT 'user id', 
	`alipay_trade_no` string COMMENT 'Alipay transaction number', 
	`total_amount` decimal(16,2) COMMENT 'payment amount', 
	`subject` string COMMENT 'transaction description', 
	`payment_type` string COMMENT 'payment type', 
	`payment_time` string COMMENT 'payment time' 
) COMMENT 'payment flow table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_payment_info/';

9. Province table ods_base_province (special)

drop table if exists ods_base_province; 
create external table ods_base_province ( 
	`id` bigint COMMENT 'id', 
	`name` string COMMENT 'province name', 
	`region_id` string COMMENT 'region id', 
	`area_code` string COMMENT 'area code', 
	`iso_code` string COMMENT 'ISO code' 
) COMMENT 'province table' 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_base_province/';

10. Region table ods_base_region (special)

drop table if exists ods_base_region; 
create external table ods_base_region ( 
	`id` bigint COMMENT 'id', 
	`region_name` string COMMENT 'region name' 
) COMMENT 'region table' 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_base_region/';

11. Brand table ods_base_trademark (full)

drop table if exists ods_base_trademark; 
create external table ods_base_trademark ( 
	`tm_id` bigint COMMENT 'id', 
	`tm_name` string COMMENT 'brand name' 
) COMMENT 'brand table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_base_trademark/';

12. Order status table ods_order_status_log (incremental)

drop table if exists ods_order_status_log; 
create external table ods_order_status_log ( 
	`id` bigint COMMENT 'id', 
	`order_id` string COMMENT 'order id', 
	`order_status` string COMMENT 'order status', 
	`operate_time` string COMMENT 'modification time' 
) COMMENT 'order status table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_order_status_log/';

13. SPU product table ods_spu_info (full)

drop table if exists ods_spu_info; 
create external table ods_spu_info( 
	`id` string COMMENT 'spu id', 
	`spu_name` string COMMENT 'spu name', 
	`category3_id` string COMMENT 'category id', 
	`tm_id` string COMMENT 'brand id' 
) COMMENT 'SPU product table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_spu_info/';

14. Product review table ods_comment_info (incremental)

drop table if exists ods_comment_info; 
create external table ods_comment_info( 
	`id` string COMMENT 'id', 
	`user_id` string COMMENT 'user id', 
	`sku_id` string COMMENT 'product sku id', 
	`spu_id` string COMMENT 'product spu id', 
	`order_id` string COMMENT 'order id', 
	`appraise` string COMMENT 'rating', 
	`create_time` string COMMENT 'review time'
) COMMENT 'product review table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_comment_info/';

15. Order refund table ods_order_refund_info (incremental)

drop table if exists ods_order_refund_info; 
create external table ods_order_refund_info( 
	`id` string COMMENT 'id', 
	`user_id` string COMMENT 'user id', 
	`order_id` string COMMENT 'order id', 
	`sku_id` string COMMENT 'product id', 
	`refund_type` string COMMENT 'refund type', 
	`refund_num` bigint COMMENT 'refund quantity', 
	`refund_amount` decimal(16,2) COMMENT 'refund amount', 
	`refund_reason_type` string COMMENT 'refund reason type', 
	`create_time` string COMMENT 'refund time' 
) COMMENT 'order refund table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_order_refund_info/';

16. Cart table ods_cart_info (full)

drop table if exists ods_cart_info; 
create external table ods_cart_info( 
	`id` string COMMENT 'id', 
	`user_id` string COMMENT 'user id', 
	`sku_id` string COMMENT 'sku id', 
	`cart_price` string COMMENT 'price when added to cart', 
	`sku_num` string COMMENT 'quantity', 
	`sku_name` string COMMENT 'sku name (denormalized)', 
	`create_time` string COMMENT 'creation time', 
	`operate_time` string COMMENT 'modification time', 
	`is_ordered` string COMMENT 'whether already ordered', 
	`order_time` string COMMENT 'order time' 
) COMMENT 'cart table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_cart_info/';

17. Product favorites table ods_favor_info (full)

drop table if exists ods_favor_info; 
create external table ods_favor_info( 
	`id` string COMMENT 'id', 
	`user_id` string COMMENT 'user id', 
	`sku_id` string COMMENT 'sku id',
	`spu_id` string COMMENT 'spu id', 
	`is_cancel` string COMMENT 'whether canceled', 
	`create_time` string COMMENT 'favorite time', 
	`cancel_time` string COMMENT 'cancel time' 
) COMMENT 'product favorites table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_favor_info/';

18. Coupon usage table ods_coupon_use (new and changed)

drop table if exists ods_coupon_use; 
create external table ods_coupon_use( 
	`id` string COMMENT 'id', 
	`coupon_id` string COMMENT 'coupon id', 
	`user_id` string COMMENT 'user id', 
	`order_id` string COMMENT 'order id', 
	`coupon_status` string COMMENT 'coupon status', 
	`get_time` string COMMENT 'claim time', 
	`using_time` string COMMENT 'time used (order placed)', 
	`used_time` string COMMENT 'time used (paid)' 
) COMMENT 'coupon usage table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_coupon_use/';

19. Coupon table ods_coupon_info (full)

drop table if exists ods_coupon_info; 
create external table ods_coupon_info( 
	`id` string COMMENT 'coupon id', 
	`coupon_name` string COMMENT 'coupon name', 
	`coupon_type` string COMMENT 'coupon type: 1 cash voucher, 2 discount voucher, 3 spend-threshold voucher, 4 quantity-threshold discount voucher', 
	`condition_amount` string COMMENT 'amount threshold', 
	`condition_num` string COMMENT 'quantity threshold', 
	`activity_id` string COMMENT 'activity id', 
	`benefit_amount` string COMMENT 'discount amount', 
	`benefit_discount` string COMMENT 'discount rate', 
	`create_time` string COMMENT 'creation time', 
	`range_type` string COMMENT 'scope type: 1 product, 2 category, 3 brand', 
	`spu_id` string COMMENT 'product id', 
	`tm_id` string COMMENT 'brand id', 
	`category3_id` string COMMENT 'category id', 
	`limit_num` string COMMENT 'maximum number of claims', 
	`operate_time` string COMMENT 'modification time', 
	`expire_time` string COMMENT 'expiration time' 
) COMMENT 'coupon table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_coupon_info/';

20. Activity table ods_activity_info (full)

drop table if exists ods_activity_info; 
create external table ods_activity_info( 
	`id` string COMMENT 'id', 
	`activity_name` string COMMENT 'activity name', 
	`activity_type` string COMMENT 'activity type', 
	`start_time` string COMMENT 'start time', 
	`end_time` string COMMENT 'end time', 
	`create_time` string COMMENT 'creation time' 
) COMMENT 'activity table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_activity_info/';

21. Activity-order link table ods_activity_order (incremental)

drop table if exists ods_activity_order; 
create external table ods_activity_order( 
	`id` string COMMENT 'id', 
	`activity_id` string COMMENT 'activity id', 
	`order_id` string COMMENT 'order id', 
	`create_time` string COMMENT 'creation time' 
) COMMENT 'activity-order link table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_activity_order/';

22. Promotion rule table ods_activity_rule (full)

drop table if exists ods_activity_rule; 
create external table ods_activity_rule( 
	`id` string COMMENT 'id', 
	`activity_id` string COMMENT 'activity id', 
	`condition_amount` string COMMENT 'spend threshold', 
	`condition_num` string COMMENT 'quantity threshold', 
	`benefit_amount` string COMMENT 'discount amount', 
	`benefit_discount` string COMMENT 'discount rate', 
	`benefit_level` string COMMENT 'discount tier' 
) COMMENT 'promotion rule table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_activity_rule/';

23. Code dictionary table ods_base_dic (full)

drop table if exists ods_base_dic; 
create external table ods_base_dic(
	`dic_code` string COMMENT 'code', 
	`dic_name` string COMMENT 'code name', 
	`parent_code` string COMMENT 'parent code', 
	`create_time` string COMMENT 'creation date', 
	`operate_time` string COMMENT 'operation date' 
) COMMENT 'code dictionary table' 
PARTITIONED BY (`dt` string) 
row format delimited fields terminated by '\t' 
STORED AS 
	INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat' 
	OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' 
location '/warehouse/gmall/ods/ods_base_dic/';

24. ODS-layer business data load script hdfs_to_ods_db.sh

#!/bin/bash 
APP=gmall 
hive=/opt/module/hive/bin/hive 

# If a date argument was given, use it; otherwise default to yesterday
if [ -n "$2" ] ;then 
do_date=$2 
else 
do_date=`date -d "-1 day" +%F` 
fi 

sql1="
load data inpath '/origin_data/$APP/db/order_info/$do_date' OVERWRITE into table ${APP}.ods_order_info partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/order_detail/$do_date' OVERWRITE into table ${APP}.ods_order_detail partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/sku_info/$do_date' OVERWRITE into table ${APP}.ods_sku_info partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/user_info/$do_date' OVERWRITE into table ${APP}.ods_user_info partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/payment_info/$do_date' OVERWRITE into table ${APP}.ods_payment_info partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/base_category1/$do_date' OVERWRITE into table ${APP}.ods_base_category1 partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/base_category2/$do_date' OVERWRITE into table ${APP}.ods_base_category2 partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/base_category3/$do_date' OVERWRITE into table ${APP}.ods_base_category3 partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/base_trademark/$do_date' OVERWRITE into table ${APP}.ods_base_trademark partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/activity_info/$do_date' OVERWRITE into table ${APP}.ods_activity_info partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/activity_order/$do_date' OVERWRITE into table ${APP}.ods_activity_order partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/cart_info/$do_date' OVERWRITE into table ${APP}.ods_cart_info partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/comment_info/$do_date' OVERWRITE into table ${APP}.ods_comment_info partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/coupon_info/$do_date' OVERWRITE into table ${APP}.ods_coupon_info partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/coupon_use/$do_date' OVERWRITE into table ${APP}.ods_coupon_use partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/favor_info/$do_date' OVERWRITE into table ${APP}.ods_favor_info partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/order_refund_info/$do_date' OVERWRITE into table ${APP}.ods_order_refund_info partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/order_status_log/$do_date' OVERWRITE into table ${APP}.ods_order_status_log partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/spu_info/$do_date' OVERWRITE into table ${APP}.ods_spu_info partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/activity_rule/$do_date' OVERWRITE into table ${APP}.ods_activity_rule partition(dt='$do_date');
load data inpath '/origin_data/$APP/db/base_dic/$do_date' OVERWRITE into table ${APP}.ods_base_dic partition(dt='$do_date');
"
sql2="
load data inpath '/origin_data/$APP/db/base_province/$do_date' OVERWRITE into table ${APP}.ods_base_province;
load data inpath '/origin_data/$APP/db/base_region/$do_date' OVERWRITE into table ${APP}.ods_base_region;
"
case $1 in
"first"){ 
	$hive -e "$sql1"
	$hive -e "$sql2"
};; 
"all"){ 
	$hive -e "$sql1"
};; 
esac
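The first/all dispatch at the end of the script can be sketched in isolation. In this sketch the echoed strings stand in for the two $hive -e calls, and the function name is illustrative: "first" is meant for the initial load, which also fills the non-partitioned province and region tables; "all" covers routine daily loads.

```shell
#!/bin/bash
# Sketch of the dispatch: "first" also loads the non-partitioned
# province/region tables; "all" loads only the partitioned tables
run_mode() {
  case $1 in
  "first")
    echo "run sql1 (partitioned tables)"
    echo "run sql2 (base_province, base_region)";;
  "all")
    echo "run sql1 (partitioned tables)";;
  esac
}

run_mode first
run_mode all
```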

IV. ODS Layer Summary

1. Use LZO compression; it greatly reduces disk usage.
2. Keep the data in its original form, with fields matching the source data, so the layer serves as a backup.
3. Create partitioned tables to keep scans efficient.
4. Note each table's load type: full, incremental, incremental-and-updated (new and changed), or special.
5. Remember to create indexes for the LZO files so the data is splittable.
6. The ODS layer holds 2 user-behavior tables and 23 business tables, 25 tables in total.


Publisher: 全栈程序员-用户IM. Please credit the source when republishing: https://javaforall.cn/185406.html Original link: https://javaforall.cn
