Hadoop and Hive in Practice (2)

Hive is a data warehouse tool built on Hadoop. It maps structured data files to database tables and provides full SQL-style querying, translating SQL statements into MapReduce jobs for execution. Its main advantage is a low learning curve: simple MapReduce statistics can be expressed quickly in SQL-like statements, with no need to develop a dedicated MapReduce application, which makes it well suited to statistical analysis in a data warehouse.

Here I mainly use it to process and aggregate structured text data; the results are then loaded into a relational database for presentation to users.

Installation:

1. Download the Hive release with wget.

2. Extract it: tar -zxvf hive-0.7.0.tar.gz (watch the file ownership; if it does not match the hadoop user, change it).

3. Configure conf/hive-env.sh (copy it from conf/hive-env.sh.template first):

export HADOOP_HOME=/usr/local/hadoop-0.20.203.0

export HIVE_HOME=/usr/local/hadoop-0.20.203.0/hive-0.7.0

export JAVA_HOME=/usr/local/jdk1.6.0_25

4. Run bin/hive from the Hive directory to check that it works:

               Hive history file=/tmp/tmg/hive_job_log_tmg_201105241042_506926685.txt
               hive> show tables;
               OK
               Time taken: 1.482 seconds

That completes the installation. Easy, isn't it?
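The four steps above can be condensed into a shell sketch; the paths and version numbers are the ones used in this post, and the download URL is left out just as in step 1.

```shell
# Environment from the post (adjust to your own layout).
export HADOOP_HOME=/usr/local/hadoop-0.20.203.0
export HIVE_HOME=$HADOOP_HOME/hive-0.7.0
export JAVA_HOME=/usr/local/jdk1.6.0_25
# 1. wget <hive-0.7.0 tarball URL>   (URL omitted, as in step 1 above)
# 2. tar -zxvf hive-0.7.0.tar.gz     (fix ownership if it differs from the hadoop user)
# 3. put the three exports above into $HIVE_HOME/conf/hive-env.sh
# 4. cd "$HIVE_HOME" && bin/hive     (then run: show tables;)
echo "$HIVE_HOME"
```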

But the following points need some care:

1. Use MySQL for the metastore (switching the metastore to MySQL is what lets you run statements from the Hive command line even while hiveserver is running).
            To configure the MySQL metastore, change the following properties in hive-default.xml:
                javax.jdo.option.ConnectionURL
                jdbc:mysql://<host name>/<database name>?createDatabaseIfNotExist=true

                javax.jdo.option.ConnectionDriverName
                com.mysql.jdbc.Driver

                javax.jdo.option.ConnectionUserName
                <user name>

                javax.jdo.option.ConnectionPassword
                <password>

                hive.metastore.uris
                (not needed, because this is a local store)

                hive.metastore.local
                true

                hive.metastore.warehouse.dir
                <base hdfs path>
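In hive-default.xml each of these settings lives in a `<property>` element; a sketch of the corresponding entries, with the host, database, user, password, and warehouse path kept as placeholders exactly as above:

```xml
<!-- Sketch of hive-default.xml entries for a MySQL metastore.
     Placeholders (host name, etc.) are carried over from the list above. -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://&lt;host name&gt;/&lt;database name&gt;?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>&lt;user name&gt;</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>&lt;password&gt;</value>
</property>
<property>
  <name>hive.metastore.local</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>&lt;base hdfs path&gt;</value>
</property>
```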

2. After switching the metastore to MySQL, you may get:
               FAILED: Error in metadata: javax.jdo.JDOFatalDataStoreException: Access denied for user 'root'@'219.239.88.200' (using password: YES)
               NestedThrowables:
               java.sql.SQLException: Access denied for user 'root'@'219.239.88.200' (using password: YES)
               FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
               This happens because MySQL is refusing the remote connection. Run grant all on *.* to 'root'@'%' identified by 'root'; and the error goes away (in production you would want a dedicated, less-privileged metastore user rather than root).

3. Creating tables in Hive and loading data. There are two ways:

1) CREATE EXTERNAL TABLE page(LOGDATE STRING, REFERRER STRING, URL STRING, ARTICLEID STRING,
       TYPE STRING, VSCOOKIEID STRING, UVCOOKIEID STRING, IP STRING, OS STRING,
       BROWSER STRING, RESOLUTION STRING, COLOR STRING, FLASHVERSION STRING, PARM1 STRING,
       PARM2 STRING, PARM3 STRING, ISADD STRING)
   COMMENT 'This is the staging page view table'
   ROW FORMAT DELIMITED FIELDS TERMINATED BY '|~,'
   STORED AS TEXTFILE;

LOAD DATA LOCAL INPATH '/usr/local/hadoop-0.20.203.0/new.log' OVERWRITE INTO TABLE page;

With LOCAL, the file is read from the local Linux filesystem; without LOCAL, it is read from HDFS.
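The two LOAD variants side by side, as a sketch; the local path is the one used above, while the HDFS path is hypothetical, and actually executing either statement requires a running Hive.

```shell
# LOCAL: read from the Linux filesystem (path from the post).
local_load="LOAD DATA LOCAL INPATH '/usr/local/hadoop-0.20.203.0/new.log' OVERWRITE INTO TABLE page;"
# No LOCAL: read from HDFS (this path is a hypothetical example).
hdfs_load="LOAD DATA INPATH '/user/data/new.log' OVERWRITE INTO TABLE page;"
# With Hive running you would execute one of them, e.g.:
#   hive -e "$local_load"
echo "$hdfs_load"
```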

2) Simply put the file into the table's location:
                   CREATE EXTERNAL TABLE page_view_stg(viewTime INT, userid BIGINT,
                        page_url STRING, referrer_url STRING,
                        ip STRING COMMENT 'IP Address of the User',
                        country STRING COMMENT 'country of origination')
                   COMMENT 'This is the staging page view table'
                   ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
                   STORED AS TEXTFILE
                   LOCATION '/user/data/staging/page_view';

hadoop dfs -put /tmp/pv_2008-06-08.txt /user/data/staging/page_view      
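A quick way to sanity-check the comma-delimited layout before putting a file into HDFS: build one sample row (all values hypothetical) and split it the way the table definition would.

```shell
# One hypothetical row matching page_view_stg's column order:
# viewTime, userid, page_url, referrer_url, ip, country.
row='1212902000,42,http://example.com/page,http://example.com/ref,10.0.0.1,US'
printf '%s\n' "$row" > /tmp/pv_sample.txt
# Field 6 should come out as the country column.
country=$(awk -F',' '{print $6}' /tmp/pv_sample.txt)
echo "$country"
```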

More statement syntax can be found in the reference docs. Next time I will cover more Hive-on-Hadoop applications, so stay tuned!
