Hive throws java.lang.IllegalArgumentException: Wrong FS when executing a query

The full error when running a query SQL in Hive:

java.lang.IllegalArgumentException: Wrong FS: hdfs://node1:9000/user/hive/warehouse/test1.db/t1, expected: hdfs://cluster1
Cause: Hadoop was converted from a regular cluster to a high-availability (HA) cluster, but the warehouse location that Hive stores on HDFS was never updated in the Hive configuration.
Fix: in hive-site.xml, change the value of hive.metastore.warehouse.dir from the old hdfs://node1:9000/user/hive/warehouse to hdfs://cluster1/user/hive/warehouse
(here hdfs://cluster1 is the value of fs.defaultFS in Hadoop's core-site.xml)
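For reference, the HA nameservice is whatever core-site.xml declares as the default filesystem; a minimal fragment for this cluster's setup (the nameservice name cluster1 is taken from the post, the rest of the HA wiring lives in hdfs-site.xml) would look like:

```
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://cluster1</value>
</property>
```

The hive.metastore.warehouse.dir value must use this same nameservice, not a single NameNode's host:port.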
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>hdfs://cluster1/user/hive/warehouse</value>
  <description>location of default database for the warehouse</description>
</property>
After the change, restart the Hive service and test:
[root@node1 hive]# bin/hive
Create a new database (note that the database name itself does not take a .db suffix; Hive appends .db to the directory it creates on HDFS):

hive> create database hadoopha_test;
Once it is created, connect to the metastore database with SQLyog and inspect the DBS table in the hive database; the location recorded for the new entry is

hdfs://cluster1/user/hive/warehouse/hadoopha_test.db
Tables created in this database can now be queried normally.
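Only databases created after the change pick up the new location; rows for pre-existing databases and tables in the metastore (the DBS and SDS tables) may still point at hdfs://node1:9000. Rather than editing those rows by hand in SQLyog, Hive's metatool can rewrite the locations in bulk; a sketch of the invocation, assuming the old and new URIs from this cluster:

```
# Dry-run first to see what would change
bin/hive --service metatool -updateLocation hdfs://cluster1 hdfs://node1:9000 -dryRun

# Apply the update for real
bin/hive --service metatool -updateLocation hdfs://cluster1 hdfs://node1:9000
```

Run this while the metastore service is stopped or quiesced, and back up the metastore database beforehand.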
If Spark SQL is integrated with Hive, spark.sql.warehouse.dir must also be set to hdfs://cluster1/user/hive/warehouse when starting Spark SQL:
bin/spark-sql --master spark://node1:7077 \
--executor-memory 1g \
--total-executor-cores 2 \
--conf spark.sql.warehouse.dir=hdfs://cluster1/user/hive/warehouse
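To avoid passing the flag on every launch, the same setting can be made persistent in conf/spark-defaults.conf (a config sketch; restart the Spark SQL session afterwards for it to take effect):

```
spark.sql.warehouse.dir    hdfs://cluster1/user/hive/warehouse
```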