Problem description
When I try to load a CSV file from local Hadoop on the sandbox into a Hive table, the statement fails on this line:

LOCATION 'hdfs://sandBox-hdp.hortonworks.com:8020/user/maria_dev/practice';

with the following exception:

Error: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [hive] does not have [ALL] privilege on [hdfs://sandBox-hdp.hortonworks.com:8020/user/maria_dev/practice] (state=42000,code=40000)
CREATE TABLE Sales_transactions(
  Transaction_date DATE,
  Product STRING,
  Price FLOAT,
  Payment_Type STRING,
  Name STRING,
  City STRING,
  State STRING,
  Country STRING,
  Account_Created TIMESTAMP,
  Last_Login TIMESTAMP,
  Latitude FLOAT,
  Longitude FLOAT,
  Zip STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION 'hdfs://sandBox-hdp.hortonworks.com:8020/user/maria_dev/practice'; -- error points to this line
Solution
This is really a two-step process, and I think you missed step 1. (This assumes your user has all the appropriate access rights.)
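If the HiveAccessControlException itself is the blocker, the error says the hive user lacks privileges on the practice directory, so the directory's ownership/permissions may need adjusting first. A minimal sketch for the sandbox only, assuming you can act as the hdfs superuser (paths taken from the error message):

```shell
# Sandbox-only sketch: loosen permissions on the practice directory
# so the hive user can read/write it. Run as the hdfs superuser.
sudo -u hdfs hdfs dfs -chown -R maria_dev:hadoop /user/maria_dev/practice
sudo -u hdfs hdfs dfs -chmod -R 775 /user/maria_dev/practice
```

On an HDP sandbox with Ranger enabled, the equivalent fix is a Ranger policy granting the hive user access to that path; the commands above only cover plain HDFS permissions.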
Step 1 - Load the local file into the HDFS filesystem.
hdfs dfs -put /~/Sales_transactions.csv hdfs://sandbox-hdp.hortonworks.com:8020/user/maria_dev/practice
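Before moving on, it is worth confirming the upload landed; listing the target directory (same path as above) should show the CSV:

```shell
# List the target HDFS directory; Sales_transactions.csv should appear here.
hdfs dfs -ls hdfs://sandbox-hdp.hortonworks.com:8020/user/maria_dev/practice
```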
Step 2 - Then load the HDFS data from above into the table.
LOAD DATA INPATH 'hdfs://sandbox-hdp.hortonworks.com:8020/user/maria_dev/practice/Sales_transactions.csv' INTO TABLE myDB.Sales_transactions_table;
Alternatively, you can skip step 1 and load straight from the local filesystem:
LOAD DATA LOCAL INPATH '/~/Sales_transactions.csv' INTO TABLE mydb.Sales_transactions_table;
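Either way, a quick sanity check after loading confirms the rows arrived (table name as above; note that the non-LOCAL `LOAD DATA INPATH` moves the file into the table's location rather than copying it):

```sql
-- Confirm the rows were loaded into the table.
SELECT COUNT(*) FROM myDB.Sales_transactions_table;
```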