
user yarn needs the hdfs access when loading data?

Posted by Li Peng on Dec 27, 2016; 6:51am
URL: http://apache-carbondata-dev-mailing-list-archive.168.s1.nabble.com/user-yarn-needs-the-hdfs-access-when-loading-data-tp5082.html

Hi,
   
   I use the user "spark" to create the table and to run a Spark Streaming application.
   I'm confused about why the user "yarn" needs HDFS write access. If it does, I can't run the app as the spark user, only as the yarn user.



org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=yarn, access=WRITE, inode="/carbondata/carbonstore/default/sale/Metadata/schema":spark:hdfs:drwxr-xr-x



INFO  21-12 11:07:52,389 - ********starting clean up**********
WARN  21-12 11:07:52,442 - Exception while invoking ClientNamenodeProtocolTranslatorPB.delete over dpnode02/192.168.9.2:8020. Not retrying because try once and fail.
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=yarn, access=WRITE, inode="/carbondata/carbonstore/sale/sale/Fact/Part0/Segment_0":spark:hdfs:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
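The error happens because, with the default container executor, YARN launches containers as the "yarn" OS user rather than the submitting user, so writes to the spark-owned store path are denied. One possible workaround (a sketch, assuming HDFS ACLs are enabled on your cluster via dfs.namenode.acls.enabled=true, and using the store path from the logs above) is to grant the yarn user write access with an ACL:

```shell
# Sketch of a workaround, assuming dfs.namenode.acls.enabled=true in hdfs-site.xml.
# Grant the yarn user rwx on the existing CarbonData store tree:
hdfs dfs -setfacl -R -m user:yarn:rwx /carbondata/carbonstore

# Also set a default ACL so newly created segment/metadata directories
# inherit the same access for the yarn user:
hdfs dfs -setfacl -R -m default:user:yarn:rwx /carbondata/carbonstore

# Verify the ACL took effect:
hdfs dfs -getfacl /carbondata/carbonstore
```

Alternatively, configuring YARN's LinuxContainerExecutor so containers run as the submitting user ("spark") would avoid the user mismatch entirely, at the cost of more cluster setup.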



  Thanks