Babulal created CARBONDATA-2841:
-----------------------------------
Summary: S3 Not working for Update/Delete Command
Key: CARBONDATA-2841
URL: https://issues.apache.org/jira/browse/CARBONDATA-2841
Project: CarbonData
Issue Type: Bug
Reporter: Babulal
Configure the following S3A properties in spark-defaults.conf (a sample sketch follows the list):
spark.hadoop.fs.s3a.endpoint
spark.hadoop.fs.s3a.secret.key
spark.hadoop.fs.s3a.access.key
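A minimal spark-defaults.conf sketch; the endpoint and keys below are placeholder values, not taken from the original report:

    spark.hadoop.fs.s3a.endpoint   http://s3.example.com:9000
    spark.hadoop.fs.s3a.access.key <your-access-key>
    spark.hadoop.fs.s3a.secret.key <your-secret-key>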
Run an UPDATE or DELETE command on a table whose store path is on S3.
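For example, from spark-shell (the table name and columns are hypothetical; the table is assumed to have been created and loaded beforehand):

    // Hypothetical carbon table stored on S3, using CarbonData DML syntax.
    spark.sql("UPDATE carbon_s3_table SET (name) = ('updated') WHERE id = 1")
    spark.sql("DELETE FROM carbon_s3_table WHERE id = 2")

Either command then fails with the exception below.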
Caused by: java.io.InterruptedIOException: doesBucketExist on carbon: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.AmazonClientException: Unable to load credentials from Amazon EC2 metadata service
at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:125)
at org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:288)
at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:236)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2838)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2875)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2857)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:356)
at org.apache.carbondata.core.datastore.filesystem.AbstractDFSCarbonFile.getDataOutputStream(AbstractDFSCarbonFile.java:383)
at org.apache.carbondata.core.datastore.impl.FileFactory.getDataOutputStream(FileFactory.java:150)
at org.apache.carbondata.core.util.CarbonUtil.copyLocalFileToCarbonStore(CarbonUtil.java:2821)
at org.apache.carbondata.core.util.CarbonUtil.copyCarbonDataFileToCarbonStorePath(CarbonUtil.java:2790)
... 13 more
Caused by: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider Environ
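Per the trace, the S3AFileSystem created inside CarbonUtil.copyLocalFileToCarbonStore fails verifyBucketExists with no credentials, which suggests the configured spark.hadoop.fs.s3a.* values are not reaching this code path. A quick diagnostic sketch from spark-shell (assuming the spark.hadoop.* prefix should have propagated the keys into the driver's Hadoop configuration):

    // Check whether the s3a settings are present in the Hadoop configuration
    // held by the driver; printed values should match spark-defaults.conf.
    val hc = spark.sparkContext.hadoopConfiguration
    println(hc.get("fs.s3a.endpoint"))
    println(hc.get("fs.s3a.access.key"))
    // If these print the configured values, the credentials are being dropped
    // on the copy path shown in the trace rather than missing from the session.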