> Query carbondata table from presto shows error like 'Partition location does not exist: hdfs://xxx/hive_carbon3-__PLACEHOLDER__'
> --------------------------------------------------------------------------------------------------------------------------------
>
> Key: CARBONDATA-3550
> URL: https://issues.apache.org/jira/browse/CARBONDATA-3550
> Project: CarbonData
> Issue Type: Task
> Affects Versions: 1.6.0
> Reporter: wupeng
> Priority: Major
>
> Hello everyone, I'm a newbie to CarbonData. I read the documentation and I think you people are doing a really great job, CarbonData is awesome!
>
> I'm using CarbonData-1.6.0-rc3. I followed the hive guide on the CarbonData official site, and I can create a carbondata table, load data into it in spark-shell, and query it there.
> But when I query the table in Presto, this error occurs:
> {code:java}
> presto:default> select * from hive_carbon3;
> Query 20191018_055826_00071_j94n7 failed: Partition location does not exist: hdfs://mycluster/apps/hive/warehouse/hive_carbon3-__PLACEHOLDER__ {code}
> I'm using prestosql-315, so I migrated the presto-carbondata connector from prestodb-0.217.
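>
> For reference, my catalog configuration for the migrated connector roughly follows the presto guide and looks something like this (the metastore host/port below are placeholders, not my real values):
> {code:none}
> # etc/catalog/carbondata.properties (placeholder values)
> connector.name=carbondata
> hive.metastore.uri=thrift://<metastore-host>:9083
> {code}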
>
> Below is how I create the table and load the data:
> {code:scala}
> import org.apache.spark.sql.SparkSession
> import org.apache.spark.sql.CarbonSession._
> val rootPath = "hdfs://mycluster/apps/hive"
> val storeLocation = s"$rootPath/store"
> val warehouse = s"$rootPath/warehouse"
> val metaStoreDB = s"$rootPath/metastore_db"
> val carbon = SparkSession.builder()
>   .enableHiveSupport()
>   .config("spark.sql.warehouse.dir", warehouse)
>   .config(org.apache.carbondata.core.constants.CarbonCommonConstants.STORE_LOCATION, storeLocation)
>   .getOrCreateCarbonSession(storeLocation, metaStoreDB)
>
> carbon.sql("CREATE TABLE IF NOT EXISTS hive_carbon3(id int, name string, scale decimal, country string, salary double) STORED BY 'carbondata'")
> carbon.sql("LOAD DATA INPATH 'hdfs://mycluster/tmp/sample-hive_carbon.csv' INTO TABLE hive_carbon3")
> carbon.sql("SELECT * FROM hive_carbon3").show()
> {code}
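>
> In case it helps with diagnosing the location mismatch, I can also check what location the metastore has registered for the table from the same spark-shell, e.g.:
> {code:scala}
> // print the table's detailed metadata, including its Location, as stored in the metastore
> carbon.sql("DESCRIBE FORMATTED hive_carbon3").show(100, false)
> {code}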
>
> Please help me, thank you very much!