[jira] [Updated] (CARBONDATA-657) We are not able to create table with shared dictionary columns in spark 2.1

Akash R Nilugal (Jira)

     [ https://issues.apache.org/jira/browse/CARBONDATA-657?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Payal updated CARBONDATA-657:
-----------------------------
    Summary: We are not able to create table with shared dictionary columns in spark 2.1  (was: We are not able to create table with shared dictionary columns)

> We are not able to create table with shared dictionary columns in spark 2.1
> ---------------------------------------------------------------------------
>
>                 Key: CARBONDATA-657
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-657
>             Project: CarbonData
>          Issue Type: Bug
>          Components: sql
>    Affects Versions: 1.0.0-incubating
>         Environment: Spark-2.1
>            Reporter: Payal
>            Priority: Minor
>
> We are not able to create a table with shared dictionary columns. This works fine with Spark 1.6 but fails with Spark 2.1.
> 0: jdbc:hive2://hadoop-master:10000> CREATE TABLE uniq_shared_dictionary (
>     CUST_ID int, CUST_NAME String, ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp,
>     BIGINT_COLUMN1 bigint, BIGINT_COLUMN2 bigint,
>     DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),
>     Double_COLUMN1 double, Double_COLUMN2 double, INTEGER_COLUMN1 int)
>   STORED BY 'org.apache.carbondata.format'
>   TBLPROPERTIES(
>     'DICTIONARY_INCLUDE'='CUST_ID,Double_COLUMN2,DECIMAL_COLUMN2',
>     'columnproperties.CUST_ID.shared_column'='shared.CUST_ID',
>     'columnproperties.decimal_column2.shared_column'='shared.decimal_column2');
> Error: org.apache.carbondata.spark.exception.MalformedCarbonCommandException: Invalid table properties columnproperties.cust_id.shared_column (state=,code=0)
> LOGS
> ERROR 18-01 13:31:18,147 - Error executing query, currentState RUNNING,
> org.apache.carbondata.spark.exception.MalformedCarbonCommandException: Invalid table properties columnproperties.cust_id.shared_column
> at org.apache.carbondata.spark.util.CommonUtil$$anonfun$validateTblProperties$1.apply(CommonUtil.scala:141)
> at org.apache.carbondata.spark.util.CommonUtil$$anonfun$validateTblProperties$1.apply(CommonUtil.scala:137)
> at scala.collection.Iterator$class.foreach(Iterator.scala:893)
> at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
> at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
> at org.apache.carbondata.spark.util.CommonUtil$.validateTblProperties(CommonUtil.scala:137)
> at org.apache.spark.sql.parser.CarbonSqlAstBuilder.visitCreateTable(CarbonSparkSqlParser.scala:135)
> at org.apache.spark.sql.parser.CarbonSqlAstBuilder.visitCreateTable(CarbonSparkSqlParser.scala:60)
> at org.apache.spark.sql.catalyst.parser.SqlBaseParser$CreateTableContext.accept(SqlBaseParser.java:503)
> at org.antlr.v4.runtime.tree.AbstractParseTreeVisitor.visit(AbstractParseTreeVisitor.java:42)
> at org.apache.spark.sql.catalyst.parser.AstBuilder$$anonfun$visitSingleStatement$1.apply(AstBuilder.scala:66)
> at org.apache.spark.sql.catalyst.parser.AstBuilder$$anonfun$visitSingleStatement$1.apply(AstBuilder.scala:66)
> at org.apache.spark.sql.catalyst.parser.ParserUtils$.withOrigin(ParserUtils.scala:93)
> at org.apache.spark.sql.catalyst.parser.AstBuilder.visitSingleStatement(AstBuilder.scala:65)
> at org.apache.spark.sql.catalyst.parser.AbstractSqlParser$$anonfun$parsePlan$1.apply(ParseDriver.scala:54)
> at org.apache.spark.sql.catalyst.parser.AbstractSqlParser$$anonfun$parsePlan$1.apply(ParseDriver.scala:53)
> at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:82)
> at org.apache.spark.sql.parser.CarbonSparkSqlParser.parse(CarbonSparkSqlParser.scala:45)
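For context, below is a minimal, hypothetical Scala sketch of the kind of TBLPROPERTIES key validation that can raise this error. It is not the actual CarbonData code path (the stack trace points at CommonUtil.validateTblProperties, whose real implementation may differ); the object name, the whitelist contents, and the regex are illustrative assumptions. The point is that if the Spark 2.1 parser path validates property keys against a fixed set of exact names, dynamic column-level keys such as columnproperties.<column>.shared_column fall through and are reported as invalid, matching the behaviour above; a pattern-based check is one way to accept them.

    // Hypothetical sketch only; not the actual CarbonData implementation.
    object TblPropertyCheck {

      // Illustrative subset of exact property names a validator might whitelist.
      private val supported =
        Set("dictionary_include", "dictionary_exclude", "no_inverted_index")

      // Pattern that accepts dynamic column-level keys of the form
      // columnproperties.<column>.shared_column in addition to the whitelist.
      private val columnPropertyPattern =
        "columnproperties\\..+\\.shared_column".r

      // Returns the property keys that this validator would flag as invalid.
      def invalidKeys(tblProperties: Map[String, String]): Seq[String] =
        tblProperties.keys.toSeq.filterNot { key =>
          val k = key.toLowerCase
          supported.contains(k) || columnPropertyPattern.pattern.matcher(k).matches()
        }

      def main(args: Array[String]): Unit = {
        val props = Map(
          "dictionary_include" -> "CUST_ID,Double_COLUMN2,DECIMAL_COLUMN2",
          "columnproperties.cust_id.shared_column" -> "shared.CUST_ID"
        )
        // Prints List(): the pattern-based check accepts the shared_column key.
        println(invalidKeys(props))
      }
    }

Running the sketch prints an empty list because the pattern-based check accepts the shared_column key; dropping columnPropertyPattern from the filter reproduces the rejection reported on Spark 2.1, while the whitelisted keys (e.g. DICTIONARY_INCLUDE) still pass.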

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)