Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1601 SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2058/
In reply to this post by qiuchenjian-2
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1601 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2059/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1601 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1671/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1601 Build Failed with Spark 2.2, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/398/
Github user geetikagupta16 commented on the issue:
https://github.com/apache/carbondata/pull/1601 retest this please
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1601 Build Success with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/407/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1601 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1679/
Github user jackylk commented on the issue:
https://github.com/apache/carbondata/pull/1601 Please describe this PR: what validation is added?
Github user geetikagupta16 commented on the issue:
https://github.com/apache/carbondata/pull/1601 @jackylk In this PR, I have added validation for the options in **TBLPROPERTIES** for the **CREATE TABLE** command, so whenever an invalid option is passed, an error is thrown.
Github user kunal642 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1601#discussion_r155712770

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/parser/CarbonSparkSqlParser.scala ---
@@ -232,6 +233,30 @@ class CarbonHelperSqlAstBuilder(conf: SQLConf, parser: CarbonSpark2SqlParser)
       CarbonCreateTableCommand(tableModel, tablePath)
     }

+  private def validateTableProperties(properties: Map[String, String]): Map[String, String] = {
+    var isSupported = true
+    val invalidOptions = StringBuilder.newBuilder
+    val tableProperties = Seq("DICTIONARY_INCLUDE", "DICTIONARY_EXCLUDE", "NO_INVERTED_INDEX",
+      "SORT_COLUMNS", "TABLE_BLOCKSIZE", "STREAMING", "SORT_SCOPE", "COMMENT", "PARTITION_TYPE",
+      "NUM_PARTITIONS", "RANGE_INFO", "LIST_INFO", "BUCKETNUMBER", "BUCKETCOLUMNS", "TABLENAME")
+    val tblProperties: Map[String, String] = properties.filter { property =>
+      if (!(tableProperties.exists(prop => prop.equalsIgnoreCase(property._1))
--- End diff --

We can just iterate over the map and check whether any of the properties are invalid; there is no need to create a new tblProperties map. Instead of filter, use collect to get the invalid properties and check if any property is returned.
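A minimal sketch of the collect-based approach the reviewer suggests: instead of filtering the valid properties into a new map, collect only the invalid keys and fail fast if any are found. The object and method names here are illustrative, not the names used in the PR.

```scala
object TablePropertyValidation {
  // Supported TBLPROPERTIES keys, as listed in the diff above
  val supportedProperties: Seq[String] = Seq(
    "DICTIONARY_INCLUDE", "DICTIONARY_EXCLUDE", "NO_INVERTED_INDEX",
    "SORT_COLUMNS", "TABLE_BLOCKSIZE", "STREAMING", "SORT_SCOPE", "COMMENT", "PARTITION_TYPE",
    "NUM_PARTITIONS", "RANGE_INFO", "LIST_INFO", "BUCKETNUMBER", "BUCKETCOLUMNS", "TABLENAME")

  def validate(properties: Map[String, String]): Unit = {
    // collect keeps only the keys the partial function matches, i.e. the invalid ones,
    // so there is no need to build a second property map
    val invalid = properties.keys.collect {
      case key if !supportedProperties.exists(_.equalsIgnoreCase(key)) => key
    }
    if (invalid.nonEmpty) {
      throw new IllegalArgumentException(
        s"Invalid table properties: ${invalid.mkString(", ")}")
    }
  }
}
```

With this shape, a valid map such as `Map("sort_columns" -> "a", "table_blocksize" -> "128")` passes silently, while an unknown key such as `include_dictionary` raises an error.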
Github user kunal642 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1601#discussion_r155713494

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/parser/CarbonSparkSqlParser.scala ---
@@ -190,7 +191,7 @@ class CarbonHelperSqlAstBuilder(conf: SQLConf, parser: CarbonSpark2SqlParser)
     }
     val tableProperties = mutable.Map[String, String]()
-    properties.foreach{property => tableProperties.put(property._1, property._2)}
+    validatedProperties.foreach{property => tableProperties.put(property._1, property._2)}
--- End diff --

I think we can remove this code. After this we will extract the values, so there is no point in creating a mutable.Map. Can you please confirm this?
Github user kunal642 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1601#discussion_r155713651

--- Diff: integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/V3offheapvectorTestCase.scala ---
@@ -35,7 +35,7 @@ class V3offheapvectorTestCase extends QueryTest with BeforeAndAfterAll {
   //Check query reponse for select * query with no filters
   test("V3_01_Query_01_033", Include) {
     dropTable("3lakh_uniqdata")
-    sql(s"""CREATE TABLE 3lakh_uniqdata (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'carbondata' TBLPROPERTIES('table_blocksize'='128','include_dictionary'='BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1,CUST_ID')""").collect
+    sql(s"""CREATE TABLE 3lakh_uniqdata (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'carbondata' TBLPROPERTIES('table_blocksize'='128','dictionary_include'='BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1,CUST_ID')""").collect
--- End diff --

Revert the unnecessary changes.
Github user geetikagupta16 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1601#discussion_r155718131 (same diff in V3offheapvectorTestCase.scala as above)

In this change I have updated the TBLPROPERTIES option to dictionary_include.
Github user kunal642 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1601#discussion_r155719835 (same diff in V3offheapvectorTestCase.scala as above)

Okay.
Github user geetikagupta16 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1601#discussion_r155735141 (same diff in CarbonSparkSqlParser.scala as above)

Removing the mutable Map will lead to changes in the parameters of other methods. Should I work on that?
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1601 Build Success with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/618/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1601 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1846/
Github user geetikagupta16 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1601#discussion_r155749994 (same diff in CarbonSparkSqlParser.scala as above)

Can't remove the mutable Map, as it is being updated in the prepareTableModel method.
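For context on why the mutable.Map stays: if downstream code only read the properties, an immutable `validatedProperties.toMap` would suffice, but a method like prepareTableModel also adds entries after the map is built, which requires in-place updates. A minimal sketch of the pattern under discussion, with an illustrative helper name (the real code lives in CarbonHelperSqlAstBuilder):

```scala
import scala.collection.mutable

// Hypothetical helper mirroring the builder code quoted above: copy the
// validated properties into a mutable.Map so that later steps (such as
// prepareTableModel in the PR) can still put() additional entries into it.
def buildProperties(validated: Seq[(String, String)]): mutable.Map[String, String] = {
  val tableProperties = mutable.Map[String, String]()
  validated.foreach { case (key, value) => tableProperties.put(key, value) }
  tableProperties
}
```

A later step can then mutate the result, e.g. `buildProperties(props).put("sort_scope", "LOCAL_SORT")`, which an immutable Map would not allow.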
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1601 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2215/
Github user kunal642 commented on the issue:
https://github.com/apache/carbondata/pull/1601 We will not validate the CREATE TABLE properties, as the user can define his own properties as well. Please close this PR.
Free forum by Nabble