Github user chenerlu commented on the issue:
https://github.com/apache/carbondata/pull/1321
please retest this
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1321
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/117/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1321
Build Failed with Spark 1.6, Please check CI http://144.76.159.231:8080/job/ApacheCarbonPRBuilder1/14/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1321
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/122/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1321
Build Failed with Spark 1.6, Please check CI http://144.76.159.231:8080/job/ApacheCarbonPRBuilder1/20/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1321
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/123/
Github user chenerlu commented on the issue:
https://github.com/apache/carbondata/pull/1321
please retest this
Github user chenerlu commented on the issue:
https://github.com/apache/carbondata/pull/1321
retest this please
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1321
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/124/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1321
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/740/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1321
Build Success with Spark 1.6, Please check CI http://144.76.159.231:8080/job/ApacheCarbonPRBuilder1/21/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1321
Build Failed with Spark 1.6, Please check CI http://144.76.159.231:8080/job/ApacheCarbonPRBuilder1/22/
Github user chenliang613 commented on the issue:
https://github.com/apache/carbondata/pull/1321
retest this please
Github user chenliang613 commented on the issue:
https://github.com/apache/carbondata/pull/1321
retest this please
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1321
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/128/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1321
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/129/
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1321#discussion_r138830621

--- Diff: integration/spark/src/main/scala/org/apache/spark/sql/execution/command/carbonTableSchema.scala ---
@@ -172,6 +173,13 @@ case class CreateTable(cm: TableModel) extends RunnableCommand {
     val tableInfo: TableInfo = TableNewProcessor(cm)
+    // Add validation for sort scope when create table
+    val sortScope = tableInfo.getFactTable.getTableProperties.get("sort_scope")
+    if (null != sortScope && !CarbonUtil.isValidSortOption(sortScope)) {
--- End diff --

`null != sortScope` can be removed; it is checked inside `CarbonUtil.isValidSortOption`.
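For illustration, a minimal sketch of the simplified check this comment suggests, assuming `CarbonUtil.isValidSortOption` already returns false for a null argument; the exception message here is only a placeholder:
```
val sortScope = tableInfo.getFactTable.getTableProperties.get("sort_scope")
if (!CarbonUtil.isValidSortOption(sortScope)) {
  // isValidSortOption is assumed to handle null, so no explicit null check is needed
  throw new InvalidConfigurationException("Invalid sort_scope value: " + sortScope)
}
```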
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1321#discussion_r138830655

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/carbonTableSchema.scala ---
@@ -344,6 +345,13 @@ case class CreateTable(cm: TableModel, createDSTable: Boolean = true) extends Ru
     val tableInfo: TableInfo = TableNewProcessor(cm)
+    // Add validation for sort scope when create table
+    val sortScope = tableInfo.getFactTable.getTableProperties.get("sort_scope")
+    if (null != sortScope && !CarbonUtil.isValidSortOption(sortScope)) {
--- End diff --

`null != sortScope` can be removed; it is checked inside `CarbonUtil.isValidSortOption`.
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1321#discussion_r138831288

--- Diff: integration/spark/src/main/scala/org/apache/spark/sql/execution/command/carbonTableSchema.scala ---
@@ -172,6 +173,13 @@ case class CreateTable(cm: TableModel) extends RunnableCommand {
     val tableInfo: TableInfo = TableNewProcessor(cm)
+    // Add validation for sort scope when create table
+    val sortScope = tableInfo.getFactTable.getTableProperties.get("sort_scope")
+    if (null != sortScope && !CarbonUtil.isValidSortOption(sortScope)) {
+      throw new InvalidConfigurationException("The sort scope " + sortScope
--- End diff --

Suggest changing the message to:
```
s"Passing invalid sort scope '$sortScope', valid sort scopes are 'NO_SORT', 'BATCH_SORT', 'LOCAL_SORT' or 'GLOBAL_SORT'"
```
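For illustration, a sketch of the validation with the suggested message wording applied (and with the explicit null check dropped, as suggested in the earlier comment); only the names from the quoted diff are used, the surrounding context is assumed:
```
val sortScope = tableInfo.getFactTable.getTableProperties.get("sort_scope")
if (!CarbonUtil.isValidSortOption(sortScope)) {
  throw new InvalidConfigurationException(
    s"Passing invalid sort scope '$sortScope', valid sort scopes are " +
    "'NO_SORT', 'BATCH_SORT', 'LOCAL_SORT' or 'GLOBAL_SORT'")
}
```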
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1321#discussion_r138832339

--- Diff: integration/spark/src/main/scala/org/apache/spark/sql/execution/command/carbonTableSchema.scala ---
@@ -432,7 +440,9 @@ case class LoadTable(
     val dateFormat = options.getOrElse("dateformat", null)
     ValidateUtil.validateDateFormat(dateFormat, table, tableName)
     val maxColumns = options.getOrElse("maxcolumns", null)
-    val sortScope = options.getOrElse("sort_scope", null)
+    val tableProperties = table.getTableInfo.getFactTable.getTableProperties
+    val sortScope = if (null == tableProperties) null else tableProperties.get("sort_scope")
--- End diff --

If tableProperties is null, it needs to use the default sort scope, right?
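For illustration, a sketch of the fallback the reviewer is asking about; the default constant name used here is an assumption and may differ in the codebase:
```
val tableProperties = table.getTableInfo.getFactTable.getTableProperties
val sortScope =
  if (tableProperties == null || tableProperties.get("sort_scope") == null) {
    // Assumed name for the default sort scope constant; actual name/value may differ
    CarbonCommonConstants.LOAD_SORT_SCOPE_DEFAULT
  } else {
    tableProperties.get("sort_scope")
  }
```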