Github user ajantha-bhat commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2966#discussion_r241642755

--- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonWriterBuilder.java ---
@@ -501,6 +502,17 @@ public CarbonLoadModel buildLoadModel(Schema carbonSchema)
     }
     // for the longstring field, change the datatype from string to varchar
     this.schema = updateSchemaFields(carbonSchema, longStringColumns);
+    if (sortColumns != null && sortColumns.length != 0) {
+      if (options == null || options.get("sort_scope") == null) {
+        // If sort_columns are specified and sort_scope is not specified,
+        // change sort scope to local_sort as now by default sort scope is no_sort.
+        if (options == null) {
+          options = new HashMap<>();
+        }
+        // TODO: add in carbon property instead of load options
--- End diff ---

There are test cases that set the sort scope only in carbon properties, not the sort columns; those test cases fail because the table property I set has higher priority than carbon properties.

---
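The diff quoted above only fills in a sort scope when the user has not chosen one. A minimal standalone sketch of that defaulting rule (hypothetical class and method names, not the CarbonData API):

```java
import java.util.HashMap;
import java.util.Map;

public class SortScopeDefaulting {
  // Sketch of the rule in the diff above: when sort_columns are given but no
  // sort_scope option is set, fall back to "local_sort"; an explicit user
  // choice is never overwritten.
  static Map<String, String> applyDefault(String[] sortColumns, Map<String, String> options) {
    if (sortColumns != null && sortColumns.length != 0) {
      if (options == null) {
        options = new HashMap<>();
      }
      // Only fill in the default when the key is absent.
      options.putIfAbsent("sort_scope", "local_sort");
    }
    return options;
  }

  public static void main(String[] args) {
    Map<String, String> opts = applyDefault(new String[] {"name"}, null);
    System.out.println(opts.get("sort_scope")); // prints "local_sort"

    // An explicitly chosen scope is preserved:
    Map<String, String> explicit = new HashMap<>();
    explicit.put("sort_scope", "global_sort");
    System.out.println(applyDefault(new String[] {"name"}, explicit).get("sort_scope"));
  }
}
```

The `putIfAbsent` call is what keeps the default from clobbering a user-supplied scope, which is the behavior the null checks in the real diff encode by hand.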
Github user ajantha-bhat commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2966#discussion_r241642790

--- Diff: integration/spark2/src/test/scala/org/apache/spark/carbondata/restructure/AlterTableValidationTestCase.scala ---
@@ -523,7 +523,8 @@ class AlterTableValidationTestCase extends Spark2QueryTest with BeforeAndAfterAl
     }
   }
-  test("describe formatted for default sort_columns pre and post alter") {
+  // after changing default sort_scope to no_sort, all dimensions are not selected for sorting.
+  ignore("describe formatted for default sort_columns pre and post alter") {
--- End diff ---

ok.

---
Github user ajantha-bhat commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2966#discussion_r241643126

--- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonWriterBuilder.java ---
@@ -501,6 +502,17 @@ public CarbonLoadModel buildLoadModel(Schema carbonSchema)
     }
     // for the longstring field, change the datatype from string to varchar
     this.schema = updateSchemaFields(carbonSchema, longStringColumns);
+    if (sortColumns != null && sortColumns.length != 0) {
+      if (options == null || options.get("sort_scope") == null) {
+        // If sort_columns are specified and sort_scope is not specified,
+        // change sort scope to local_sort as now by default sort scope is no_sort.
+        if (options == null) {
+          options = new HashMap<>();
+        }
+        // TODO: add in carbon property instead of load options
--- End diff ---

For example, sort_columns are mentioned in table properties, but sort_scope is present as "Batch_sort" in carbon properties; in that case, if I set it in table properties, the test case fails. Moreover, they set the carbon properties after creating the table!

---
Github user ajantha-bhat commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2966#discussion_r241646014

--- Diff: integration/spark-common/src/main/scala/org/apache/spark/sql/execution/command/carbonTableSchemaCommon.scala ---
@@ -848,6 +848,19 @@ class TableNewProcessor(cm: TableModel) {
       tableSchema.getTableId,
       cm.databaseNameOp.getOrElse("default"))
     tablePropertiesMap.put("bad_record_path", badRecordsPath)
+    if (tablePropertiesMap.get("sort_columns") != null) {
+      val sortCol = tablePropertiesMap.get("sort_columns")
+      if ((!sortCol.trim.isEmpty) && tablePropertiesMap.get("sort_scope") == null) {
+        // If sort_scope is not specified, but sort_columns are present, set sort_scope as
+        // local_sort in carbon_properties (cannot add in table properties as if user sets carbon
+        // properties it won't be reflected as table properties is given higher priority)
+        if (CarbonProperties.getInstance().getProperty(CarbonCommonConstants.LOAD_SORT_SCOPE) ==
+            null) {
+          CarbonProperties.getInstance()
+            .addProperty(CarbonCommonConstants.LOAD_SORT_SCOPE, "LOCAL_SORT")
--- End diff ---

There are test cases that set the sort scope only in carbon properties, not the sort columns; those test cases fail because the table property I set has higher priority than carbon properties. For example, sort_columns are mentioned in table properties, but sort_scope is present as "Batch_sort" in carbon properties; in that case, if I set it in table properties, the test case fails. Moreover, they set the carbon properties after creating the table!

---
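The precedence problem described above can be sketched in a few lines (hypothetical class, method, and map keys; not CarbonData code): a sort_scope baked into table properties at create time shadows any carbon property the user sets afterwards, which is why tests that configured the scope only via carbon properties broke.

```java
import java.util.HashMap;
import java.util.Map;

public class SortScopePrecedence {
  // Resolution order under discussion: table property wins over carbon
  // property, and "no_sort" is the new system default.
  static String resolveSortScope(Map<String, String> tableProps, Map<String, String> carbonProps) {
    if (tableProps.get("sort_scope") != null) {
      return tableProps.get("sort_scope"); // table property shadows everything
    }
    if (carbonProps.get("sort_scope") != null) {
      return carbonProps.get("sort_scope"); // consulted only if no table property
    }
    return "no_sort";
  }

  public static void main(String[] args) {
    Map<String, String> tableProps = new HashMap<>();
    Map<String, String> carbonProps = new HashMap<>();
    // Writing "local_sort" into table properties at create time...
    tableProps.put("sort_scope", "local_sort");
    // ...hides a carbon property the user sets after creating the table:
    carbonProps.put("sort_scope", "batch_sort");
    System.out.println(resolveSortScope(tableProps, carbonProps)); // prints "local_sort"
  }
}
```

This is why the change was moved back to table properties only when the user actually asked for it, rather than silently writing a default that outranks later carbon-property settings.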
Github user ajantha-bhat commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2966#discussion_r241646130

--- Diff: core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---
@@ -2265,7 +2265,8 @@ public static int compareValues(byte[] filterValue, byte[] minMaxBytes,
       defaultValue = FilterUtil
           .getMaskKey(key, currentBlockDimension, segmentProperties.getSortColumnsGenerator());
     } else {
-      defaultValue = ByteUtil.toXorBytes(key);
+      defaultValue = FilterUtil
+          .getMaskKey(key, currentBlockDimension, segmentProperties.getDimensionKeyGenerator());
--- End diff ---

It is not the same code: one branch uses sortColumnsGenerator and the other uses DimensionKeyGenerator.

---
Github user ajantha-bhat commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2966#discussion_r241647087

--- Diff: core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java ---
@@ -61,7 +61,35 @@
     }
   };
-  private static final ThreadLocal<DateFormat> dateformatter = new ThreadLocal<DateFormat>() {
+  private static ThreadLocal<String> timeStampformatterString = new ThreadLocal<String>() {
+    @Override protected String initialValue() {
+      return CarbonProperties.getInstance()
+          .getProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT,
+              CarbonCommonConstants.CARBON_TIMESTAMP_DEFAULT_FORMAT);
+    }
+  };
+
+  private static void updateTimeStamp() {
--- End diff ---

This fix is related to [CARBONDATA-3163], after making empty sort_columns with no_sort scope the default. FilterProcessorTestCase.test("Between filter") failed because the thread-local time format was not updated for the second table when the two tables had different formats.

---
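The stale-formatter bug described above follows a general pattern: a per-thread cached object built from a configurable pattern must be rebuilt when the pattern changes between tables. A minimal sketch of that fix (hypothetical class and method names, not the DataTypeUtil API):

```java
import java.text.SimpleDateFormat;

public class CachedTimestampFormatter {
  // Cache the pattern string per thread alongside the formatter, and rebuild
  // the formatter whenever the configured pattern differs from the cached one
  // (e.g. a second table with a different timestamp format). Reusing a stale
  // thread-local formatter is exactly the failure mode described above.
  private static final ThreadLocal<String> cachedPattern = new ThreadLocal<>();
  private static final ThreadLocal<SimpleDateFormat> formatter = new ThreadLocal<>();

  static SimpleDateFormat get(String configuredPattern) {
    if (!configuredPattern.equals(cachedPattern.get())) {
      cachedPattern.set(configuredPattern);
      formatter.set(new SimpleDateFormat(configuredPattern));
    }
    return formatter.get();
  }

  public static void main(String[] args) {
    SimpleDateFormat first = get("yyyy-MM-dd");
    System.out.println(get("yyyy-MM-dd") == first);    // prints "true" (cache hit, same object)
    System.out.println(get("dd-MM-yyyy").toPattern()); // prints "dd-MM-yyyy" (rebuilt)
  }
}
```

Keeping the pattern string in its own ThreadLocal gives a cheap equality check on every access, instead of assuming the formatter created at thread start stays valid for the thread's lifetime.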
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2966

Build Success with Spark 2.1.0, Please check CI
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1753/

---
Github user ajantha-bhat commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2966#discussion_r241656526

--- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonWriterBuilder.java ---
@@ -501,6 +502,17 @@ public CarbonLoadModel buildLoadModel(Schema carbonSchema)
     }
     // for the longstring field, change the datatype from string to varchar
     this.schema = updateSchemaFields(carbonSchema, longStringColumns);
+    if (sortColumns != null && sortColumns.length != 0) {
+      if (options == null || options.get("sort_scope") == null) {
+        // If sort_columns are specified and sort_scope is not specified,
+        // change sort scope to local_sort as now by default sort scope is no_sort.
+        if (options == null) {
+          options = new HashMap<>();
+        }
+        // TODO: add in carbon property instead of load options
--- End diff ---

ok, changed back to table properties.

---
Github user ajantha-bhat commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2966#discussion_r241656542

--- Diff: integration/spark2/src/test/scala/org/apache/spark/carbondata/restructure/AlterTableValidationTestCase.scala ---
@@ -523,7 +523,8 @@ class AlterTableValidationTestCase extends Spark2QueryTest with BeforeAndAfterAl
     }
   }
-  test("describe formatted for default sort_columns pre and post alter") {
+  // after changing default sort_scope to no_sort, all dimensions are not selected for sorting.
+  ignore("describe formatted for default sort_columns pre and post alter") {
--- End diff ---

ok, changed back to table properties.

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2966

Build Success with Spark 2.2.1, Please check CI
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1964/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2966

Build Success with Spark 2.3.2, Please check CI
http://136.243.101.176:8080/job/carbondataprbuilder2.3/10012/

---
Github user ajantha-bhat commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2966#discussion_r241676753

--- Diff: integration/spark2/src/test/scala/org/apache/spark/carbondata/restructure/AlterTableValidationTestCase.scala ---
@@ -523,7 +523,8 @@ class AlterTableValidationTestCase extends Spark2QueryTest with BeforeAndAfterAl
     }
   }
-  test("describe formatted for default sort_columns pre and post alter") {
+  // after changing default sort_scope to no_sort, all dimensions are not selected for sorting.
+  ignore("describe formatted for default sort_columns pre and post alter") {
--- End diff ---

There is already one more test case just below doing the same thing, so this one is ignored.

---
Github user ajantha-bhat commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2966#discussion_r241677372

--- Diff: integration/spark-common/src/main/scala/org/apache/spark/sql/execution/command/carbonTableSchemaCommon.scala ---
@@ -848,6 +848,19 @@ class TableNewProcessor(cm: TableModel) {
       tableSchema.getTableId,
       cm.databaseNameOp.getOrElse("default"))
     tablePropertiesMap.put("bad_record_path", badRecordsPath)
+    if (tablePropertiesMap.get("sort_columns") != null) {
+      val sortCol = tablePropertiesMap.get("sort_columns")
+      if ((!sortCol.trim.isEmpty) && tablePropertiesMap.get("sort_scope") == null) {
+        // If sort_scope is not specified, but sort_columns are present, set sort_scope as
+        // local_sort in carbon_properties (cannot add in table properties as if user sets carbon
+        // properties it won't be reflected as table properties is given higher priority)
+        if (CarbonProperties.getInstance().getProperty(CarbonCommonConstants.LOAD_SORT_SCOPE) ==
+            null) {
+          CarbonProperties.getInstance()
+            .addProperty(CarbonCommonConstants.LOAD_SORT_SCOPE, "LOCAL_SORT")
--- End diff ---

ok, changed back to table properties.

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2966

Build Success with Spark 2.1.0, Please check CI
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1763/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2966

Build Failed with Spark 2.2.1, Please check CI
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1975/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2966

Build Failed with Spark 2.3.2, Please check CI
http://136.243.101.176:8080/job/carbondataprbuilder2.3/10024/

---
Github user ajantha-bhat commented on the issue:
https://github.com/apache/carbondata/pull/2966

retest this please

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2966

Build Success with Spark 2.1.0, Please check CI
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1783/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2966

Build Failed with Spark 2.2.1, Please check CI
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1996/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2966

Build Failed with Spark 2.3.2, Please check CI
http://136.243.101.176:8080/job/carbondataprbuilder2.3/10043/

---