GitHub user ajantha-bhat opened a pull request:

https://github.com/apache/carbondata/pull/3055

[HOTFIX] Presto cannot read dictionary-include decimal columns

Problem: a decimal column with dictionary include cannot be read in Presto.
Cause: the int surrogate is typecast to decimal for dictionary columns in the decimal stream reader.
Solution: keep the original data type as well as the new data type in the decimal stream reader.

Be sure to do all of the following checklist to help us incorporate your contribution quickly and easily:

- [ ] Any interfaces changed? NA
- [ ] Any backward compatibility impacted? NA
- [ ] Document update required? NA
- [ ] Testing done. Done
- [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA. NA

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ajantha-bhat/carbondata issue_fix

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/carbondata/pull/3055.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

This closes #3055

----

commit 205189d121f80aa87598a6c5f5e34562036c03c5
Author: ajantha-bhat <ajanthabhat@...>
Date: 2019-01-07T09:20:11Z

dictionary include decimal column type cast issue

Problem: a decimal column with dictionary include cannot be read in Presto.
Cause: the int surrogate is typecast to decimal for dictionary columns.
Solution: keep the original data type as well as the new data type in the decimal slice reader.

----

---
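As a reading aid, here is a minimal, self-contained Java sketch of the fix described above: the reader keeps both the stored (dictionary/int) data type and the column's actual decimal type, and decodes the surrogate int through the dictionary instead of typecasting it. All class, enum, and field names below are illustrative and are not the actual CarbonData or Presto reader classes.

```java
import java.math.BigDecimal;

// Illustrative sketch only; not the real CarbonData SliceStreamReader/DecimalStreamReader.
public class DecimalStreamReaderSketch {
  enum SketchDataType { INT, DECIMAL }

  private final SketchDataType storedDataType;  // data type of the encoded page (INT for dictionary columns)
  private final SketchDataType actualDataType;  // the column's logical type (DECIMAL)
  private final String[] dictionary;            // surrogate key -> raw value (assumed layout)

  public DecimalStreamReaderSketch(SketchDataType storedDataType, SketchDataType actualDataType,
      String[] dictionary) {
    this.storedDataType = storedDataType;
    this.actualDataType = actualDataType;
    this.dictionary = dictionary;
  }

  // Decode one stored value into the column's logical type. Because both data
  // types are kept, an INT page for a DECIMAL column is recognised as holding
  // dictionary surrogates and looked up, rather than typecast to decimal.
  public BigDecimal decode(int storedValue) {
    if (storedDataType == SketchDataType.INT && actualDataType == SketchDataType.DECIMAL) {
      String raw = dictionary[storedValue];
      return raw == null ? null : new BigDecimal(raw);
    }
    return BigDecimal.valueOf(storedValue);
  }
}
```

For example, with dictionary[1] = "99.99", surrogate key 1 decodes to 99.99 instead of the bogus decimal value 1.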
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3055

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/2201/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3055

Build Success with Spark 2.3.2, Please check CI http://136.243.101.176:8080/job/carbondataprbuilder2.3/10458/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3055

Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/2418/

---
Github user ajantha-bhat commented on the issue:
https://github.com/apache/carbondata/pull/3055

@ravipesala: please check

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3055

Build Failed with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/2439/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3055

Build Failed with Spark 2.3.2, Please check CI http://136.243.101.176:8080/job/carbondataprbuilder2.3/10476/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3055

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/2219/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3055

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/2224/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3055

Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/2443/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3055

Build Success with Spark 2.3.2, Please check CI http://136.243.101.176:8080/job/carbondataprbuilder2.3/10481/

---
Github user ajantha-bhat commented on the issue:
https://github.com/apache/carbondata/pull/3055

@ravipesala: please check

---
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/3055#discussion_r246277337

--- Diff: integration/presto/src/main/java/org/apache/carbondata/presto/readers/SliceStreamReader.java ---
@@ -95,22 +105,14 @@ public SliceStreamReader(int batchSize, DataType dataType,
       dictOffsets[dictOffsets.length - 1] = size;
       dictionaryBlock = new VariableWidthBlock(dictionary.getDictionarySize(),
           Slices.wrappedBuffer(singleArrayDictValues), dictOffsets, Optional.of(nulls));
-      values = (int[]) ((CarbonColumnVectorImpl) getDictionaryVector()).getDataArray();
+      this.isLocalDict = true;
     }

-  @Override public void setBatchSize(int batchSize) {
+
--- End diff --

remove empty space

---
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/3055#discussion_r246277431

--- Diff: integration/presto/src/main/java/org/apache/carbondata/presto/readers/SliceStreamReader.java ---
@@ -142,5 +144,17 @@ public SliceStreamReader(int batchSize, DataType dataType,
   @Override public void reset() {
     builder = type.createBlockBuilder(null, batchSize);
+    this.isLocalDict = false;
+  }
+
+  @Override public void putInt(int rowId, int value) {
+    Object data = DataTypeUtil
--- End diff --

Does overriding putInt() directly cause any problem for local dictionary? How is local dictionary handled here?

---
Github user ajantha-bhat commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/3055#discussion_r246318548

--- Diff: integration/presto/src/main/java/org/apache/carbondata/presto/readers/SliceStreamReader.java ---
@@ -142,5 +144,17 @@ public SliceStreamReader(int batchSize, DataType dataType,
   @Override public void reset() {
     builder = type.createBlockBuilder(null, batchSize);
+    this.isLocalDict = false;
+  }
+
+  @Override public void putInt(int rowId, int value) {
+    Object data = DataTypeUtil
--- End diff --

putInt() will not be called in the local dictionary case, because setDictionary() itself fills the values array, so this change has no impact on local dictionary. The local dictionary UTs are present and still pass after these changes.

---
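A hedged sketch of the flow described in this reply, assuming (as stated above) that setDictionary() covers the local-dictionary path and that putInt() only ever receives global-dictionary surrogate keys. The class and member names are made up for illustration and are not the actual SliceStreamReader members.

```java
import java.math.BigDecimal;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Illustrative only; not the real org.apache.carbondata.presto.readers.SliceStreamReader.
public class SliceReaderFlowSketch {
  private boolean isLocalDict = false;
  private final Map<Integer, String> globalDictionary; // surrogate key -> raw value (assumed)
  private final List<BigDecimal> values = new ArrayList<>();

  public SliceReaderFlowSketch(Map<Integer, String> globalDictionary) {
    this.globalDictionary = globalDictionary;
  }

  // Local-dictionary path: the dictionary block supplies the values directly,
  // so putInt() is not called for these pages.
  public void setDictionary(Map<Integer, String> localDictionary) {
    this.isLocalDict = true;
    // ... build the dictionary block from localDictionary (omitted) ...
  }

  // Reset between pages, mirroring the flag reset discussed in the diff.
  public void reset() {
    values.clear();
    isLocalDict = false;
  }

  // Global-dictionary path: 'value' is a surrogate key, so it is decoded
  // through the dictionary instead of being typecast to a decimal.
  public void putInt(int rowId, int value) {
    if (isLocalDict) {
      return; // defensive; in the described flow this method is not reached for local dictionary
    }
    String raw = globalDictionary.get(value);
    values.add(raw == null ? null : new BigDecimal(raw));
  }
}
```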
Github user ajantha-bhat commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/3055#discussion_r246319596

--- Diff: integration/presto/src/main/java/org/apache/carbondata/presto/readers/SliceStreamReader.java ---
@@ -95,22 +105,14 @@ public SliceStreamReader(int batchSize, DataType dataType,
       dictOffsets[dictOffsets.length - 1] = size;
       dictionaryBlock = new VariableWidthBlock(dictionary.getDictionarySize(),
           Slices.wrappedBuffer(singleArrayDictValues), dictOffsets, Optional.of(nulls));
-      values = (int[]) ((CarbonColumnVectorImpl) getDictionaryVector()).getDataArray();
+      this.isLocalDict = true;
     }

-  @Override public void setBatchSize(int batchSize) {
+
--- End diff --

done

---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/3055

LGTM

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3055

Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/2458/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3055

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/2238/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3055

Build Success with Spark 2.3.2, Please check CI http://136.243.101.176:8080/job/carbondataprbuilder2.3/10495/

---