Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2269 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4578/
In reply to this post by qiuchenjian-2
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2269 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5799/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2269 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4645/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2269 SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4852/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2269 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4688/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2269 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5844/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2269 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4890/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2269 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5850/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2269 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4694/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2269 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4895/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2269 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5899/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2269 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4933/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2269 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5909/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2269 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4755/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2269 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4942/
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2269#discussion_r189220225

--- Diff: hadoop/src/main/java/org/apache/carbondata/hadoop/api/DistributableDataMapFormat.java ---
@@ -100,14 +103,18 @@ private static FilterResolverIntf getFilterExp(Configuration configuration) thro
     return new RecordReader<Void, ExtendedBlocklet>() {
       private Iterator<ExtendedBlocklet> blockletIterator;
       private ExtendedBlocklet currBlocklet;
+      private List<DataMap> dataMaps;

       @Override
       public void initialize(InputSplit inputSplit, TaskAttemptContext taskAttemptContext)
           throws IOException, InterruptedException {
-        DataMapDistributableWrapper distributable = (DataMapDistributableWrapper) inputSplit;
-        TableDataMap dataMap = DataMapStoreManager.getInstance()
+        distributable = (DataMapDistributableWrapper) inputSplit;
+        TableDataMap tableDataMap = DataMapStoreManager.getInstance()
             .getDataMap(table, distributable.getDistributable().getDataMapSchema());
-        List<ExtendedBlocklet> blocklets = dataMap.prune(distributable.getDistributable(),
-            dataMapExprWrapper.getFilterResolverIntf(distributable.getUniqueId()), partitions);
+        dataMaps = tableDataMap.getTableDataMaps(distributable.getDistributable());
--- End diff ---

I don't see the benefit of getting the datamaps out and closing them here; you can close them inside the datamap when prune is done. It should not be required to change the interface for this.
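The alternative ravipesala suggests can be sketched as follows: prune obtains its datamaps internally and closes them in a finally block, so nothing needs to be exposed through the interface. All types below are simplified stand-ins for the CarbonData classes (the real prune also takes a distributable, a filter resolver, and partitions), not the project's actual code.

```java
import java.util.ArrayList;
import java.util.List;

public class PruneCloseSketch {
  // Stand-in for CarbonData's DataMap: prunes to blocklet names, holds resources.
  interface DataMap {
    List<String> prune();
    void close();
  }

  // Records whether close() was called, so the cleanup guarantee is observable.
  static class TrackingDataMap implements DataMap {
    boolean closed = false;
    public List<String> prune() {
      List<String> r = new ArrayList<>();
      r.add("blocklet");
      return r;
    }
    public void close() { closed = true; }
  }

  // Sketch of a TableDataMap.prune(...) that owns the datamap lifecycle:
  // close() runs in finally, whether pruning succeeds or throws.
  static List<String> prune(List<DataMap> dataMaps) {
    List<String> blocklets = new ArrayList<>();
    try {
      for (DataMap dm : dataMaps) {
        blocklets.addAll(dm.prune());
      }
    } finally {
      for (DataMap dm : dataMaps) {
        dm.close();
      }
    }
    return blocklets;
  }

  public static void main(String[] args) {
    TrackingDataMap dm = new TrackingDataMap();
    List<DataMap> maps = new ArrayList<>();
    maps.add(dm);
    List<String> blocklets = prune(maps);
    System.out.println(blocklets.size() + " " + dm.closed); // 1 true
  }
}
```

With this shape the RecordReader keeps calling a single prune method and never sees the List<DataMap> at all.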
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2269 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4813/
Github user akashrn5 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2269#discussion_r189267374

--- Diff: hadoop/src/main/java/org/apache/carbondata/hadoop/api/DistributableDataMapFormat.java ---
@@ -100,14 +103,18 @@ private static FilterResolverIntf getFilterExp(Configuration configuration) thro
     return new RecordReader<Void, ExtendedBlocklet>() {
       private Iterator<ExtendedBlocklet> blockletIterator;
       private ExtendedBlocklet currBlocklet;
+      private List<DataMap> dataMaps;

       @Override
       public void initialize(InputSplit inputSplit, TaskAttemptContext taskAttemptContext)
           throws IOException, InterruptedException {
-        DataMapDistributableWrapper distributable = (DataMapDistributableWrapper) inputSplit;
-        TableDataMap dataMap = DataMapStoreManager.getInstance()
+        distributable = (DataMapDistributableWrapper) inputSplit;
+        TableDataMap tableDataMap = DataMapStoreManager.getInstance()
             .getDataMap(table, distributable.getDistributable().getDataMapSchema());
-        List<ExtendedBlocklet> blocklets = dataMap.prune(distributable.getDistributable(),
-            dataMapExprWrapper.getFilterResolverIntf(distributable.getUniqueId()), partitions);
+        dataMaps = tableDataMap.getTableDataMaps(distributable.getDistributable());
--- End diff ---

On task failure, or if prune itself fails, the datamaps will not be closed if we only close after prune; so the taskCompletionListener will take care of all of that, right?
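The mechanism akashrn5 is relying on can be illustrated with a small stub: a task-completion listener (as registered via Spark's TaskContext.addTaskCompletionListener) fires whether the task succeeds or throws, so cleanup registered there covers the failure path that a close-after-prune call would miss. The TaskContextStub below imitates that contract for illustration only; it is not Spark's API.

```java
import java.util.ArrayList;
import java.util.List;

public class CompletionListenerSketch {
  // Simplified stand-in for a task context that runs completion listeners.
  static class TaskContextStub {
    private final List<Runnable> listeners = new ArrayList<>();

    void addTaskCompletionListener(Runnable r) {
      listeners.add(r);
    }

    // The framework runs the task body, then ALWAYS fires the listeners,
    // even when the body throws (here the failure is swallowed for the demo).
    void runTask(Runnable task) {
      try {
        task.run();
      } catch (RuntimeException e) {
        // task failed; cleanup still happens below
      } finally {
        for (Runnable r : listeners) {
          r.run();
        }
      }
    }
  }

  public static void main(String[] args) {
    final boolean[] closed = {false};
    TaskContextStub ctx = new TaskContextStub();
    // Register cleanup up front, before any work that can fail.
    ctx.addTaskCompletionListener(() -> closed[0] = true);
    // Simulate prune throwing mid-task: the listener still runs.
    ctx.runTask(() -> { throw new RuntimeException("prune failed"); });
    System.out.println(closed[0]); // true
  }
}
```

The key point is ordering: the listener must be registered before prune is attempted, which is why initialize() stores the datamaps in a field instead of closing them inline.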
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2269 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5971/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2269 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4995/