Github user xubo245 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2290#discussion_r187872177

--- Diff: core/src/main/java/org/apache/carbondata/core/datamap/dev/expr/AndDataMapExprWrapper.java ---
@@ -59,6 +60,21 @@ public AndDataMapExprWrapper(DataMapExprWrapper left, DataMapExprWrapper right,
    return andBlocklets;
  }

+ @Override
+ public List<ExtendedBlocklet> prune(DataMapDistributable distributable,
--- End diff --

changed
In reply to this post by qiuchenjian-2
Github user xubo245 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2290#discussion_r187874345

--- Diff: core/src/main/java/org/apache/carbondata/core/datamap/dev/expr/AndDataMapExprWrapper.java ---
@@ -59,6 +60,21 @@ public AndDataMapExprWrapper(DataMapExprWrapper left, DataMapExprWrapper right,
    return andBlocklets;
  }

+ @Override
+ public List<ExtendedBlocklet> prune(DataMapDistributable distributable,
+     List<PartitionSpec> partitionsToPrune)
+     throws IOException {
+   List<ExtendedBlocklet> leftPrune = left.prune(distributable, partitionsToPrune);
+   List<ExtendedBlocklet> rightPrune = right.prune(distributable, partitionsToPrune);
+   List<ExtendedBlocklet> andBlocklets = new ArrayList<>();
+   for (ExtendedBlocklet blocklet : leftPrune) {
+     if (rightPrune.contains(blocklet)) {
--- End diff --

This method of the class is not used. Only this method is used: org.apache.carbondata.core.datamap.dev.expr.DataMapExprWrapperImpl#prune(org.apache.carbondata.core.datamap.DataMapDistributable, java.util.List<org.apache.carbondata.core.indexstore.PartitionSpec>)
Github user xubo245 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2290#discussion_r187874636

--- Diff: core/src/main/java/org/apache/carbondata/core/datamap/dev/expr/DataMapExprWrapper.java ---
@@ -40,6 +41,18 @@
  List<ExtendedBlocklet> prune(List<Segment> segments, List<PartitionSpec> partitionsToPrune)
      throws IOException;

+ /**
+  * prune blocklet according distributable
+  *
+  * @param distributable distributable
+  * @param partitionsToPrune partitions to prune
+  * @return the pruned ExtendedBlocklet list
+  * @throws IOException
+  */
+ List<ExtendedBlocklet> prune(DataMapDistributable distributable,
--- End diff --

ok
Github user xubo245 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2290#discussion_r187875618

--- Diff: core/src/main/java/org/apache/carbondata/core/readcommitter/LatestFilesReadCommittedScope.java ---
@@ -40,9 +40,26 @@ public class LatestFilesReadCommittedScope implements ReadCommittedScope {

  private String carbonFilePath;
+ private String segmentId;
  private ReadCommittedIndexFileSnapShot readCommittedIndexFileSnapShot;
  private LoadMetadataDetails[] loadMetadataDetails;

+ /**
+  * a new constructor of this class, which supports obtain lucene index in search mode
+  *
+  * @param path carbon file path
+  * @param segmentId segment id
+  */
+ public LatestFilesReadCommittedScope(String path, String segmentId) {
--- End diff --

I don't understand what you mean. This constructor adds segmentId for building the index path.
Github user xubo245 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2290#discussion_r187877018

--- Diff: core/src/main/java/org/apache/carbondata/core/scan/scanner/impl/BlockletFilterScanner.java ---
@@ -169,6 +169,10 @@ private BlockletScannedResult executeFilter(RawBlockletColumnChunks rawBlockletC
    // apply filter on actual data, for each page
    BitSetGroup bitSetGroup = this.filterExecuter.applyFilter(rawBlockletColumnChunks,
        useBitSetPipeLine);
+   // if bitSetGroup is nul, then new BitSetGroup object, which can avoid NPE
--- End diff --

It will throw an exception when using Lucene with search mode. I didn't check other scenarios.
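The null guard discussed in this comment can be sketched as follows. The `BitSetGroup` class below is a simplified stand-in for the CarbonData class of the same name (the real one wraps per-page `BitSet`s), written only to illustrate the pattern of substituting an empty group for a null filter result so the downstream page loop does not hit an NPE.

```java
// Simplified stand-in for org.apache.carbondata.core.util.BitSetGroup,
// used only to illustrate the null-guard pattern from the review comment.
public class BitSetGroupGuard {

    static class BitSetGroup {
        private final java.util.BitSet[] bitSets;
        BitSetGroup(int pageCount) { this.bitSets = new java.util.BitSet[pageCount]; }
        int getNumberOfPages() { return bitSets.length; }
    }

    // A filter may legitimately return null (e.g. when a Lucene FG datamap
    // already pruned everything in search mode); replacing null with an
    // empty group avoids an NPE in the per-page scanning loop that follows.
    static BitSetGroup guard(BitSetGroup fromFilter, int pageCount) {
        return fromFilter != null ? fromFilter : new BitSetGroup(pageCount);
    }
}
```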
Github user xubo245 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2290#discussion_r187878047

--- Diff: datamap/lucene/src/main/java/org/apache/carbondata/datamap/lucene/LuceneDataMapFactoryBase.java ---
@@ -168,7 +169,8 @@ public DataMapBuilder createBuilder(Segment segment, String shardName) {
        getAllIndexDirs(tableIdentifier.getTablePath(), segment.getSegmentNo());
    for (CarbonFile indexDir : indexDirs) {
      // Filter out the tasks which are filtered through CG datamap.
-     if (!segment.getFilteredIndexShardNames().contains(indexDir.getName())) {
+     if (getDataMapLevel() != DataMapLevel.FG &&
--- End diff --

The continue is designed for CG. FG can't skip it via the continue; otherwise it will lead to an error later.
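The level-dependent skip condition discussed above can be sketched like this. `DataMapLevel` and the method shape are simplified stand-ins for the CarbonData types; the point is that a shard is skipped only when the datamap is not FG and the CG pruning step did not select it, so an FG datamap always visits every shard.

```java
import java.util.List;

// Stand-in sketch of the skip condition from LuceneDataMapFactoryBase:
// skip (continue) a shard only for non-FG datamaps whose shard was not
// selected by the earlier CG pruning.
public class ShardFilter {

    enum DataMapLevel { CG, FG }

    static boolean shouldSkip(DataMapLevel level, List<String> filteredShards,
                              String shardName) {
        return level != DataMapLevel.FG && !filteredShards.contains(shardName);
    }
}
```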
Github user xubo245 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2290#discussion_r187878321

--- Diff: store/search/src/main/scala/org/apache/spark/rpc/Master.scala ---
@@ -234,7 +232,7 @@ class Master(sparkConf: SparkConf) {
      // if we have enough data already, we do not need to collect more result
      if (rowCount < globalLimit) {
        // wait for worker for 10s
--- End diff --

ok, thanks.
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2290

Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4709/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2290

Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5865/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2290

SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4906/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2290

Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4712/
Github user xubo245 commented on the issue:
https://github.com/apache/carbondata/pull/2290

retest this please
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2290

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5873/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2290

Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4720/
Github user xuchuanyin commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2290#discussion_r188127574

--- Diff: core/src/main/java/org/apache/carbondata/core/datamap/dev/expr/AndDataMapExprWrapper.java ---
@@ -59,6 +60,21 @@ public AndDataMapExprWrapper(DataMapExprWrapper left, DataMapExprWrapper right,
    return andBlocklets;
  }

+ @Override
+ public List<ExtendedBlocklet> prune(DataMapDistributable distributable,
+     List<PartitionSpec> partitionsToPrune)
+     throws IOException {
+   List<ExtendedBlocklet> leftPrune = left.prune(distributable, partitionsToPrune);
+   List<ExtendedBlocklet> rightPrune = right.prune(distributable, partitionsToPrune);
+   List<ExtendedBlocklet> andBlocklets = new ArrayList<>();
+   for (ExtendedBlocklet blocklet : leftPrune) {
+     if (rightPrune.contains(blocklet)) {
--- End diff --

I mean that `rightPrune.contains(blocklet)` will internally call the `equals` method of `ExtendedBlocklet`...
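The point of this comment, that the AND-pruning intersection via `List.contains` only works if `equals` (and `hashCode`) are overridden, can be illustrated with a minimal sketch. `Blocklet` below is a hypothetical stand-in for `ExtendedBlocklet`, not the real class:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;

// Minimal stand-in for ExtendedBlocklet, illustrating why the intersection
// in AndDataMapExprWrapper.prune depends on a proper equals() override.
public class BlockletIntersection {

    static class Blocklet {
        final String path;
        final String blockletId;
        Blocklet(String path, String blockletId) {
            this.path = path;
            this.blockletId = blockletId;
        }
        @Override public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof Blocklet)) return false;
            Blocklet b = (Blocklet) o;
            return path.equals(b.path) && blockletId.equals(b.blockletId);
        }
        @Override public int hashCode() { return Objects.hash(path, blockletId); }
    }

    // Same shape as the AND wrapper: keep the blocklets from the left child
    // that the right child also returned.
    static List<Blocklet> and(List<Blocklet> left, List<Blocklet> right) {
        List<Blocklet> out = new ArrayList<>();
        for (Blocklet b : left) {
            if (right.contains(b)) {  // delegates to Blocklet.equals
                out.add(b);
            }
        }
        return out;
    }
}
```

Without the `equals` override, `contains` would fall back to reference equality and the intersection of independently constructed prune results would always be empty.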
Github user xuchuanyin commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2290#discussion_r188127829

--- Diff: core/src/main/java/org/apache/carbondata/core/datamap/DataMapStoreManager.java ---
@@ -90,13 +90,18 @@ private DataMapStoreManager() {
      String dbName = carbonTable.getDatabaseName();
      String tableName = carbonTable.getTableName();
      String dmName = dataMap.getDataMapSchema().getDataMapName();
-     boolean isDmVisible = sessionInfo.getSessionParams().getProperty(
-         String.format("%s%s.%s.%s", CarbonCommonConstants.CARBON_DATAMAP_VISIBLE,
-             dbName, tableName, dmName), "true").trim().equalsIgnoreCase("true");
-     if (!isDmVisible) {
-       LOGGER.warn(String.format("Ignore invisible datamap %s on table %s.%s",
-           dmName, dbName, tableName));
-       dataMapIterator.remove();
+     if (sessionInfo != null) {
--- End diff --

So the parameters we set in the environment will not be used, such as querying on specified segments, invisible datamaps, and so on?
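The semantics being questioned here can be sketched as a null guard: when no session info is available (for example in search mode, outside a Spark session), the datamap is treated as visible rather than throwing an NPE. This is an illustrative stand-in; the method name and the property-key prefix are assumptions, not the actual `CarbonCommonConstants.CARBON_DATAMAP_VISIBLE` value.

```java
import java.util.Properties;

// Hypothetical sketch of the visibility check in DataMapStoreManager with
// the null guard from the diff; the key prefix below is illustrative only.
public class DataMapVisibility {

    static boolean isVisible(Properties sessionParams,
                             String db, String table, String dmName) {
        if (sessionParams == null) {
            return true;  // no session info -> visibility flags cannot apply
        }
        String key = String.format("carbon.datamap.visible.%s.%s.%s", db, table, dmName);
        // mirrors the original default-"true" + equalsIgnoreCase comparison
        return sessionParams.getProperty(key, "true").trim().equalsIgnoreCase("true");
    }
}
```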
Github user xuchuanyin commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2290#discussion_r188128386

--- Diff: core/src/main/java/org/apache/carbondata/core/readcommitter/LatestFilesReadCommittedScope.java ---
@@ -40,9 +40,26 @@ public class LatestFilesReadCommittedScope implements ReadCommittedScope {

  private String carbonFilePath;
+ private String segmentId;
  private ReadCommittedIndexFileSnapShot readCommittedIndexFileSnapShot;
  private LoadMetadataDetails[] loadMetadataDetails;

+ /**
+  * a new constructor of this class, which supports obtain lucene index in search mode
+  *
+  * @param path carbon file path
+  * @param segmentId segment id
+  */
+ public LatestFilesReadCommittedScope(String path, String segmentId) {
--- End diff --

I mean that this code can be reduced to

```
this(path);
this.segmentId = segmentId;
```

and you should move the next constructor (Line 65) above this constructor.
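The delegation suggested above can be sketched like this. `ReadScope` is a simplified stand-in for `LatestFilesReadCommittedScope` (the real single-argument constructor also takes the file snapshot), shown only to illustrate chaining via `this(...)` instead of duplicating the constructor body:

```java
// Stand-in sketch of the constructor-delegation suggestion: the
// two-argument constructor reuses the single-argument one via this(...).
public class ReadScope {
    private final String carbonFilePath;
    private String segmentId;

    public ReadScope(String path) {
        this.carbonFilePath = path;
    }

    // Suggested form: delegate, then set the extra field.
    public ReadScope(String path, String segmentId) {
        this(path);
        this.segmentId = segmentId;
    }

    public String getPath() { return carbonFilePath; }
    public String getSegmentId() { return segmentId; }
}
```

Note that in Java the textual order of constructors does not affect compilation; placing the delegated-to constructor first is purely a readability convention.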
Github user xubo245 commented on the issue:
https://github.com/apache/carbondata/pull/2290

@jackylk @QiangCai Please review it.
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2290

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5986/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2290

Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4828/