Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1446 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1631/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1446 SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1633/
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1446#discussion_r150401328

--- Diff: core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java ---
@@ -56,6 +56,8 @@
    */
  public static final String CARBON_INPUT_SEGMENTS = "carbon.input.segments.";
+  public static final String VALIDATE_CARBON_INPUT_SEGMENTS = "validate.carbon.input.segments.";
--- End diff --

What is the use of this property? Why is it only required in the case of an aggregate table?
Github user kunal642 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1446#discussion_r150401450

--- Diff: core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java ---
@@ -56,6 +56,8 @@
    */
  public static final String CARBON_INPUT_SEGMENTS = "carbon.input.segments.";
+  public static final String VALIDATE_CARBON_INPUT_SEGMENTS = "validate.carbon.input.segments.";
--- End diff --

This property decides whether the configured input segments have to be looked up and validated or not. For an incremental load into a pre-aggregate table only the last segment would be loaded, so we don't need to look up or validate the segment that has been set.
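To make the intent concrete, here is a minimal Scala sketch of how such a flag could be consulted; the object and method names are hypothetical, and only the property prefixes come from the diff above:

import java.util.Properties

object SegmentValidationCheck {
  // Keys are built per table, e.g. "validate.carbon.input.segments.default.maintable".
  def shouldValidateSegments(sessionProps: Properties, dbAndTable: String): Boolean = {
    val key = "validate.carbon.input.segments." + dbAndTable
    // Default to true: validate unless the caller explicitly disabled it, as the
    // pre-aggregate load path would after pointing carbon.input.segments.<db>.<table>
    // at only the newly created segment.
    Option(sessionProps.getProperty(key)).forall(_.toBoolean)
  }
}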
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1446#discussion_r150402085

--- Diff: hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonTableInputFormat.java ---
@@ -300,14 +302,30 @@ private AbsoluteTableIdentifier getAbsoluteTableIdentifier(Configuration configu
   @Override public List<InputSplit> getSplits(JobContext job) throws IOException {
     AbsoluteTableIdentifier identifier = getAbsoluteTableIdentifier(job.getConfiguration());
     SegmentUpdateStatusManager updateStatusManager = new SegmentUpdateStatusManager(identifier);
+    CarbonTable carbonTable = getOrCreateCarbonTable(job.getConfiguration());
+    if (null == carbonTable) {
+      throw new IOException("Missing/Corrupt schema file for table.");
+    }
+    CarbonSessionInfo carbonSessionInfo = ThreadLocalSessionInfo.getCarbonSessionInfo();
+    String aggregateTableSegments = "";
+    Boolean validateAggregateTableSegments = true;
+    if (carbonSessionInfo != null) {
+      aggregateTableSegments = carbonSessionInfo.getSessionParams().getProperty(
+          CarbonCommonConstants.CARBON_INPUT_SEGMENTS + identifier.getCarbonTableIdentifier()
--- End diff --

Better not to access the session here; just set whether validation is required from CarbonScanRDD, and use that boolean here to skip the validation.
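A rough sketch of the suggested split of responsibilities; the configuration key and helper names below are illustrative, not CarbonData's actual API:

import org.apache.hadoop.conf.Configuration

object SegmentValidationConf {
  private val Key = "carbon.validate.input.segments" // hypothetical conf key

  // CarbonScanRDD side: decide once, from the session, whether the configured
  // segments need validation, and record that decision in the job configuration.
  def setValidateSegments(conf: Configuration, validate: Boolean): Unit =
    conf.setBoolean(Key, validate)

  // CarbonTableInputFormat.getSplits side: no ThreadLocalSessionInfo lookup,
  // just read the boolean and skip segment validation when it is false.
  def isValidateSegments(conf: Configuration): Boolean =
    conf.getBoolean(Key, true)
}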
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1446#discussion_r150402090

--- Diff: core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java ---
@@ -56,6 +56,8 @@
    */
  public static final String CARBON_INPUT_SEGMENTS = "carbon.input.segments.";
+  public static final String VALIDATE_CARBON_INPUT_SEGMENTS = "validate.carbon.input.segments.";
--- End diff --

Please add a comment describing its usage here.
Github user kunal642 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1446#discussion_r150402648

--- Diff: hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonTableInputFormat.java ---
@@ -300,14 +302,30 @@ private AbsoluteTableIdentifier getAbsoluteTableIdentifier(Configuration configu
   @Override public List<InputSplit> getSplits(JobContext job) throws IOException {
     AbsoluteTableIdentifier identifier = getAbsoluteTableIdentifier(job.getConfiguration());
     SegmentUpdateStatusManager updateStatusManager = new SegmentUpdateStatusManager(identifier);
+    CarbonTable carbonTable = getOrCreateCarbonTable(job.getConfiguration());
+    if (null == carbonTable) {
+      throw new IOException("Missing/Corrupt schema file for table.");
+    }
+    CarbonSessionInfo carbonSessionInfo = ThreadLocalSessionInfo.getCarbonSessionInfo();
+    String aggregateTableSegments = "";
+    Boolean validateAggregateTableSegments = true;
+    if (carbonSessionInfo != null) {
+      aggregateTableSegments = carbonSessionInfo.getSessionParams().getProperty(
+          CarbonCommonConstants.CARBON_INPUT_SEGMENTS + identifier.getCarbonTableIdentifier()
--- End diff --

Moved to CarbonScanRDD.
Github user kunal642 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1446#discussion_r150402650

--- Diff: core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java ---
@@ -56,6 +56,8 @@
    */
  public static final String CARBON_INPUT_SEGMENTS = "carbon.input.segments.";
+  public static final String VALIDATE_CARBON_INPUT_SEGMENTS = "validate.carbon.input.segments.";
--- End diff --

Added comment.
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1446 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1020/
Github user kunal642 commented on the issue:
https://github.com/apache/carbondata/pull/1446 retest this please
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1446 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1021/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1446 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1638/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1446 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1639/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1446 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1022/
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1446#discussion_r150406325

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/hive/CarbonAnalysisRules.scala ---
@@ -67,16 +76,54 @@ object CarbonPreInsertionCasts extends Rule[LogicalPlan] {
           case attr => attr
         }
       }
-      val newChild: LogicalPlan = if (newChildOutput == child.output) {
+      val newChild: LogicalPlan = if (newChildOutput == childPlan.output) {
         p.child
       } else {
-        Project(newChildOutput, child)
+        Project(newChildOutput, childPlan)
       }
       InsertIntoCarbonTable(relation, p.partition, newChild, p.overwrite, p.ifNotExists)
     } else {
       sys.error("Cannot insert into target table because column number are different")
     }
   }
+
+  /**
+   * Transform the logical plan with average(col1) aggregation type to sum(col1) and count(col1).
+   *
+   * @param logicalPlan
+   * @return
+   */
+  private def transformAggregatePlan(logicalPlan: LogicalPlan): LogicalPlan = {
--- End diff --

This needs to be moved to the pre-aggregate analysis rules; anyway, we can move it in the query PR. That PR already has a separate class.
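A purely illustrative Scala sketch (not CarbonData code) of why avg has to be stored as sum plus count in a pre-aggregate table: per-segment sums and counts combine losslessly across incremental loads, while per-segment averages do not. All names below are made up for the example:

final case class SegmentAgg(sum: Double, count: Long) {
  // Merging two segments' partial aggregates preserves the exact global average.
  def merge(other: SegmentAgg): SegmentAgg = SegmentAgg(sum + other.sum, count + other.count)
  def avg: Double = if (count == 0) 0.0 else sum / count
}

object AvgRewriteExample extends App {
  val seg0 = SegmentAgg(sum = 30.0, count = 3)  // first load: values 5, 10, 15
  val seg1 = SegmentAgg(sum = 100.0, count = 2) // incremental load: values 40, 60
  // Correct global average derived from the stored sum/count pairs:
  println(seg0.merge(seg1).avg) // 26.0 = (30 + 100) / (3 + 2)
  // Averaging the per-segment averages would give (10 + 50) / 2 = 30.0, which is wrong.
}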
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1446 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1640/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1446 LGTM