GitHub user dhatchayani opened a pull request:
https://github.com/apache/carbondata/pull/2591

[WIP][CARBONDATA-2808] Insert into select is crashing as both are sharing the same task context

Be sure to do all of the following checklist to help us incorporate your contribution quickly and easily:
- [ ] Any interfaces changed?
- [ ] Any backward compatibility impacted?
- [ ] Document update required?
- [ ] Testing done
      Please provide details on
      - Whether new unit test cases have been added or why no new tests are required?
      - How it is tested? Please attach test report.
      - Is it a performance related change? Please attach the performance test report.
      - Any additional information to help reviewers in testing this change.
- [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/dhatchayani/carbondata CARBONDATA-2808

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/2591.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #2591

----
commit 73254ac6de65bcf6c8ec9811bf3cf23512b2eb72
Author: dhatchayani <dhatcha.official@...>
Date: 2018-07-31T17:41:49Z

    [CARBONDATA-2808] Insert into select is crashing as both are sharing the same task context
----

---
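The failure mode named in the title: in an INSERT INTO ... SELECT, the scan (read) side and the data-load (write) side of the query run inside the same Spark task, so both register completion listeners on the one shared TaskContext. A minimal, self-contained sketch of that double-free pattern follows; the names `TaskScopedResources`, `ScanSideListener` and `LoadSideListener` are illustrative, not CarbonData code.

```scala
import org.apache.spark.TaskContext
import org.apache.spark.util.TaskCompletionListener

// Hypothetical stand-in for a per-task unsafe memory manager; it only
// illustrates a resource that is scoped to the task attempt.
object TaskScopedResources {
  def freeAll(taskAttemptId: Long): Unit = {
    // frees every unsafe block registered under this task attempt
  }
}

// Listener registered by the read (scan) side of the query.
class ScanSideListener extends TaskCompletionListener {
  override def onTaskCompletion(context: TaskContext): Unit =
    TaskScopedResources.freeAll(context.taskAttemptId()) // first free: fine
}

// Listener registered by the write (insert/load) side of the same task.
class LoadSideListener extends TaskCompletionListener {
  override def onTaskCompletion(context: TaskContext): Unit =
    TaskScopedResources.freeAll(context.taskAttemptId()) // second free over the same memory
}
```

When both listeners fire on the same shared task context, the second free runs over memory the first one already released, which is the crash this PR works around by checking which listener is already registered.
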
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2591 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7688/ --- |
Github user dhatchayani commented on the issue:
https://github.com/apache/carbondata/pull/2591 Retest this please --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2591 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6414/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2591 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7691/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2591 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6417/ --- |
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2591 retest sdv please --- |
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2591#discussion_r206733033

    --- Diff: integration/spark-common/src/main/scala/org/apache/carbondata/spark/rdd/CarbonScanRDD.scala ---
    @@ -457,13 +458,25 @@ class CarbonScanRDD[T: ClassTag](
             }
           }
    +      // TODO: rewrite this logic to call free memory in FailureListener on failures. On success,
    +      // no memory leak should be there, resources should be freed on success completion.
    +      // TODO: If CarbonRecordReader and VectorizedCarbonReader are called directly
    +      // from any other callers, then in the caller, resource clearing should be taken care
    +      val listeners = CarbonReflectionUtils.getField("onCompleteCallbacks", context)
    +        .asInstanceOf[ArrayBuffer[TaskCompletionListener]]
    +      val isAdded = listeners.exists(p => p.isInstanceOf[InsertTaskCompletionListener])
           // add task completion before calling initialize as initialize method will internally call
           // for usage of unsafe method for processing of one blocklet and if there is any exception
           // while doing that the unsafe memory occupied for that task will not get cleared
    -      context.addTaskCompletionListener { _ =>
    -        closeReader.apply()
    -        close()
    -        logStatistics(executionId, taskId, queryStartTime, model.getStatisticsRecorder, split)
    +      context.addTaskCompletionListener { new QueryTaskCompletionListener(!isAdded,
    --- End diff --

    Remove the duplicated code which is copied to `QueryTaskCompletionListener` from this class.

---
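The point of this comment is that once the cleanup lives in `QueryTaskCompletionListener`, `CarbonScanRDD` should not keep its own copy of the same close/free/log-statistics code. A minimal sketch of that shape, assuming hypothetical constructor parameters (the real class added by the PR may take different arguments):

```scala
import org.apache.spark.TaskContext
import org.apache.spark.util.TaskCompletionListener

// Sketch only: parameter names and types are assumptions, not the CarbonData API.
class QueryTaskCompletionListenerSketch(
    freeUnsafeMemory: Boolean,
    closeReader: () => Unit,
    logStatistics: () => Unit) extends TaskCompletionListener {

  override def onTaskCompletion(context: TaskContext): Unit = {
    closeReader()            // close the underlying record reader / iterator
    if (freeUnsafeMemory) {
      // free task-level unsafe memory only when no insert-side listener
      // already owns that responsibility for this task
    }
    logStatistics()          // record per-task query statistics once
  }
}

// Registration then becomes a single call, with no cleanup logic left in the RDD:
// context.addTaskCompletionListener(
//   new QueryTaskCompletionListenerSketch(freeUnsafeMemory = !isAdded, closeReader, logStatistics))
```

Keeping the cleanup in one place means the RDD and the listener cannot drift apart, which is exactly the duplication the reviewer is asking to remove.
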
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2591 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/6087/ --- |
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2591 SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/6091/ --- |
Github user kumarvishal09 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2591#discussion_r206780680

    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/model/QueryModel.java ---
    @@ -390,4 +393,12 @@ public String toString() {
             projection.getDimensions().size() + projection.getMeasures().size(),
             filterExpressionResolverTree.getFilterExpression().toString());
       }
    +
    +  public boolean isFreeUnsafeMemory() {
    --- End diff --

    Change the method name.

---
Github user kumarvishal09 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2591#discussion_r206780781

    --- Diff: integration/spark-common/src/main/scala/org/apache/carbondata/spark/rdd/CarbonScanRDD.scala ---
    @@ -457,17 +458,31 @@ class CarbonScanRDD[T: ClassTag](
             }
           }
    +      // create a statistics recorder
    +      val recorder = CarbonTimeStatisticsFactory.createExecutorRecorder(model.getQueryId())
    +      model.setStatisticsRecorder(recorder)
    +
    +      // TODO: rewrite this logic to call free memory in FailureListener on failures. On success,
    +      // no memory leak should be there, resources should be freed on success completion.
    +      val listeners = CarbonReflectionUtils.getField("onCompleteCallbacks", context)
    +        .asInstanceOf[ArrayBuffer[TaskCompletionListener]]
    +      val isAdded = listeners.exists(p => p.isInstanceOf[InsertTaskCompletionListener])
    +      if (isAdded) {
    --- End diff --

    If condition is not required.

---
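The suggestion is that the `if (isAdded)` branch adds nothing: whether an insert-side listener is already present can be folded straight into the flag passed to the query listener. A sketch of the simplified registration, reusing the names from the diff above (`CarbonReflectionUtils`, `InsertTaskCompletionListener`, `QueryTaskCompletionListener`); the trailing constructor arguments are elided and this fragment assumes the surrounding `internalCompute` context:

```scala
// Fragment continuing the diff above, not standalone code.
val listeners = CarbonReflectionUtils.getField("onCompleteCallbacks", context)
  .asInstanceOf[ArrayBuffer[TaskCompletionListener]]
val isAdded = listeners.exists(p => p.isInstanceOf[InsertTaskCompletionListener])

// No branch needed: the listener decides at completion time whether it should
// free unsafe memory, based on the flag it was constructed with.
context.addTaskCompletionListener(new QueryTaskCompletionListener(!isAdded /* , ... */))
```
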
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2591 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7696/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2591 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6422/ --- |
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2591 SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/6093/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2591 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7700/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2591 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6426/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2591 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7704/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2591 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6430/ --- |
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2591#discussion_r206846923

    --- Diff: core/src/main/java/org/apache/carbondata/core/scan/model/QueryModel.java ---
    @@ -390,4 +393,12 @@ public String toString() {
             projection.getDimensions().size() + projection.getMeasures().size(),
             filterExpressionResolverTree.getFilterExpression().toString());
       }
    +
    +  public boolean getFreeUnsafeMemory() {
    --- End diff --

    Supposed to be isFreeUnsafeMemory.

---