GitHub user QiangCai opened a pull request:
https://github.com/apache/carbondata/pull/2528

[WIP][CarbonStore] Fix task locality issue

Be sure to do all of the following checklist to help us incorporate your contribution quickly and easily:

- [ ] Any interfaces changed?
- [ ] Any backward compatibility impacted?
- [ ] Document update required?
- [ ] Testing done
      Please provide details on
      - Whether new unit test cases have been added or why no new tests are required?
      - How it is tested? Please attach test report.
      - Is it a performance related change? Please attach the performance test report.
      - Any additional information to help reviewers in testing this change.
- [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/QiangCai/carbondata locality

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/2528.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #2528

----

commit 605967630cfd2aa95c64aec0b6b9bcd020a787fe
Author: QiangCai <qiangcai@...>
Date:   2018-07-19T06:50:38Z

    fix task locality issue

----

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2528

Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7315/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2528

Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6079/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2528

Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7388/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2528

Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6149/

---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2528

SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5962/

---
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2528#discussion_r204612669

--- Diff: core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java ---
@@ -1882,6 +1882,13 @@
   public static final String CARBON_MERGE_INDEX_IN_SEGMENT_DEFAULT = "true";

+  /**
+   * config carbon scan task locality
--- End diff --

Please provide more detail, like what scheduling behavior will be used for true and false

---
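For illustration of the scheduling semantics the reviewer is asking to have documented, here is a hedged Scala sketch; the property key, names, and default are assumptions, not the constants actually added in this PR:

    // Illustrative sketch only: the key, default, and helper below are assumed,
    // not the constants introduced by this PR.
    object TaskLocalityProperty {
      val Key: String = "carbon.task.locality"
      val Default: String = "true"

      // true  -> CarbonScanRDD reports each split's block locations as preferred
      //          hosts, so Spark tries to schedule the scan task data-local
      //          (NODE_LOCAL) and may wait up to spark.locality.wait for a slot.
      // false -> no preferred locations are reported, so Spark can place the task
      //          on any free executor immediately (locality level ANY).
      def isEnabled(props: java.util.Properties): Boolean =
        props.getProperty(Key, Default).toBoolean
    }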
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2528#discussion_r204612752

--- Diff: integration/spark-common/src/main/scala/org/apache/carbondata/spark/rdd/CarbonScanRDD.scala ---
@@ -87,6 +87,8 @@ class CarbonScanRDD[T: ClassTag](
   }

   private var vectorReader = false
+  private val isTaskLocality = CarbonProperties.isTaskLocality
--- End diff --

it can be transient

---
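The flag is only consulted in getPreferredLocations, which runs on the driver, so marking it @transient keeps it out of the serialized task closure, as the reviewer suggests. A minimal, self-contained sketch of that pattern standing in for CarbonScanRDD (the config key is an assumption):

    import scala.reflect.ClassTag

    import org.apache.spark.{Partition, SparkContext}
    import org.apache.spark.rdd.RDD

    // Simplified stand-in for CarbonScanRDD, only to illustrate the @transient point.
    abstract class LocalityAwareRDD[T: ClassTag](sc: SparkContext) extends RDD[T](sc, Nil) {

      // Evaluated once on the driver; @transient excludes it from the bytes
      // shipped with each task. The key name is assumed for this sketch.
      @transient private val isTaskLocality: Boolean =
        sc.getConf.getBoolean("carbon.task.locality", defaultValue = true)

      // Subclasses supply the raw block locations of a split.
      protected def blockHosts(split: Partition): Seq[String]

      override def getPreferredLocations(split: Partition): Seq[String] =
        if (isTaskLocality) blockHosts(split).filterNot(_ == "localhost") else Nil
    }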
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2528

Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7444/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2528

Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6199/

---
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2528#discussion_r204715388

--- Diff: store/sql/pom.xml ---
@@ -35,6 +36,33 @@
         </exclusion>
       </exclusions>
     </dependency>
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-aws</artifactId>
+      <version>${hadoop.version}</version>
+      <exclusions>
+        <exclusion>
+          <groupId>com.fasterxml.jackson.core</groupId>
+          <artifactId>*</artifactId>
+        </exclusion>
+      </exclusions>
+    </dependency>
+    <dependency>
+      <groupId>com.amazonaws</groupId>
+      <artifactId>aws-java-sdk-s3</artifactId>
+      <version> 1.10.6</version>
--- End diff --

remove tab

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2528

Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7472/

---
Github user xuchuanyin commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2528#discussion_r204968370

--- Diff: integration/spark-common/src/main/scala/org/apache/carbondata/spark/rdd/CarbonScanRDD.scala ---
@@ -739,9 +741,16 @@ class CarbonScanRDD[T: ClassTag](
    * Get the preferred locations where to launch this task.
    */
   override def getPreferredLocations(split: Partition): Seq[String] = {
-    val theSplit = split.asInstanceOf[CarbonSparkPartition]
-    val firstOptionLocation = theSplit.split.value.getLocations.filter(_ != "localhost")
-    firstOptionLocation
+    if (isTaskLocality) {
+      split.asInstanceOf[CarbonSparkPartition]
+        .split
+        .value
+        .getLocations
+        .filter(_ != "localhost")
--- End diff --

What will happen if I configure TaskLocality and run the job on a local machine or in local pseudo-distributed mode? Besides, if you really want to exclude the local machine, why is only 'localhost' filtered and not the host name of the local machine as well?

---
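To make the question concrete, here is a hedged sketch of the variant the reviewer seems to have in mind (not code from this PR): if the intent really were to exclude the local machine, its host name and loopback address would need filtering in addition to the literal string "localhost".

    import java.net.InetAddress

    // Hypothetical variant for illustration only; the PR itself filters just "localhost".
    def nonLocalHosts(blockLocations: Seq[String]): Seq[String] = {
      val localNames = Set("localhost", "127.0.0.1", InetAddress.getLocalHost.getHostName)
      blockLocations.filterNot(localNames.contains)
    }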
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2528

Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6229/

---
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2528#discussion_r205013703

--- Diff: pom.xml ---
@@ -110,7 +110,7 @@
   <properties>
     <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
     <snappy.version>1.1.2.6</snappy.version>
-    <hadoop.version>2.7.2</hadoop.version>
+    <hadoop.version>2.8.3</hadoop.version>
--- End diff --

please do not modify this

---
Github user QiangCai commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2528#discussion_r205022866

--- Diff: integration/spark-common/src/main/scala/org/apache/carbondata/spark/rdd/CarbonScanRDD.scala ---
@@ -739,9 +741,16 @@ class CarbonScanRDD[T: ClassTag](
    * Get the preferred locations where to launch this task.
    */
   override def getPreferredLocations(split: Partition): Seq[String] = {
-    val theSplit = split.asInstanceOf[CarbonSparkPartition]
-    val firstOptionLocation = theSplit.split.value.getLocations.filter(_ != "localhost")
-    firstOptionLocation
+    if (isTaskLocality) {
+      split.asInstanceOf[CarbonSparkPartition]
+        .split
+        .value
+        .getLocations
+        .filter(_ != "localhost")
--- End diff --

It is not meant to exclude the local machine. Because every machine may have "127.0.0.1 localhost" configured, a "localhost" entry carries no useful locality information, so it is filtered out.

---
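A tiny sketch of the behavior being described, with names assumed: the flag only decides whether block locations are reported at all, and "localhost" is dropped because it carries no scheduling information when every node maps "127.0.0.1 localhost".

    // Minimal illustration, not the actual CarbonScanRDD code.
    def preferredHosts(blockLocations: Seq[String], taskLocality: Boolean): Seq[String] =
      if (taskLocality) blockLocations.filterNot(_ == "localhost") else Nil

    // preferredHosts(Seq("node-1", "node-2", "localhost"), taskLocality = true)  == Seq("node-1", "node-2")
    // preferredHosts(Seq("node-1", "node-2", "localhost"), taskLocality = false) == Seq()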
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2528

Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7492/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2528

Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6253/

---