GitHub user rahulforallp opened a pull request:
https://github.com/apache/carbondata/pull/2096

[CARBONDATA-2274] fix for Partition table having more than 4 column giving zero record

- [X] Any interfaces changed? NO
- [ ] Any backward compatibility impacted? No
- [ ] Document update required? No
- [ ] Testing done **UT added**
- [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA. NA

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/rahulforallp/incubator-carbondata CARBONDATA-2274

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/2096.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #2096

----
commit cee1e90f3e94c67a14209551f4d08eee9c525797
Author: rahulforallp <rahul.kumar@...>
Date:   2018-03-23T14:49:43Z

    [CARBONDATA-2274]fixed Partition table having more than 4 column giving zero record
----
---
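Background on the symptom in the PR title: Scala's default immutable Map keeps insertion order only for up to four entries (the specialised Map1 to Map4 classes); from five entries onward it switches to a hash-based map, so a partition spec held as a plain Map can come back with its columns reordered once a table has more than four partition columns. The diff below switches updatePartitions to mutable.LinkedHashMap, which preserves insertion order at any size. A standalone sketch of the difference (not code from the PR; the column names and values are illustrative only):

    import scala.collection.mutable

    // Standalone sketch (not from the PR): why a partition spec held in a plain
    // Scala Map can lose its column order once there are more than 4 columns.
    object PartitionOrderSketch {
      def main(args: Array[String]): Unit = {
        // Five hypothetical partition columns, in the order they appear in the table DDL.
        val cols = Seq(
          "utilization" -> "99", "salary" -> "15000", "workgroupcategory" -> "1",
          "empname" -> "arvind", "designation" -> "SE")

        // Immutable Map: up to 4 entries it uses the order-preserving Map1..Map4
        // classes, but with 5 or more entries it becomes a HashMap, so iteration
        // order no longer tracks insertion order.
        val plain: Map[String, String] = Map(cols: _*)
        println(plain.keys.mkString(", "))

        // mutable.LinkedHashMap preserves insertion order at any size, which is
        // what the PR switches CarbonScalaUtil.updatePartitions to use.
        val ordered = mutable.LinkedHashMap(cols: _*)
        println(ordered.keys.mkString(", "))
      }
    }

Running the sketch typically prints the five keys in a hash-dependent order for the plain Map, but always in insertion order for the LinkedHashMap.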
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2096 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3316/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2096 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4544/ ---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2096 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4041/ ---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2096 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4042/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2096 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3371/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2096 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4597/ ---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2096 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4094/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2096 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4601/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2096 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3377/ ---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2096 SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4101/ ---
Github user rahulforallp commented on the issue:
https://github.com/apache/carbondata/pull/2096 retest sdv please ---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2096 SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4105/ ---
Github user rahulforallp commented on the issue:
https://github.com/apache/carbondata/pull/2096 retest sdv please ---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2096 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4106/ ---
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2096#discussion_r177650441

--- Diff: integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/CarbonScalaUtil.scala ---
@@ -340,12 +342,12 @@ object CarbonScalaUtil {
    * @return updated partition spec
    */
   def updatePartitions(
-      partitionSpec: Map[String, String],
-      table: CarbonTable): Map[String, String] = {
+      partitionSpec: mutable.LinkedHashMap[String, String],
+      table: CarbonTable): mutable.LinkedHashMap[String, String] = {
--- End diff --

Formatting is not proper
---
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2096#discussion_r177650907

--- Diff: integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/standardpartition/StandardPartitionTableLoadingTestCase.scala ---
@@ -135,6 +135,23 @@ class StandardPartitionTableLoadingTestCase extends QueryTest with BeforeAndAfte
       sql("select empno, empname, designation, doj, workgroupcategory, workgroupcategoryname, deptno, deptname, projectcode, projectjoindate, projectenddate, attendance, utilization, salary from originTable order by empno"))
   }

+  test("data loading for partition table for five partition column") {
+    sql(
+      """
+        | CREATE TABLE partitionfive (empno int, doj Timestamp,
+        | workgroupcategoryname String, deptno int, deptname String,
+        | projectcode int, projectjoindate Timestamp, projectenddate Timestamp,attendance int)
+        | PARTITIONED BY (utilization int,salary int,workgroupcategory int, empname String,
+        | designation String)
+        | STORED BY 'org.apache.carbondata.format'
+      """.stripMargin)
+    sql(s"""LOAD DATA local inpath '$resourcesPath/data.csv' INTO TABLE partitionfive OPTIONS('DELIMITER'= ',', 'QUOTECHAR'= '"')""")
+
+    validateDataFiles("default_partitionfive", "0", 10)
+
+    checkAnswer(sql("select empno, empname, designation, doj, workgroupcategory, workgroupcategoryname, deptno, deptname, projectcode, projectjoindate, projectenddate, attendance, utilization, salary from partitionfive order by empno"),
--- End diff --

@rahulforallp If this is about partitions going to the wrong folder, then better to add a filter test case to reproduce and fix it.
---
Github user rahulforallp commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2096#discussion_r177705701

--- Diff: integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/standardpartition/StandardPartitionTableLoadingTestCase.scala ---
@@ -135,6 +135,23 @@ class StandardPartitionTableLoadingTestCase extends QueryTest with BeforeAndAfte
       sql("select empno, empname, designation, doj, workgroupcategory, workgroupcategoryname, deptno, deptname, projectcode, projectjoindate, projectenddate, attendance, utilization, salary from originTable order by empno"))
   }

+  test("data loading for partition table for five partition column") {
+    sql(
+      """
+        | CREATE TABLE partitionfive (empno int, doj Timestamp,
+        | workgroupcategoryname String, deptno int, deptname String,
+        | projectcode int, projectjoindate Timestamp, projectenddate Timestamp,attendance int)
+        | PARTITIONED BY (utilization int,salary int,workgroupcategory int, empname String,
+        | designation String)
+        | STORED BY 'org.apache.carbondata.format'
+      """.stripMargin)
+    sql(s"""LOAD DATA local inpath '$resourcesPath/data.csv' INTO TABLE partitionfive OPTIONS('DELIMITER'= ',', 'QUOTECHAR'= '"')""")
+
+    validateDataFiles("default_partitionfive", "0", 10)
+
+    checkAnswer(sql("select empno, empname, designation, doj, workgroupcategory, workgroupcategoryname, deptno, deptname, projectcode, projectjoindate, projectenddate, attendance, utilization, salary from partitionfive order by empno"),
--- End diff --

@ravipesala you mean I should also add a filter query to the test case? I was able to reproduce it without a filter as well.
---
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2096#discussion_r177714407

--- Diff: integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/standardpartition/StandardPartitionTableLoadingTestCase.scala ---
@@ -135,6 +135,23 @@ class StandardPartitionTableLoadingTestCase extends QueryTest with BeforeAndAfte
       sql("select empno, empname, designation, doj, workgroupcategory, workgroupcategoryname, deptno, deptname, projectcode, projectjoindate, projectenddate, attendance, utilization, salary from originTable order by empno"))
   }

+  test("data loading for partition table for five partition column") {
+    sql(
+      """
+        | CREATE TABLE partitionfive (empno int, doj Timestamp,
+        | workgroupcategoryname String, deptno int, deptname String,
+        | projectcode int, projectjoindate Timestamp, projectenddate Timestamp,attendance int)
+        | PARTITIONED BY (utilization int,salary int,workgroupcategory int, empname String,
+        | designation String)
+        | STORED BY 'org.apache.carbondata.format'
+      """.stripMargin)
+    sql(s"""LOAD DATA local inpath '$resourcesPath/data.csv' INTO TABLE partitionfive OPTIONS('DELIMITER'= ',', 'QUOTECHAR'= '"')""")
+
+    validateDataFiles("default_partitionfive", "0", 10)
+
+    checkAnswer(sql("select empno, empname, designation, doj, workgroupcategory, workgroupcategoryname, deptno, deptname, projectcode, projectjoindate, projectenddate, attendance, utilization, salary from partitionfive order by empno"),
--- End diff --

ok
---
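A filter test of the kind the reviewer asks for might look roughly like the sketch below. This is hypothetical, not code from the PR: it assumes the partitionfive and originTable tables plus the test/sql/checkAnswer helpers from the suite shown above, and it assumes data.csv contains an empname value 'arvind' (illustrative only).

      // Hypothetical follow-up test: filter on a partition column of the
      // five-partition-column table to verify rows land in the right folders.
      test("filter on partition column for five partition column table") {
        // Filter on empname, which is one of the five partition columns ...
        val filtered = sql(
          "select empno, empname, designation from partitionfive where empname = 'arvind'")
        // ... and compare against the same filter on the non-partitioned source table.
        checkAnswer(
          filtered,
          sql("select empno, empname, designation from originTable where empname = 'arvind'"))
      }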
Github user rahulforallp commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2096#discussion_r177720898

--- Diff: integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/CarbonScalaUtil.scala ---
@@ -340,12 +342,12 @@ object CarbonScalaUtil {
    * @return updated partition spec
    */
   def updatePartitions(
-      partitionSpec: Map[String, String],
-      table: CarbonTable): Map[String, String] = {
+      partitionSpec: mutable.LinkedHashMap[String, String],
+      table: CarbonTable): mutable.LinkedHashMap[String, String] = {
--- End diff --

formatting done
---
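For reference, the kind of layout the review comment was presumably asking for is sketched below, assuming the common Scala convention of indenting continuation parameters four spaces relative to the method keyword. The CarbonTable stub and the empty body are placeholders so the sketch compiles on its own; they are not the real class or implementation.

    import scala.collection.mutable

    // Stand-in for org.apache.carbondata.core.metadata.schema.table.CarbonTable,
    // present only so this signature sketch compiles in isolation.
    class CarbonTable

    object FormattingSketch {
      /**
       * Sketch of the updatePartitions signature with continuation parameters
       * indented four spaces, which is presumably what the reviewer meant.
       */
      def updatePartitions(
          partitionSpec: mutable.LinkedHashMap[String, String],
          table: CarbonTable): mutable.LinkedHashMap[String, String] = {
        // Body elided in this sketch; the real method rewrites each partition
        // value according to the table's column data types.
        partitionSpec
      }
    }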