GitHub user ravipesala opened a pull request:
https://github.com/apache/carbondata/pull/1681 [CARBONDATA-1908][PARTITION] Support UPDATE/DELETE on partition tables.

This PR depends on https://github.com/apache/carbondata/pull/1672, https://github.com/apache/carbondata/pull/1674 and https://github.com/apache/carbondata/pull/1677.

This PR supports update/delete for partition tables. It writes the updated DataFrame through CarbonFileFormat so that partitioning is honored, and handles the update-status update in the committer class.

Be sure to do all of the following checklist to help us incorporate your contribution quickly and easily:
- [ ] Any interfaces changed?
- [ ] Any backward compatibility impacted?
- [ ] Document update required?
- [ ] Testing done. Please provide details on:
  - Whether new unit test cases have been added, or why no new tests are required?
  - How it is tested? Please attach the test report.
  - Is it a performance related change? Please attach the performance test report.
  - Any additional information to help reviewers in testing this change.
- [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.
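For context, a minimal sketch of the statements this PR enables on a partitioned table. The table name, columns, and data are made up; CarbonData's `UPDATE ... SET (col) = (expr)` DML syntax and `STORED BY 'carbondata'` DDL of that era are assumed:

```sql
-- Hypothetical partitioned table (names are illustrative only).
CREATE TABLE sales (id INT, amount DOUBLE)
PARTITIONED BY (country STRING)
STORED BY 'carbondata';

INSERT INTO sales SELECT 1, 100.0, 'US';

-- Update rows in a partitioned table; per this PR, the updated rows are
-- written as a new load through CarbonFileFormat.
UPDATE sales SET (amount) = (amount * 1.1) WHERE country = 'US';

-- Delete rows in a partitioned table.
DELETE FROM sales WHERE country = 'US';
```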
You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/ravipesala/incubator-carbondata partition-IUD

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/1681.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #1681

----

commit 8c0673f4a8b526f665087366bf3ac4be9a19e9c0
Author: ravipesala <[hidden email]>
Date: 2017-12-18T17:07:59Z

    Added support to read partitions

commit 33f61d756ec49e19ff01e646ccc70af5824a3e8e
Author: ravipesala <[hidden email]>
Date: 2017-12-16T17:08:00Z

    Added drop partition feature

commit 468c71995bd8c60e9b4819d5deaf527d5d15e3fe
Author: ravipesala <[hidden email]>
Date: 2017-12-19T07:49:15Z

    Support insert overwrite partition

commit 0fb1f424230ea325d802eadaf63a13cfa2a16bdb
Author: ravipesala <[hidden email]>
Date: 2017-12-19T12:13:00Z

    Supported IUD for Partition Tables

---- --- |
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1681 SDV Build Failed, please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2420/ --- |
In reply to this post by qiuchenjian-2
Github user gvramana commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1681#discussion_r157828690

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/management/CarbonLoadDataCommand.scala ---

@@ -488,7 +490,24 @@ case class CarbonLoadDataCommand(
        }
        InternalRow.fromSeq(data)
      }
-     LogicalRDD(attributes, rdd)(sparkSession)
+     if (updateModel.isDefined) {
+       sparkSession.sparkContext.setLocalProperty(EXECUTION_ID_KEY, null)
+       // In case of update, we don't need the segment id column in case of partitioning
+       val dropAttributes = attributes.dropRight(1)
+       val finalOutput = relation.output.map { attr =>
+         dropAttributes.find { d =>
+           val index = d.name.lastIndexOf("-updatedColumn")

--- End diff --

Find a better way to get the columns in order from the update flow instead of doing string manipulation. --- |
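The suffix matching under discussion can be sketched as follows. This is a hedged illustration, not CarbonData's actual code: the class name, helper methods, and reordering loop are hypothetical; only the "-updatedColumn" suffix comes from the quoted diff.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of matching update-flow attribute names (which carry a
// "-updatedColumn" suffix) back to the table's declared columns.
public class UpdatedColumnMatching {
    static final String UPDATED_SUFFIX = "-updatedColumn";

    // Recover the original column name from an update-flow attribute name.
    static String originalName(String attrName) {
        int index = attrName.lastIndexOf(UPDATED_SUFFIX);
        return index >= 0 ? attrName.substring(0, index) : attrName;
    }

    // Reorder update-flow attributes to follow the table's column order,
    // which is what the string manipulation in the diff achieves.
    static List<String> reorder(List<String> tableColumns, List<String> updateAttrs) {
        List<String> result = new ArrayList<>();
        for (String col : tableColumns) {
            for (String attr : updateAttrs) {
                if (originalName(attr).equalsIgnoreCase(col)) {
                    result.add(attr);
                    break;
                }
            }
        }
        return result;
    }
}
```

The reviewer's point is that carrying the column order through the update flow explicitly would avoid this name-based matching altogether.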
Github user gvramana commented on the issue:
https://github.com/apache/carbondata/pull/1681 Now the updated records go to a new segment for a partitioned table, so check the impact on all IUD flows. --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1681 Build Success with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/920/ --- |
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1681 SDV Build Failed, please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2436/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1681 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2145/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1681 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2159/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1681 Build Success with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/930/ --- |
Github user gvramana commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1681#discussion_r158017787

--- Diff: hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonTableOutputFormat.java ---

@@ -87,6 +87,7 @@
       "mapreduce.carbontable.dict.server.host";
   public static final String DICTIONARY_SERVER_PORT =
       "mapreduce.carbontable.dict.server.port";
+  public static final String UPADTE_TIMESTAMP = "mapreduce.carbontable.update.timestamp";

--- End diff --

Add a comment mentioning the use case of this. --- |
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1681 SDV Build Failed, please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2458/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1681 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2186/ --- |
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1681#discussion_r158108990

--- Diff: hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonTableOutputFormat.java ---

@@ -87,6 +87,7 @@
       "mapreduce.carbontable.dict.server.host";
   public static final String DICTIONARY_SERVER_PORT =
       "mapreduce.carbontable.dict.server.port";
+  public static final String UPADTE_TIMESTAMP = "mapreduce.carbontable.update.timestamp";

--- End diff --

ok --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1681 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2193/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1681 Build Failed with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/970/ --- |
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1681 SDV Build Failed, please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2465/ --- |
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1681#discussion_r158197465

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/management/CarbonLoadDataCommand.scala ---

@@ -488,7 +490,24 @@ case class CarbonLoadDataCommand(
        }
        InternalRow.fromSeq(data)
      }
-     LogicalRDD(attributes, rdd)(sparkSession)
+     if (updateModel.isDefined) {
+       sparkSession.sparkContext.setLocalProperty(EXECUTION_ID_KEY, null)
+       // In case of update, we don't need the segment id column in case of partitioning
+       val dropAttributes = attributes.dropRight(1)
+       val finalOutput = relation.output.map { attr =>
+         dropAttributes.find { d =>
+           val index = d.name.lastIndexOf("-updatedColumn")

--- End diff --

It requires changing the order in UpdateCommand, which would impact the actual IUD flow, so I think it is better to handle it here. --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1681 Build Failed with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/979/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1681 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2202/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1681 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2204/ --- |