GitHub user xuchuanyin opened a pull request:
https://github.com/apache/carbondata/pull/2463

[CARBONDATA-2708][BloomDataMap] clear index file in case of data load failure

Be sure to do all of the following checklist to help us incorporate your contribution quickly and easily:

- [ ] Any interfaces changed?
- [ ] Any backward compatibility impacted?
- [ ] Document update required?
- [ ] Testing done
      Please provide details on
      - Whether new unit test cases have been added or why no new tests are required?
      - How it is tested? Please attach test report.
      - Is it a performance related change? Please attach the performance test report.
      - Any additional information to help reviewers in testing this change.
- [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/xuchuanyin/carbondata CARBONDATA_2708_clear_index_file_load_fail

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/2463.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #2463

----

commit 6ae51a9754cbfdbbaace06cd282af95ca24bf006
Author: xuchuanyin <xuchuanyin@...>
Date:   2018-07-09T03:36:13Z

    clear bloom index files after segment is deleted

    clear bloom index files after corresponding segment is deleted and cleaned

commit f848ba282d57e48300ee3a1e1efa1283298c03ee
Author: xuchuanyin <xuchuanyin@...>
Date:   2018-07-09T07:56:51Z

    clear index file in case of data load failure

----
Github user xuchuanyin commented on the issue:
https://github.com/apache/carbondata/pull/2463

This PR depends on PR #2461

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2463

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6943/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2463

Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5726/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2463

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6954/

---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2463

SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5716/

---
Github user xuchuanyin commented on the issue:
https://github.com/apache/carbondata/pull/2463

retest this please

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2463

Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5736/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2463

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/6966/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2463

Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/5746/

---
Github user xuchuanyin commented on the issue:
https://github.com/apache/carbondata/pull/2463

retest sdv please

---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2463

SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5750/

---
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2463#discussion_r201586698

--- Diff: integration/spark2/src/main/scala/org/apache/carbondata/spark/rdd/CarbonDataRDDFactory.scala ---
@@ -588,6 +591,21 @@ object CarbonDataRDDFactory {
     }
   }

+  /**
+   * clear datamap files for segment
+   */
+  private def clearDataMapFiles(carbonTable: CarbonTable, segmentId: String): Unit = {
+    try {
+      DataMapStoreManager.getInstance().getAllDataMap(carbonTable).asScala
+        .filter(_.getDataMapSchema.isIndexDataMap).foreach { p =>
+        p.deleteDatamapData(List(new Segment(segmentId)).asJava)
--- End diff --

move `List(new Segment(segmentId)).asJava` before the loop and move `foreach` to next line, to make it better formatted

---
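For reference, a minimal Scala sketch of the shape the reviewer is suggesting (an illustration only, not the merged change: the catch block and the `LOGGER` call are assumed, since the diff above does not show how the `try` is closed, and the import paths are an assumption based on the CarbonData core module):

```scala
import scala.collection.JavaConverters._

// Assumed import locations; in CarbonDataRDDFactory these symbols are already in scope.
import org.apache.carbondata.core.datamap.{DataMapStoreManager, Segment}
import org.apache.carbondata.core.metadata.schema.table.CarbonTable

/**
 * clear datamap files for segment
 */
private def clearDataMapFiles(carbonTable: CarbonTable, segmentId: String): Unit = {
  try {
    // Build the segment list once, before the loop, as suggested in the review.
    val segments = List(new Segment(segmentId)).asJava
    DataMapStoreManager.getInstance().getAllDataMap(carbonTable).asScala
      .filter(_.getDataMapSchema.isIndexDataMap)
      .foreach { dataMap =>
        dataMap.deleteDatamapData(segments)
      }
  } catch {
    // Assumed handling: the original catch block is not visible in the diff above.
    case ex: Exception =>
      LOGGER.error(ex, s"Failed to clear datamap files for segment $segmentId")
  }
}
```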
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2463#discussion_r201586829

--- Diff: datamap/lucene/src/main/java/org/apache/carbondata/datamap/lucene/LuceneDataMapFactoryBase.java ---
@@ -270,6 +260,22 @@ public void clear() {

   }

+  @Override public void deleteDatamapData(Segment segment) {
--- End diff --

move `@Override` to the previous line

---
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2463#discussion_r201590879

--- Diff: datamap/lucene/src/main/java/org/apache/carbondata/datamap/lucene/LuceneDataMapFactoryBase.java ---
@@ -270,6 +260,22 @@ public void clear() {

   }

+  @Override public void deleteDatamapData(Segment segment) {
+    try {
+      String segmentId = segment.getSegmentNo();
+      String datamapPath = CarbonTablePath
+          .getDataMapStorePath(tableIdentifier.getTablePath(), segmentId, dataMapName);
+      if (FileFactory.isFileExist(datamapPath)) {
+        CarbonFile file = FileFactory.getCarbonFile(datamapPath,
+            FileFactory.getFileType(datamapPath));
+        CarbonUtil.deleteFoldersAndFilesSilent(file);
+      }
+    } catch (IOException | InterruptedException ex) {
+      throw new RuntimeException(
--- End diff --

It seems throwing IOException is better; then catch it and log it in the outermost caller

---
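To illustrate the direction of this suggestion (a sketch only: `indexDataMaps`, `segments`, and `LOGGER` are hypothetical placeholders for whatever the outermost caller already has in scope, and the signature change itself is assumed, not part of this PR's diff):

```scala
import java.io.IOException

// Assumes deleteDatamapData were changed to declare IOException instead of
// wrapping failures in RuntimeException inside the datamap factory.
try {
  indexDataMaps.foreach(_.deleteDatamapData(segments))
} catch {
  case ex: IOException =>
    // Log and continue: a cleanup failure should not mask the original
    // data-load failure being handled here.
    LOGGER.error(ex, "Failed to delete index datamap files for the failed segment")
}
```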
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2463#discussion_r201590946

--- Diff: datamap/lucene/src/main/java/org/apache/carbondata/datamap/lucene/LuceneDataMapFactoryBase.java ---
@@ -270,6 +260,22 @@ public void clear() {

   }

+  @Override public void deleteDatamapData(Segment segment) {
+    try {
+      String segmentId = segment.getSegmentNo();
+      String datamapPath = CarbonTablePath
+          .getDataMapStorePath(tableIdentifier.getTablePath(), segmentId, dataMapName);
+      if (FileFactory.isFileExist(datamapPath)) {
+        CarbonFile file = FileFactory.getCarbonFile(datamapPath,
+            FileFactory.getFileType(datamapPath));
+        CarbonUtil.deleteFoldersAndFilesSilent(file);
+      }
+    } catch (IOException | InterruptedException ex) {
+      throw new RuntimeException(
--- End diff --

please do the same for line 279

---
Github user xuchuanyin commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2463#discussion_r201597054

--- Diff: integration/spark2/src/main/scala/org/apache/carbondata/spark/rdd/CarbonDataRDDFactory.scala ---
@@ -588,6 +591,21 @@ object CarbonDataRDDFactory {
     }
   }

+  /**
+   * clear datamap files for segment
+   */
+  private def clearDataMapFiles(carbonTable: CarbonTable, segmentId: String): Unit = {
+    try {
+      DataMapStoreManager.getInstance().getAllDataMap(carbonTable).asScala
+        .filter(_.getDataMapSchema.isIndexDataMap).foreach { p =>
+        p.deleteDatamapData(List(new Segment(segmentId)).asJava)
--- End diff --

OK

---
Github user xuchuanyin commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2463#discussion_r201597172

--- Diff: datamap/lucene/src/main/java/org/apache/carbondata/datamap/lucene/LuceneDataMapFactoryBase.java ---
@@ -270,6 +260,22 @@ public void clear() {

   }

+  @Override public void deleteDatamapData(Segment segment) {
--- End diff --

OK

---
Github user xuchuanyin commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2463#discussion_r201601598

--- Diff: datamap/lucene/src/main/java/org/apache/carbondata/datamap/lucene/LuceneDataMapFactoryBase.java ---
@@ -270,6 +260,22 @@ public void clear() {

   }

+  @Override public void deleteDatamapData(Segment segment) {
+    try {
+      String segmentId = segment.getSegmentNo();
+      String datamapPath = CarbonTablePath
+          .getDataMapStorePath(tableIdentifier.getTablePath(), segmentId, dataMapName);
+      if (FileFactory.isFileExist(datamapPath)) {
+        CarbonFile file = FileFactory.getCarbonFile(datamapPath,
+            FileFactory.getFileType(datamapPath));
+        CarbonUtil.deleteFoldersAndFilesSilent(file);
+      }
+    } catch (IOException | InterruptedException ex) {
+      throw new RuntimeException(
--- End diff --

OK

---
Github user xuchuanyin commented on the issue:
https://github.com/apache/carbondata/pull/2463

review comments are resolved

---