Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2207

Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4352/

Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2207

SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4617/

Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2207

SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4618/

Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2207

Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5527/

Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2207

SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4629/

Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2207

Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4363/

Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2207

Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4404/

Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2207

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5566/

Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2207

SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4662/

Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2207

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5574/

Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2207

Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4413/

Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2207

SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4668/

Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2207#discussion_r185525675

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/mutation/DeleteExecution.scala ---
@@ -127,12 +126,16 @@ object DeleteExecution {
         var result = List[(SegmentStatus, (SegmentUpdateDetails, ExecutionErrors))]()
         while (records.hasNext) {
           val ((key), (rowCountDetailsVO, groupedRows)) = records.next
+          val segmentId = key.substring(0, key.indexOf(CarbonCommonConstants.FILE_SEPARATOR))
--- End diff --

Not possible as in above method it adds segmentid.

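A minimal illustrative sketch of the key handling discussed above, assuming the per-block key is built upstream as "<segmentId>" + FILE_SEPARATOR + "<blockName>"; the separator value ("/") and the sample key below are assumptions for illustration, not taken from the PR:

// Sketch only, not CarbonData source: recover the segment id from a key of the
// assumed form "<segmentId>/<blockName>".
object SegmentIdFromKey {
  private val FileSeparator = "/" // assumed value of CarbonCommonConstants.FILE_SEPARATOR

  def segmentIdOf(key: String): String =
    key.substring(0, key.indexOf(FileSeparator))

  def main(args: Array[String]): Unit = {
    // Hypothetical key; per the review reply, the upstream method always prefixes
    // the segment id, so a missing-separator check is not needed here.
    val key = "2/part-0-0_batchno0-0-0.carbondata"
    println(segmentIdOf(key)) // prints 2
  }
}
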
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2207#discussion_r185525720

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/mutation/DeleteExecution.scala ---
@@ -114,6 +109,10 @@ object DeleteExecution {
       CarbonUpdateUtil
         .createBlockDetailsMap(blockMappingVO, segmentUpdateStatusMngr)
+      val metadataDetails = SegmentStatusManager.readTableStatusFile(
+        CarbonTablePath.getTableStatusFilePath(carbonTable.getTablePath))
+
+
--- End diff --

ok

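The added lines above appear to read the table status file once, up front, rather than inside the per-record loop. A rough standalone sketch of that idea, using plain java.nio instead of SegmentStatusManager/CarbonTablePath; the "<tablePath>/Metadata/tablestatus" layout and the example path are assumptions for illustration:

// Sketch only: read the table status file a single time up front and reuse it.
import java.nio.file.{Files, Paths}

object ReadTableStatusOnce {
  def readTableStatus(tablePath: String): String = {
    val statusPath = Paths.get(tablePath, "Metadata", "tablestatus") // assumed layout
    if (Files.exists(statusPath)) new String(Files.readAllBytes(statusPath), "UTF-8") else "[]"
  }

  def main(args: Array[String]): Unit = {
    val tablePath = "/tmp/carbon/default/t1" // hypothetical table path
    val metadataDetails = readTableStatus(tablePath) // read once, then reuse per segment
    println(metadataDetails.take(120))
  }
}
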
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2207#discussion_r185525770

--- Diff: core/src/main/java/org/apache/carbondata/core/mutate/CarbonUpdateUtil.java ---
@@ -80,18 +80,18 @@ public static String getSegmentWithBlockFromTID(String Tid) {
    * Returns block path from tuple id
    *
    * @param tid
-   * @param factPath
+   * @param tablePath
--- End diff --

ok

Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2207#discussion_r185525800

--- Diff: core/src/main/java/org/apache/carbondata/core/metadata/SegmentFileStore.java ---
@@ -162,21 +167,44 @@ public static String writeSegmentFile(String tablePath, String segmentId, String
       for (CarbonFile file : indexFiles) {
         folderDetails.getFiles().add(file.getName());
       }
-      String segmentRelativePath = segmentPath.substring(tablePath.length(), segmentPath.length());
+      String segmentRelativePath = "/";
+      if (!supportFlatFolder) {
+        segmentRelativePath = segmentPath.substring(tablePath.length(), segmentPath.length());
+      }
       segmentFile.addPath(segmentRelativePath, folderDetails);
-      String segmentFileFolder = CarbonTablePath.getSegmentFilesLocation(tablePath);
+      String segmentFileFolder = CarbonTablePath.getSegmentFilesLocation(tablePath);
       CarbonFile carbonFile = FileFactory.getCarbonFile(segmentFileFolder);
       if (!carbonFile.exists()) {
         carbonFile.mkdirs(segmentFileFolder, FileFactory.getFileType(segmentFileFolder));
       }
       String segmentFileName = genSegmentFileName(segmentId, UUID) + CarbonTablePath.SEGMENT_EXT;
       // write segment info to new file.
       writeSegmentFile(segmentFile, segmentFileFolder + File.separator + segmentFileName);
+
+      // Move all files to table path from segment folder.
+      if (supportFlatFolder) {
+        moveFromTempFolder(segmentPath, tablePath);
+      }
+
      return segmentFileName;
    }
    return null;
  }

+  /**
+   * Move the loaded data from temp folder to respective partition folder.
+   * @param tablePath
--- End diff --

ok

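To make the flat-folder branch above easier to follow, a small standalone sketch, using plain java.nio rather than CarbonData's FileFactory/CarbonFile API: with supportFlatFolder the relative path recorded in the segment file collapses to "/", and the files written to the temporary segment folder are moved up to the table path. The body of moveFromTempFolder here is an assumption for illustration, not the PR's implementation:

// Illustrative sketch, not SegmentFileStore itself.
import java.nio.file.{Files, Paths, StandardCopyOption}
import scala.collection.JavaConverters._

object FlatFolderSketch {
  // With flat-folder support the segment's recorded relative path is just "/".
  def relativeSegmentPath(tablePath: String, segmentPath: String, supportFlatFolder: Boolean): String =
    if (supportFlatFolder) "/" else segmentPath.substring(tablePath.length)

  // Assumed behaviour: move every file in the temp segment folder directly under the table path.
  def moveFromTempFolder(segmentPath: String, tablePath: String): Unit = {
    val segDir = Paths.get(segmentPath)
    if (Files.isDirectory(segDir)) {
      val children = Files.list(segDir)
      try {
        children.iterator().asScala.foreach { f =>
          Files.move(f, Paths.get(tablePath).resolve(f.getFileName), StandardCopyOption.REPLACE_EXISTING)
        }
      } finally children.close()
    }
  }
}

As far as the diff shows, recording the relative path per layout keeps the segment file as the single record of where a segment's files live, whichever layout is in use.
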
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2207#discussion_r185525923

--- Diff: integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/datacompaction/CompactionSupportGlobalSortBigFileTest.scala ---
@@ -72,8 +72,8 @@ class CompactionSupportGlobalSortBigFileTest extends QueryTest with BeforeAndAft
   }

   override def afterEach {
-    sql("DROP TABLE IF EXISTS compaction_globalsort")
-    sql("DROP TABLE IF EXISTS carbon_localsort")
+//    sql("DROP TABLE IF EXISTS compaction_globalsort")
--- End diff --

ok

Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2207

Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4431/

Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2207

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5592/

Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2207

SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4686/