GitHub user jackylk opened a pull request:
https://github.com/apache/carbondata/pull/1467

[CARBONDATA-1669] Clean up code in CarbonDataRDDFactory

Inside CarbonDataRDDFactory.loadCarbonData there are many functions defined inside the function, which makes the loading logic very hard to read. This PR improves its readability.

 - [X] Any interfaces changed? No
 - [X] Any backward compatibility impacted? No
 - [X] Document update required? No
 - [X] Testing done? No new testcase is required
 - [X] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA. NA

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/jackylk/incubator-carbondata refactor_loaddata

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/1467.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #1467

----
commit a35b8a2a81c54df100738879a1ec66004efc8ca7
Author: Jacky Li <[hidden email]>
Date:   2017-11-05T12:35:09Z

    remove isTableSplitPartition

commit df65121987660b455201ab99abbe063c93e4eb8c
Author: Jacky Li <[hidden email]>
Date:   2017-11-05T16:41:29Z

    refactor load data

----

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1467 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/811/ ---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1467 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1444/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1467 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/818/ ---
Github user jackylk commented on the issue:
https://github.com/apache/carbondata/pull/1467 retest this please ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1467 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/821/ ---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1467 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1450/ ---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1467 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1451/ ---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1467 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1454/ ---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1467 SDV Build Failed, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1472/ ---
Github user jackylk commented on the issue:
https://github.com/apache/carbondata/pull/1467 retest this please ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1467 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/840/ ---
Github user QiangCai commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1467#discussion_r149269115

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/management/AlterTableCompactionCommand.scala ---
@@ -87,4 +94,113 @@ case class AlterTableCompactionCommand(
     }
     Seq.empty
   }
+
+  private def alterTableForCompaction(sqlContext: SQLContext,
+      alterTableModel: AlterTableModel,
+      carbonLoadModel: CarbonLoadModel,
+      storeLocation: String): Unit = {
+    var compactionSize: Long = 0
+    var compactionType: CompactionType = CompactionType.MINOR_COMPACTION
+    if (alterTableModel.compactionType.equalsIgnoreCase("major")) {
+      compactionSize = CarbonDataMergerUtil.getCompactionSize(CompactionType.MAJOR_COMPACTION)
+      compactionType = CompactionType.MAJOR_COMPACTION
+    } else if (alterTableModel.compactionType
+      .equalsIgnoreCase(CompactionType.IUD_UPDDEL_DELTA_COMPACTION.toString)) {
+      compactionType = CompactionType.IUD_UPDDEL_DELTA_COMPACTION
+      if (alterTableModel.segmentUpdateStatusManager.get != None) {
+        carbonLoadModel
+          .setSegmentUpdateStatusManager(alterTableModel.segmentUpdateStatusManager.get)
+
+        carbonLoadModel

--- End diff --

I prefer to:

    carbonLoadModel.setLoadMetadataDetails(
      alterTableModel.segmentUpdateStatusManager.get.getLoadMetadataDetails.toList.asJava)

---
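The chained call suggested above relies on `asJava` from `scala.collection.JavaConverters`, which wraps a Scala collection as a `java.util.List` for a Java-facing setter. A minimal sketch of that convert-and-set pattern, using hypothetical stand-in classes rather than the real CarbonData types:

```scala
import scala.collection.JavaConverters._

// Hypothetical stand-ins for CarbonData's model classes, only to show the call shape.
case class LoadMetadataDetails(loadName: String)

class CarbonLoadModelSketch {
  private var details: java.util.List[LoadMetadataDetails] = _
  def setLoadMetadataDetails(d: java.util.List[LoadMetadataDetails]): Unit = { details = d }
  def getLoadMetadataDetails: java.util.List[LoadMetadataDetails] = details
}

object ChainedSetterExample extends App {
  val model = new CarbonLoadModelSketch
  val scalaDetails = List(LoadMetadataDetails("0"), LoadMetadataDetails("1"))
  // Preferred style from the review: one chained statement,
  // converting the Scala List to a java.util.List at the call site.
  model.setLoadMetadataDetails(scalaDetails.asJava)
  println(model.getLoadMetadataDetails.size()) // prints 2
}
```

The single chained statement avoids the split-receiver style (`carbonLoadModel` on one line, `.setLoadMetadataDetails(...)` on the next) that the diff above shows.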
Github user QiangCai commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1467#discussion_r149267225

--- Diff: integration/spark-common/src/main/scala/org/apache/carbondata/api/CarbonStore.scala ---
@@ -79,16 +82,45 @@ object CarbonStore {
       dbName: String,
       tableName: String,
       storePath: String,
-      carbonTable: CarbonTable, forceTableClean: Boolean): Unit = {
+      carbonTable: CarbonTable, forceTableClean: Boolean

--- End diff --

One parameter per row.

---
Github user QiangCai commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1467#discussion_r149267694

--- Diff: integration/spark-common/src/main/scala/org/apache/carbondata/api/CarbonStore.scala ---
@@ -79,16 +82,45 @@ object CarbonStore {
       dbName: String,
       tableName: String,
       storePath: String,
-      carbonTable: CarbonTable, forceTableClean: Boolean): Unit = {
+      carbonTable: CarbonTable, forceTableClean: Boolean
+  ): Unit = {

--- End diff --

Move `): Unit = {` onto the last parameter's row.

---
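Taken together, the two style comments above describe the usual Scala convention for long parameter lists: one parameter per line, with `): Unit = {` appended to the last parameter's line rather than placed on a line of its own. A sketch of the suggested shape (types and body are elided, so this is illustrative only, not the actual CarbonStore code):

```scala
// Discouraged: two parameters share a row, and the closing paren floats alone.
def cleanFilesBefore(
    dbName: String,
    tableName: String,
    storePath: String,
    carbonTable: CarbonTable, forceTableClean: Boolean
): Unit = {
  // ...
}

// Suggested: one parameter per row, closing paren on the last parameter's row.
def cleanFilesAfter(
    dbName: String,
    tableName: String,
    storePath: String,
    carbonTable: CarbonTable,
    forceTableClean: Boolean): Unit = {
  // ...
}
```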
Github user QiangCai commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1467#discussion_r149267615

--- Diff: integration/spark-common/src/main/scala/org/apache/carbondata/api/CarbonStore.scala ---
@@ -79,16 +82,45 @@ object CarbonStore {
       dbName: String,
       tableName: String,
       storePath: String,
-      carbonTable: CarbonTable, forceTableClean: Boolean): Unit = {
+      carbonTable: CarbonTable, forceTableClean: Boolean
+  ): Unit = {
     LOGGER.audit(s"The clean files request has been received for $dbName.$tableName")
+    var carbonCleanFilesLock: ICarbonLock = null
     try {
-      DataManagementFunc.cleanFiles(dbName, tableName, storePath, carbonTable, forceTableClean)
-      LOGGER.audit(s"Clean files operation is success for $dbName.$tableName.")
+      val identifier = new CarbonTableIdentifier(dbName, tableName, "")
+      carbonCleanFilesLock =
+        CarbonLockFactory.getCarbonLockObj(identifier, LockUsage.CLEAN_FILES_LOCK)
     } catch {
       case ex: Exception =>
         sys.error(ex.getMessage)
+        return
     }
-    Seq.empty
+    try {

--- End diff --

Why separate it into two try-catch blocks?

---
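For context on the question above: the refactored code acquires the clean-files lock in a first try-catch (so a failure to obtain the lock aborts early, before anything has been done) and then runs the clean-up in a second try whose finally releases the lock. A simplified sketch of that two-block pattern, with hypothetical lock and operation names (this is not the actual CarbonStore code):

```scala
// Hypothetical sketch of the acquire-then-operate pattern under discussion.
trait CleanFilesLock {
  def tryAcquire(): Unit // throws if the lock cannot be obtained
  def unlock(): Unit
}

def cleanFilesSketch(lock: CleanFilesLock, doClean: () => Unit): Unit = {
  // Block 1: lock acquisition only. A failure here means nothing has been
  // modified yet, so we can report the error and return without cleanup.
  try {
    lock.tryAcquire()
  } catch {
    case ex: Exception =>
      println(s"clean files failed to acquire lock: ${ex.getMessage}")
      return
  }
  // Block 2: the operation itself. The lock is now held, so release it in
  // finally regardless of whether doClean succeeds or throws.
  try {
    doClean()
  } finally {
    lock.unlock()
  }
}
```

Splitting the blocks keeps the `finally { unlock }` from running when the lock was never acquired; a single try-catch-finally would have to track that state explicitly.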
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1467 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1473/ ---
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1467#discussion_r149279807

--- Diff: integration/spark-common/src/main/scala/org/apache/carbondata/api/CarbonStore.scala ---
@@ -79,16 +82,45 @@ object CarbonStore {
       dbName: String,
       tableName: String,
       storePath: String,
-      carbonTable: CarbonTable, forceTableClean: Boolean): Unit = {
+      carbonTable: CarbonTable, forceTableClean: Boolean

--- End diff --

fixed

---
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1467#discussion_r149279822

--- Diff: integration/spark-common/src/main/scala/org/apache/carbondata/api/CarbonStore.scala ---
@@ -79,16 +82,45 @@ object CarbonStore {
       dbName: String,
       tableName: String,
       storePath: String,
-      carbonTable: CarbonTable, forceTableClean: Boolean): Unit = {
+      carbonTable: CarbonTable, forceTableClean: Boolean
+  ): Unit = {

--- End diff --

fixed

---
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1467#discussion_r149281235

--- Diff: integration/spark-common/src/main/scala/org/apache/carbondata/api/CarbonStore.scala ---
@@ -79,16 +82,45 @@ object CarbonStore {
       dbName: String,
       tableName: String,
       storePath: String,
-      carbonTable: CarbonTable, forceTableClean: Boolean): Unit = {
+      carbonTable: CarbonTable, forceTableClean: Boolean
+  ): Unit = {
     LOGGER.audit(s"The clean files request has been received for $dbName.$tableName")
+    var carbonCleanFilesLock: ICarbonLock = null
     try {
-      DataManagementFunc.cleanFiles(dbName, tableName, storePath, carbonTable, forceTableClean)
-      LOGGER.audit(s"Clean files operation is success for $dbName.$tableName.")
+      val identifier = new CarbonTableIdentifier(dbName, tableName, "")
+      carbonCleanFilesLock =
+        CarbonLockFactory.getCarbonLockObj(identifier, LockUsage.CLEAN_FILES_LOCK)
     } catch {
       case ex: Exception =>
         sys.error(ex.getMessage)
+        return
     }
-    Seq.empty
+    try {

--- End diff --

fixed

---