Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1781

SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2904/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1781

Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2836/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1781

Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1601/
Github user kunal642 commented on the issue:
https://github.com/apache/carbondata/pull/1781

retest this please
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1781

Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1611/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1781

Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2846/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1781

SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2922/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1781

Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1846/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1781

Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/3077/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1781

SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/3066/
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1781#discussion_r163762604

--- Diff: core/src/main/java/org/apache/carbondata/core/util/path/CarbonTablePath.java ---
@@ -252,6 +252,10 @@
   public String getTableStatusFilePath() {
     return getMetaDataDir() + File.separator + TABLE_STATUS_FILE;
   }
+  public String getTableStatusFilePath(String uuid) {
--- End diff --

Change the name to `getTableStatusFilePathWithUUID`.
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1781#discussion_r163763228

--- Diff: processing/src/main/java/org/apache/carbondata/processing/util/CarbonLoaderUtil.java ---
@@ -363,7 +368,7 @@ public static String getEscapeChar(String escapeCharacter) {
   public static void readAndUpdateLoadProgressInTableMeta(CarbonLoadModel model,
-      boolean insertOverwrite) throws IOException {
+      boolean insertOverwrite, String uuid) throws IOException {
--- End diff --

Better not to change this method; add a new method that takes the UUID and calls the same method internally.
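One way to read that suggestion, sketched below in Scala for consistency with the other snippets in this thread (the real CarbonLoaderUtil is a Java class, and CarbonLoadModel is stubbed out here): keep the existing two-argument signature untouched and let it delegate to a UUID-aware overload with an empty UUID, so current callers keep compiling unchanged.

```
// Sketch only, not the PR's actual code.
object CarbonLoaderUtilSketch {

  // Stand-in for the real CarbonLoadModel class.
  class CarbonLoadModel

  // Existing entry point: signature unchanged, delegates with an empty UUID.
  def readAndUpdateLoadProgressInTableMeta(model: CarbonLoadModel,
      insertOverwrite: Boolean): Unit = {
    readAndUpdateLoadProgressInTableMeta(model, insertOverwrite, "")
  }

  // New overload: an empty UUID means "write the regular tablestatus file";
  // a non-empty UUID would select the uuid-suffixed status file instead.
  def readAndUpdateLoadProgressInTableMeta(model: CarbonLoadModel,
      insertOverwrite: Boolean, uuid: String): Unit = {
    // ... existing read/update logic, now parameterised on uuid ...
  }
}
```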
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1781#discussion_r163763834

--- Diff: integration/spark2/src/main/scala/org/apache/carbondata/spark/rdd/CarbonDataRDDFactory.scala ---
@@ -541,9 +543,11 @@ object CarbonDataRDDFactory {
         carbonLoadModel,
         loadStatus,
         newEntryLoadStatus,
-        overwriteTable)
+        overwriteTable,
+        uniqueTableStatusId)
+      PreAggregateUtil.commitDataMaps(carbonLoadModel, uniqueTableStatusId)(sqlContext.sparkSession)
--- End diff --

Add a listener for postStatusUpdateEvent and commit in that listener.
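For context, a rough sketch of the listener approach being suggested; the event class, accessor, and context-property names below are assumptions in the style of CarbonData's event bus, not the final API:

```
// Sketch only: commit the child datamap status files from a listener instead of
// inlining PreAggregateUtil.commitDataMaps in CarbonDataRDDFactory.
// Event class and accessor names are assumed for illustration.
class CommitPreAggregateListener extends OperationEventListener {
  override def onEvent(event: Event, operationContext: OperationContext): Unit = {
    event match {
      case statusUpdateEvent: LoadTablePostStatusUpdateEvent =>
        // The load flow would stash the uuid in the OperationContext so the
        // listener knows which tablestatus_<uuid> files to commit.
        val uuid = operationContext.getProperty("uuid").toString
        PreAggregateUtil.commitDataMaps(
          statusUpdateEvent.getCarbonLoadModel, uuid)(statusUpdateEvent.getSparkSession)
      case _ => // not interested in other events
    }
  }
}

// Registered once against the event type, e.g. during CarbonEnv initialisation:
//   OperationListenerBus.getInstance()
//     .addListener(classOf[LoadTablePostStatusUpdateEvent], new CommitPreAggregateListener)
```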
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1781#discussion_r163763929

--- Diff: integration/spark2/src/main/scala/org/apache/carbondata/spark/rdd/CarbonDataRDDFactory.scala ---
@@ -541,9 +543,11 @@ object CarbonDataRDDFactory {
         carbonLoadModel,
         loadStatus,
         newEntryLoadStatus,
-        overwriteTable)
+        overwriteTable,
+        uniqueTableStatusId)
+      PreAggregateUtil.commitDataMaps(carbonLoadModel, uniqueTableStatusId)(sqlContext.sparkSession)
       if (!done) {
-        CarbonLoaderUtil.updateTableStatusForFailure(carbonLoadModel)
+        CarbonLoaderUtil.updateTableStatusForFailure(carbonLoadModel, uniqueTableStatusId)
--- End diff --

Add an abort event to abort all child datamaps' status files.
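Mirroring the previous sketch, the abort side could look roughly like this; the event name LoadTableAbortExecutionEvent and the rollback helper are hypothetical, named here only to show the shape: when the load fails, fire an abort event and let a listener discard the uuid-suffixed status files written for the child datamaps.

```
// Sketch only: roll back uncommitted child datamap status files on load failure.
// The event class and the rollback helper are hypothetical names.
class AbortPreAggregateListener extends OperationEventListener {
  override def onEvent(event: Event, operationContext: OperationContext): Unit = {
    event match {
      case abortEvent: LoadTableAbortExecutionEvent =>
        val uuid = operationContext.getProperty("uuid").toString
        // Delete the tablestatus_<uuid> file of every child (pre-aggregate) table
        // so no partially committed state survives the failed load.
        PreAggregateUtil.rollbackDataMaps(abortEvent.getCarbonLoadModel, uuid)
      case _ =>
    }
  }
}
```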
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1781#discussion_r163763954

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/CarbonSession.scala ---
@@ -75,6 +75,8 @@ class CarbonSession(@transient val sc: SparkContext,
 object CarbonSession {
+  private var isInitialized = false
--- End diff --

What is it used for?
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1781#discussion_r163765303

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/management/CarbonLoadDataCommand.scala ---
@@ -240,11 +247,12 @@ case class CarbonLoadDataCommand(
             table.getCarbonTableIdentifier,
             carbonLoadModel)
         OperationListenerBus.getInstance.fireEvent(loadTablePostExecutionEvent, operationContext)
+
       } catch {
         case CausedBy(ex: NoRetryException) =>
           // update the load entry in table status file for changing the status to marked for delete
           if (isUpdateTableStatusRequired) {
-            CarbonLoaderUtil.updateTableStatusForFailure(carbonLoadModel)
+            CarbonLoaderUtil.updateTableStatusForFailure(carbonLoadModel, uuid)
--- End diff --

Add a new method; don't change the current method's signature.
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1781#discussion_r163765433

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/preaaggregate/PreAggregateUtil.scala ---
@@ -587,7 +590,8 @@ object PreAggregateUtil {
       segmentToLoad: String,
       validateSegments: Boolean,
       isOverwrite: Boolean,
-      sparkSession: SparkSession): Unit = {
+      sparkSession: SparkSession,
+      uuid: String): Unit = {
--- End diff --

Take this as an Option and default it to None.
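For reference, the signature with that change would look roughly as below (a sketch: the method name and body are placeholders, and only the parameters visible in the hunk are shown). Callers that care about the transactional status file pass Some(uuid); everything else compiles unchanged.

```
import org.apache.spark.sql.SparkSession

// Sketch: uuid as an Option defaulting to None, so existing call sites need no change.
def loadDataMapSegment(
    segmentToLoad: String,
    validateSegments: Boolean,
    isOverwrite: Boolean,
    sparkSession: SparkSession,
    uuid: Option[String] = None): Unit = {
  // Fall back to the plain tablestatus file when no uuid is supplied.
  val statusFileId = uuid.getOrElse("")
  // ... existing child-table load logic ...
}
```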
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1781#discussion_r163765562

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/preaaggregate/PreAggregateUtil.scala ---
@@ -900,4 +904,75 @@ object PreAggregateUtil {
         aggDataMapSchema.getProperties.get("CHILD_SELECT QUERY").replace("&", "=")),
       CarbonCommonConstants.DEFAULT_CHARSET)
   }
+
+  def commitDataMaps(
--- End diff --

Add a comment describing the flow step by step.
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1781#discussion_r163765606

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/preaaggregate/PreAggregateUtil.scala ---
@@ -900,4 +904,75 @@ object PreAggregateUtil {
         aggDataMapSchema.getProperties.get("CHILD_SELECT QUERY").replace("&", "=")),
       CarbonCommonConstants.DEFAULT_CHARSET)
   }
+
+  def commitDataMaps(
+      carbonLoadModel: CarbonLoadModel,
+      uuid: String = "")(sparkSession: SparkSession) {
--- End diff --

Take this as an Option and default it to None.
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1781#discussion_r163765733

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/preaaggregate/PreAggregateUtil.scala ---
@@ -900,4 +904,75 @@ object PreAggregateUtil {
         aggDataMapSchema.getProperties.get("CHILD_SELECT QUERY").replace("&", "=")),
       CarbonCommonConstants.DEFAULT_CHARSET)
   }
+
+  def commitDataMaps(
+      carbonLoadModel: CarbonLoadModel,
+      uuid: String = "")(sparkSession: SparkSession) {
+    val dataMapSchemas = carbonLoadModel.getCarbonDataLoadSchema.getCarbonTable.getTableInfo
+      .getDataMapSchemaList
--- End diff --

Put this on a new line:
```
carbonLoadModel.getCarbonDataLoadSchema.getCarbonTable.getTableInfo.getDataMapSchemaList
```