Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2619 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7848/ --- |
In reply to this post by qiuchenjian-2
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2619 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6573/ --- |
Github user Sssan520 commented on the issue:
https://github.com/apache/carbondata/pull/2619 retest this please --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2619 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7851/ --- |
Github user xuchuanyin commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2619#discussion_r208818835 --- Diff: core/src/main/java/org/apache/carbondata/core/util/SessionParams.java --- @@ -213,13 +217,20 @@ private boolean validateKeyValue(String key, String value) throws InvalidConfigu isValid = true; } else if (key.startsWith(CarbonCommonConstants.CARBON_DATAMAP_VISIBLE)) { String[] keyArray = key.split("\\."); - isValid = DataMapStoreManager.getInstance().isDataMapExist( - keyArray[keyArray.length - 3], - keyArray[keyArray.length - 2], - keyArray[keyArray.length - 1]); - if (!isValid) { + try { + DataMapSchema dataMapSchema = + DataMapStoreManager.getInstance().getDataMapSchema(keyArray[keyArray.length - 1]); + if (DataMapClassProvider.PREAGGREGATE.getShortName() --- End diff -- This command only works for index datamaps, so please optimize it again. --- |
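For reference, the key handling discussed in the diff above can be sketched as a standalone illustration. The class and method names here are hypothetical, not CarbonData's actual API; the assumed key format `carbon.datamap.visible.<db>.<table>.<datamap>` is inferred from the diff, which reads the last three dot-separated segments.

```java
// Standalone sketch (illustrative names, not CarbonData's API): the
// datamap-visibility property carries its identifiers in the last
// three dot-separated segments of the key.
public class DataMapKeyParser {

    static String[] parseVisibleKey(String key) {
        String[] parts = key.split("\\.");
        if (parts.length < 3) {
            throw new IllegalArgumentException("Malformed key: " + key);
        }
        // last three segments: database, table, datamap name
        return new String[] {
            parts[parts.length - 3],
            parts[parts.length - 2],
            parts[parts.length - 1]
        };
    }

    public static void main(String[] args) {
        String key = "carbon.datamap.visible.default.personTable.preagg_avg_dm";
        String[] ids = parseVisibleKey(key);
        System.out.println(ids[0] + "." + ids[1] + "." + ids[2]);
    }
}
```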
Github user xuchuanyin commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2619#discussion_r208821154 --- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/datamap/CarbonDropDataMapCommand.scala --- @@ -114,12 +114,6 @@ case class CarbonDropDataMapCommand( LOGGER.audit(s"Deleting datamap [$dataMapName] under table [$tableName]") // drop index datamap on the main table - if (mainTable != null && --- End diff -- Can you explain why this modification is needed? --- |
Github user xuchuanyin commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2619#discussion_r208818959 --- Diff: examples/spark2/src/main/scala/org/apache/carbondata/examples/PreAggregateDataMapExample.scala --- @@ -175,7 +175,7 @@ object PreAggregateDataMapExample { // create pre-aggregate table by datamap spark.sql(""" - CREATE datamap preagg_avg on table personTable using 'preaggregate' as + CREATE datamap preagg_avg_dm on table personTable using 'preaggregate' as --- End diff -- Why is this modification needed? --- |
Github user xuchuanyin commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2619#discussion_r208819917 --- Diff: integration/spark2/src/main/java/org/apache/carbondata/datamap/PreAggregateDataMapProvider.java --- @@ -77,13 +79,15 @@ private void validateDmProperty(DataMapSchema dataMapSchema) } @Override - public void cleanMeta() { + public void cleanMeta() throws IOException { + DataMapSchema dataMapSchema = getDataMapSchema(); dropTableCommand = new CarbonDropTableCommand( true, new Some<>(dbName), tableName, true); dropTableCommand.processMetadata(sparkSession); + DataMapStoreManager.getInstance().dropDataMapSchema(dataMapSchema.getDataMapName()); --- End diff -- Can you check how MV deals with this? In other words, what's the logic of dropping an MV? --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2619 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6577/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2619 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6605/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2619 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7882/ --- |
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2619 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/6244/ --- |
Github user Sssan520 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2619#discussion_r209205953 --- Diff: examples/spark2/src/main/scala/org/apache/carbondata/examples/PreAggregateDataMapExample.scala --- @@ -175,7 +175,7 @@ object PreAggregateDataMapExample { // create pre-aggregate table by datamap spark.sql(""" - CREATE datamap preagg_avg on table personTable using 'preaggregate' as + CREATE datamap preagg_avg_dm on table personTable using 'preaggregate' as --- End diff -- Since datamap names are global, a datamap with the same name already exists. --- |
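The global-name constraint behind this rename can be sketched as follows. This is a hypothetical in-memory registry, not CarbonData's real DataMapStoreManager: it only illustrates why a second CREATE DATAMAP with an already-used name must be rejected regardless of which table it targets.

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical global registry illustrating the rename above: datamap
// names are unique across all tables, so registering an existing name
// fails even for a different table.
public class GlobalDataMapRegistry {
    private final Set<String> names = new HashSet<>();

    public void register(String dataMapName) {
        // Set.add returns false when the name is already present.
        if (!names.add(dataMapName)) {
            throw new IllegalStateException(
                "DataMap with name " + dataMapName + " already exists");
        }
    }

    public boolean exists(String dataMapName) {
        return names.contains(dataMapName);
    }
}
```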
Github user Sssan520 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2619#discussion_r209206775 --- Diff: integration/spark2/src/main/java/org/apache/carbondata/datamap/PreAggregateDataMapProvider.java --- @@ -77,13 +79,15 @@ private void validateDmProperty(DataMapSchema dataMapSchema) } @Override - public void cleanMeta() { + public void cleanMeta() throws IOException { + DataMapSchema dataMapSchema = getDataMapSchema(); dropTableCommand = new CarbonDropTableCommand( true, new Some<>(dbName), tableName, true); dropTableCommand.processMetadata(sparkSession); + DataMapStoreManager.getInstance().dropDataMapSchema(dataMapSchema.getDataMapName()); --- End diff -- When creating an MV datamap, a datamap schema file is created and a datamap catalog is registered; when dropping the MV datamap, the datamap schema file is deleted and the datamap catalog is unregistered. --- |
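The create/drop symmetry described in this comment can be sketched as follows. All names here are hypothetical: the in-memory collections stand in for the on-disk schema file and the datamap catalog that the real flow manages through DataMapStoreManager.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Sketch of the MV datamap lifecycle described above: CREATE persists
// a schema file and registers a catalog entry; DROP must undo both.
public class MvDataMapLifecycle {
    private final Set<String> schemaFiles = new HashSet<>();
    private final Map<String, String> catalog = new HashMap<>();

    public void create(String dataMapName, String mvQuery) {
        schemaFiles.add(dataMapName + ".dmschema"); // write schema file
        catalog.put(dataMapName, mvQuery);          // register catalog entry
    }

    public void drop(String dataMapName) {
        schemaFiles.remove(dataMapName + ".dmschema"); // delete schema file
        catalog.remove(dataMapName);                   // unregister catalog entry
    }

    public boolean exists(String dataMapName) {
        return schemaFiles.contains(dataMapName + ".dmschema")
            && catalog.containsKey(dataMapName);
    }
}
```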
Github user Sssan520 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2619#discussion_r209208751 --- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/datamap/CarbonDropDataMapCommand.scala --- @@ -114,12 +114,6 @@ case class CarbonDropDataMapCommand( LOGGER.audit(s"Deleting datamap [$dataMapName] under table [$tableName]") // drop index datamap on the main table - if (mainTable != null && --- End diff -- When dropping a preagg datamap, the datamap schema file and the datamap info in the main table's schema should be deleted. --- |
Github user Sssan520 commented on the issue:
https://github.com/apache/carbondata/pull/2619 retest this please --- |
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2619 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/6245/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2619 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/7883/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2619 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/6607/ --- |
Github user Sssan520 commented on the issue:
https://github.com/apache/carbondata/pull/2619 retest this please --- |