GitHub user akashrn5 opened a pull request:

https://github.com/apache/carbondata/pull/1980

[CARBONDATA-2103] Optimize SHOW TABLES for filtering datamaps

**Problem**
SHOW TABLES was taking more time because the table lookup happened twice in order to filter out the datamaps.

**Solution**
Add a Hive table property that is true for all tables and false for datamaps (such as pre-aggregate tables), and have SHOW TABLES filter out those tables based on the property.

Be sure to do all of the following checklist to help us incorporate your contribution quickly and easily:

- [ ] Any interfaces changed?
- [ ] Any backward compatibility impacted?
- [ ] Document update required?
- [ ] Testing done: UT is added. Please provide details on:
  - Whether new unit test cases have been added or why no new tests are required?
  - How is it tested? Please attach the test report.
  - Is it a performance-related change? Please attach the performance test report.
  - Any additional information to help reviewers in testing this change.
- [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/akashrn5/incubator-carbondata showTables

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/1980.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #1980

----
commit 0e03851306ea9ba631be8e0e7c371337e7951dc9
Author: akashrn5 <akashnilugal@...>
Date: 2018-02-15T13:30:26Z

    optimize show tables for filtering datamaps
----
---
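The property-based filtering described in the PR can be sketched in isolation. This is a minimal, self-contained illustration, not the PR's actual code: the table names are made up, and the catalog lookup is simulated with a plain `Map`, whereas the real change queries Spark's `ExternalCatalog` for each table's storage properties. The `isCarbonTableVisibility` key and the "default to visible when the property is absent" behavior follow the PR diff.

```scala
// Sketch of the visibility-property filter behind the optimized SHOW TABLES.
// Hypothetical data; real code reads ExternalCatalog storage properties.
object ShowTablesFilterSketch {
  // Simulated storage properties: main tables carry "true",
  // datamap (pre-aggregate) tables carry "false".
  val storageProps: Map[String, Map[String, String]] = Map(
    "sales"        -> Map("isCarbonTableVisibility" -> "true"),
    "sales_agg"    -> Map("isCarbonTableVisibility" -> "false"),
    "legacy_table" -> Map.empty // older table created before the property existed
  )

  // A table is shown unless the property is explicitly "false",
  // mirroring the getOrElse(..., true) default in the PR diff.
  def visibleTables(tables: Seq[String]): Seq[String] =
    tables.filter { t =>
      storageProps.getOrElse(t, Map.empty)
        .getOrElse("isCarbonTableVisibility", "true").toBoolean
    }

  def main(args: Array[String]): Unit = {
    val shown = visibleTables(Seq("sales", "sales_agg", "legacy_table"))
    println(shown.mkString(",")) // sales,legacy_table
  }
}
```

Defaulting to visible when the key is missing keeps older tables (created before this property existed) listed by SHOW TABLES.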
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1980 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/3749/ ---
In reply to this post by qiuchenjian-2
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1980 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/2509/ ---
Github user akashrn5 commented on the issue:
https://github.com/apache/carbondata/pull/1980 retest this please ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1980 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/3750/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1980 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/2510/ ---
Github user kunal642 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1980#discussion_r168687437

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/table/CarbonShowTablesCommand.scala ---

    @@ -44,39 +44,15 @@ private[sql] case class CarbonShowTablesCommand ( databaseName: Option[String],
         val db = databaseName.getOrElse(catalog.getCurrentDatabase)
         var tables =
           tableIdentifierPattern.map(catalog.listTables(db, _)).getOrElse(catalog.listTables(db))
    -    tables = filterDataMaps(tables, sparkSession)
    +    val externalCatalog = sparkSession.sharedState.externalCatalog
    +    // tables will be filtered for all the dataMaps to show only main tables
    +    tables = tables
    +      .filter(table => externalCatalog.getTable(db, table.table).storage.properties
    +        .getOrElse("isCarbonTableVisibility", true).toString.toBoolean)
         tables.map { tableIdent =>
           val isTemp = catalog.isTemporaryTable(tableIdent)
           Row(tableIdent.database.getOrElse("default"), tableIdent.table, isTemp)

--- End diff --

Instead of looping twice we can use collect like this:

    tables.collect {
      case tableIdent if externalCatalog.getTable(db, tableIdent.table).storage.properties
          .getOrElse("isCarbonTableVisibility", true) =>
        val isTemp = catalog.isTemporaryTable(tableIdent)
        Row(tableIdent.database.getOrElse("default"), tableIdent.table, isTemp)
      case _ => Row()
    }
---
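The collect-based rewrite suggested above can be illustrated on plain collections. This is a standalone sketch with made-up table names and a `Map` standing in for the `ExternalCatalog` storage properties; it shows the general point that a single `collect` pass with a guard replaces a separate `filter` followed by `map`.

```scala
// Standalone comparison: filter-then-map (two passes) vs collect (one pass).
// The visibility map is a hypothetical stand-in for catalog storage properties.
object CollectVsFilterMap {
  val visibility: Map[String, String] =
    Map("t1" -> "true", "t2" -> "false", "t3" -> "true")

  def main(args: Array[String]): Unit = {
    val tables = Seq("t1", "t2", "t3")

    // Two passes over the collection: filter, then map.
    val twoPass = tables
      .filter(t => visibility.getOrElse(t, "true").toBoolean)
      .map(t => s"row($t)")

    // One pass: collect applies the partial function only where it matches.
    val onePass = tables.collect {
      case t if visibility.getOrElse(t, "true").toBoolean => s"row($t)"
    }

    assert(twoPass == onePass)
    println(onePass.mkString(",")) // row(t1),row(t3)
  }
}
```

Note that `collect` without a default case simply drops non-matching elements, so a trailing `case _ => Row()` as in the suggestion above would instead emit an empty row for every filtered-out table.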
Github user kunal642 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1980#discussion_r168687592

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/table/CarbonShowTablesCommand.scala ---

    @@ -44,39 +44,15 @@ private[sql] case class CarbonShowTablesCommand ( databaseName: Option[String],
         val db = databaseName.getOrElse(catalog.getCurrentDatabase)
         var tables =
           tableIdentifierPattern.map(catalog.listTables(db, _)).getOrElse(catalog.listTables(db))
    -    tables = filterDataMaps(tables, sparkSession)
    +    val externalCatalog = sparkSession.sharedState.externalCatalog

--- End diff --

Update the comment "filterDataMaps Method is to Filter the Table." since filterDataMaps has been removed.
---
Github user kunal642 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1980#discussion_r168687633

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/table/CarbonShowTablesCommand.scala ---

    @@ -44,39 +44,15 @@ private[sql] case class CarbonShowTablesCommand ( databaseName: Option[String],
         val db = databaseName.getOrElse(catalog.getCurrentDatabase)
         var tables =
           tableIdentifierPattern.map(catalog.listTables(db, _)).getOrElse(catalog.listTables(db))
    -    tables = filterDataMaps(tables, sparkSession)
    +    val externalCatalog = sparkSession.sharedState.externalCatalog
    +    // tables will be filtered for all the dataMaps to show only main tables
    +    tables = tables
    +      .filter(table => externalCatalog.getTable(db, table.table).storage.properties

--- End diff --

Instead of looping twice we can use collect like this:

    tables.collect {
      case tableIdent if externalCatalog.getTable(db, tableIdent.table).storage.properties
          .getOrElse("isCarbonTableVisibility", true) =>
        val isTemp = catalog.isTemporaryTable(tableIdent)
        Row(tableIdent.database.getOrElse("default"), tableIdent.table, isTemp)
      case _ => Row()
    }
---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1980 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/2512/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1980 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/3752/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1980 Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/2513/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1980 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/3753/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1980 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/2549/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1980 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/3793/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1980 Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/2551/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1980 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/3795/ ---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1980 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/3583/ ---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1980 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/3585/ ---
Github user manishgupta88 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1980#discussion_r169895776

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/table/CarbonCreateTableCommand.scala ---

    @@ -110,7 +111,8 @@ case class CarbonCreateTableCommand(
         | tableName "$tableName",
         | dbName "$dbName",
         | tablePath "$tablePath",
    -    | path "$tablePath"
    +    | path "$tablePath",
    +    | isCarbonTableVisibility "$isCarbonTableVisibility"

--- End diff --

1. Rename isCarbonTableVisibility to isCarbonTableVisible.
2. Add a comment to specify in which case this property will be used and what the benefit of using it is.
---
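For context, the options block being extended in the diff above is a Scala interpolated string with `stripMargin`. The following is a hedged, standalone sketch of how that options string might be assembled once the visibility property is appended; the `carbonOptions` helper and its argument values are illustrative, not code from the PR.

```scala
// Illustrative builder for the CREATE TABLE options string from the diff above.
// carbonOptions is a hypothetical helper; the real code inlines this string.
object CreateTableOptionsSketch {
  def carbonOptions(dbName: String, tableName: String, tablePath: String,
                    isVisible: Boolean): String =
    s"""| tableName "$tableName",
        | dbName "$dbName",
        | tablePath "$tablePath",
        | path "$tablePath",
        | isCarbonTableVisibility "$isVisible"""".stripMargin

  def main(args: Array[String]): Unit = {
    val opts = carbonOptions("default", "sales", "/tmp/sales", isVisible = true)
    // The property is serialized as a string, so SHOW TABLES later parses it
    // back with toBoolean when deciding whether to list the table.
    assert(opts.contains("isCarbonTableVisibility \"true\""))
    println("ok")
  }
}
```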