GitHub user chenerlu opened a pull request:
https://github.com/apache/carbondata/pull/1657

[CARBONDATA-1895] Fix issue of create table if not exists

Be sure to do all of the following checklist to help us incorporate your contribution quickly and easily:

- [ ] Any interfaces changed? No
- [ ] Any backward compatibility impacted? No
- [ ] Document update required? No
- [ ] Testing done? Test cases have already been added to the project.
- [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/chenerlu/incubator-carbondata pr-1212

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/1657.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #1657

---
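For reviewers, the intended `CREATE TABLE IF NOT EXISTS` semantics can be summarised with a small standalone sketch. This is plain Scala with no Spark dependency; `CreateTableSketch`, its in-memory catalog, and this `TableAlreadyExistsException` stand-in are hypothetical illustrations of the behaviour being fixed, not the actual CarbonData code path:

```scala
import scala.collection.mutable

// Hypothetical stand-in for Spark's TableAlreadyExistsException.
class TableAlreadyExistsException(db: String, table: String)
  extends RuntimeException(s"Table [$table] already exists under database [$db]")

object CreateTableSketch {
  private val catalog = mutable.Set.empty[(String, String)]

  def tableCount: Int = catalog.size

  // Mirrors the intended control flow: if the table already exists,
  // IF NOT EXISTS makes the command a no-op; without it, the command
  // fails. Creation (and its validations) runs only when the table
  // is absent.
  def createTable(db: String, table: String, ifNotExists: Boolean): Unit = {
    if (catalog.contains((db, table))) {
      if (!ifNotExists) {
        throw new TableAlreadyExistsException(db, table)
      }
      // IF NOT EXISTS: silently skip creation
    } else {
      catalog += ((db, table))
    }
  }

  def main(args: Array[String]): Unit = {
    createTable("default", "t1", ifNotExists = false)
    createTable("default", "t1", ifNotExists = true) // no error: skipped
    println(tableCount)                              // 1
  }
}
```

The bug being fixed is the case where the second call above raised an error even though IF NOT EXISTS was specified.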
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1657 Build Failed with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/710/ ---
In reply to this post by qiuchenjian-2
Github user chenerlu commented on the issue:
https://github.com/apache/carbondata/pull/1657 retest this please ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1657 Build Failed with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/716/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1657 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1943/ ---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1657 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2271/ ---
Github user chenerlu commented on the issue:
https://github.com/apache/carbondata/pull/1657 retest this please ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1657 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1953/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1657 Build Failed with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/725/ ---
Github user chenerlu commented on the issue:
https://github.com/apache/carbondata/pull/1657 retest this please ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1657 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1955/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1657 Build Failed with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/727/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1657 Build Success with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/750/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1657 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1978/ ---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1657 SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2300/ ---
Github user chenerlu commented on the issue:
https://github.com/apache/carbondata/pull/1657 retest this please ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1657 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1982/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1657 Build Success with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/752/ ---
Github user jackylk commented on the issue:
https://github.com/apache/carbondata/pull/1657 Please modify the description of the PR ---
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1657#discussion_r157111874

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/execution/command/table/CarbonCreateTableCommand.scala ---

@@ -62,66 +62,66 @@ case class CarbonCreateTableCommand(
           s"Table [$tableName] already exists under database [$dbName]")
         throw new TableAlreadyExistsException(dbName, tableName)
       }
-    }
-
-    val tablePath = tableLocation.getOrElse(
-      CarbonEnv.getTablePath(Some(dbName), tableName)(sparkSession))
-    tableInfo.setTablePath(tablePath)
-    val tableIdentifier = AbsoluteTableIdentifier.from(tablePath, dbName, tableName)
+    } else {
+      val tablePath = tableLocation.getOrElse(
+        CarbonEnv.getTablePath(Some(dbName), tableName)(sparkSession))
+      tableInfo.setTablePath(tablePath)
+      val tableIdentifier = AbsoluteTableIdentifier.from(tablePath, dbName, tableName)

-    // Add validation for sort scope when create table
-    val sortScope = tableInfo.getFactTable.getTableProperties.asScala
-      .getOrElse("sort_scope", CarbonCommonConstants.LOAD_SORT_SCOPE_DEFAULT)
-    if (!CarbonUtil.isValidSortOption(sortScope)) {
-      throw new InvalidConfigurationException(
-        s"Passing invalid SORT_SCOPE '$sortScope', valid SORT_SCOPE are 'NO_SORT', 'BATCH_SORT'," +
-        s" 'LOCAL_SORT' and 'GLOBAL_SORT' ")
-    }
+      // Add validation for sort scope when create table
+      val sortScope = tableInfo.getFactTable.getTableProperties.asScala
+        .getOrElse("sort_scope", CarbonCommonConstants.LOAD_SORT_SCOPE_DEFAULT)
+      if (!CarbonUtil.isValidSortOption(sortScope)) {
+        throw new InvalidConfigurationException(
+          s"Passing invalid SORT_SCOPE '$sortScope', valid SORT_SCOPE are 'NO_SORT'," +
+          s" 'BATCH_SORT', 'LOCAL_SORT' and 'GLOBAL_SORT' ")
+      }

-    if (tableInfo.getFactTable.getListOfColumns.size <= 0) {
-      CarbonException.analysisException("Table should have at least one column.")
-    }
+      if (tableInfo.getFactTable.getListOfColumns.size <= 0) {
+        CarbonException.analysisException("Table should have at least one column.")
+      }

-    val operationContext = new OperationContext
-    val createTablePreExecutionEvent: CreateTablePreExecutionEvent =
-      CreateTablePreExecutionEvent(sparkSession, tableIdentifier, Some(tableInfo))
-    OperationListenerBus.getInstance.fireEvent(createTablePreExecutionEvent, operationContext)
-    val catalog = CarbonEnv.getInstance(sparkSession).carbonMetastore
-    val carbonSchemaString = catalog.generateTableSchemaString(tableInfo, tableIdentifier)
-    if (createDSTable) {
-      try {
-        val tablePath = tableIdentifier.getTablePath
-        val carbonRelation = CarbonSparkUtil.createCarbonRelation(tableInfo, tablePath)
-        val rawSchema = CarbonSparkUtil.getRawSchema(carbonRelation)
-        sparkSession.sparkContext.setLocalProperty(EXECUTION_ID_KEY, null)
-        sparkSession.sql(
-          s"""CREATE TABLE $dbName.$tableName
-             |(${ rawSchema })
-             |USING org.apache.spark.sql.CarbonSource
-             |OPTIONS (
-             |  tableName "$tableName",
-             |  dbName "$dbName",
-             |  tablePath "$tablePath",
-             |  path "$tablePath"
-             |  $carbonSchemaString)
-             """.stripMargin)
-      } catch {
-        case e: AnalysisException => throw e
-        case e: Exception =>
-          // call the drop table to delete the created table.
-          CarbonEnv.getInstance(sparkSession).carbonMetastore
-            .dropTable(tableIdentifier)(sparkSession)
+      val operationContext = new OperationContext
+      val createTablePreExecutionEvent: CreateTablePreExecutionEvent =
+        CreateTablePreExecutionEvent(sparkSession, tableIdentifier, Some(tableInfo))
+      OperationListenerBus.getInstance.fireEvent(createTablePreExecutionEvent, operationContext)
+      val catalog = CarbonEnv.getInstance(sparkSession).carbonMetastore
+      val carbonSchemaString = catalog.generateTableSchemaString(tableInfo, tableIdentifier)
+      if (createDSTable) {
+        try {
+          val tablePath = tableIdentifier.getTablePath
+          val carbonRelation = CarbonSparkUtil.createCarbonRelation(tableInfo, tablePath)
+          val rawSchema = CarbonSparkUtil.getRawSchema(carbonRelation)
+          sparkSession.sparkContext.setLocalProperty(EXECUTION_ID_KEY, null)
+          sparkSession.sql(
+            s"""CREATE TABLE $dbName.$tableName
+               |(${ rawSchema })
+               |USING org.apache.spark.sql.CarbonSource
+               |OPTIONS (
+               |  tableName "$tableName",
+               |  dbName "$dbName",
+               |  tablePath "$tablePath",
+               |  path "$tablePath"
+               |  $carbonSchemaString)
+               """.stripMargin).collect()

--- End diff --

collect is not required

---
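The SORT_SCOPE validation discussed in the diff can be exercised in isolation. The following plain-Scala sketch illustrates it; `isValidSortOption` here is a simplified stand-in for `CarbonUtil.isValidSortOption`, and the default value `"LOCAL_SORT"` is an assumption standing in for `CarbonCommonConstants.LOAD_SORT_SCOPE_DEFAULT`:

```scala
object SortScopeSketch {
  private val validSortScopes = Set("NO_SORT", "BATCH_SORT", "LOCAL_SORT", "GLOBAL_SORT")

  // Simplified stand-in for CarbonUtil.isValidSortOption:
  // a case-insensitive membership check against the valid scopes.
  def isValidSortOption(sortScope: String): Boolean =
    sortScope != null && validSortScopes.contains(sortScope.toUpperCase)

  // Validates the sort_scope table property, falling back to an
  // assumed default of "LOCAL_SORT" when the property is absent.
  def validate(props: Map[String, String]): String = {
    val sortScope = props.getOrElse("sort_scope", "LOCAL_SORT")
    if (!isValidSortOption(sortScope)) {
      throw new IllegalArgumentException(
        s"Passing invalid SORT_SCOPE '$sortScope', valid SORT_SCOPE are 'NO_SORT'," +
        s" 'BATCH_SORT', 'LOCAL_SORT' and 'GLOBAL_SORT'")
    }
    sortScope
  }

  def main(args: Array[String]): Unit = {
    println(validate(Map("sort_scope" -> "global_sort"))) // global_sort
    // validate(Map("sort_scope" -> "HASH_SORT"))         // would throw
  }
}
```

Moving this validation inside the else branch means it only runs when a table is actually about to be created, so `CREATE TABLE IF NOT EXISTS` on an existing table no longer triggers it.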