[GitHub] [carbondata] akkio-97 commented on a change in pull request #4129: [CARBONDATA-4179] Support renaming of complex columns (array/struct)

Posted by GitBox on
URL: http://apache-carbondata-dev-mailing-list-archive.168.s1.nabble.com/GitHub-carbondata-akkio-97-opened-a-new-pull-request-4129-WIP-alter-rename-complex-types-tp108015p108737.html


akkio-97 commented on a change in pull request #4129:
URL: https://github.com/apache/carbondata/pull/4129#discussion_r645697097



##########
File path: integration/spark/src/main/scala/org/apache/spark/sql/execution/command/schema/CarbonAlterTableColRenameDataTypeChangeCommand.scala
##########
@@ -143,27 +143,49 @@ private[sql] case class CarbonAlterTableColRenameDataTypeChangeCommand(
       val newColumnPrecision = alterTableColRenameAndDataTypeChangeModel.dataTypeInfo.precision
       val newColumnScale = alterTableColRenameAndDataTypeChangeModel.dataTypeInfo.scale
       // set isDataTypeChange flag
-      if (oldCarbonColumn.head.getDataType.getName
-        .equalsIgnoreCase(alterTableColRenameAndDataTypeChangeModel.dataTypeInfo.dataType)) {
+      val oldDatatype = oldCarbonColumn.head.getDataType
+      val newDatatype = alterTableColRenameAndDataTypeChangeModel.dataTypeInfo.dataType
+      if (isColumnRename && (DataTypes.isMapType(oldDatatype) ||
+                             newDatatype.equalsIgnoreCase(CarbonCommonConstants.MAP))) {
+        throw new UnsupportedOperationException(
+          "Alter rename is unsupported for Map datatype column")
+      }
+      if (oldDatatype.getName.equalsIgnoreCase(newDatatype)) {
         val newColumnPrecision =
           alterTableColRenameAndDataTypeChangeModel.dataTypeInfo.precision
         val newColumnScale = alterTableColRenameAndDataTypeChangeModel.dataTypeInfo.scale
         // if the source datatype is decimal and there is change in precision and scale, then
         // along with rename, datatype change is also required for the command, so set the
         // isDataTypeChange flag to true in this case
-        if (oldCarbonColumn.head.getDataType.getName.equalsIgnoreCase("decimal") &&
-            (oldCarbonColumn.head.getDataType.asInstanceOf[DecimalType].getPrecision !=
+        if (DataTypes.isDecimal(oldDatatype) &&
+            (oldDatatype.asInstanceOf[DecimalType].getPrecision !=
              newColumnPrecision ||
-             oldCarbonColumn.head.getDataType.asInstanceOf[DecimalType].getScale !=
+             oldDatatype.asInstanceOf[DecimalType].getScale !=
              newColumnScale)) {
           isDataTypeChange = true
         }
+        if (DataTypes.isArrayType(oldDatatype) || DataTypes.isStructType(oldDatatype)) {
+          val oldParent = oldCarbonColumn.head
+          val oldChildren = oldParent.asInstanceOf[CarbonDimension].getListOfChildDimensions.asScala
+            .toList
+          AlterTableUtil.validateComplexStructure(oldChildren,
+            alterTableColRenameAndDataTypeChangeModel.dataTypeInfo.getChildren(),
+            alteredColumnNamesMap)
+        }
       } else {
+        if (oldDatatype.isComplexType ||
+            newDatatype.equalsIgnoreCase(CarbonCommonConstants.ARRAY) ||
+            newDatatype.equalsIgnoreCase(CarbonCommonConstants.STRUCT)) {
+          throw new UnsupportedOperationException(
+            "Old and new complex columns are not compatible in structure")

Review comment:
       done
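For context, a minimal standalone sketch of the type-compatibility guard this hunk introduces, using simplified stand-in types rather than CarbonData's actual DataTypes/DecimalType model (names and structure here are illustrative only):

```scala
// Simplified stand-ins for CarbonData's datatype model (illustrative, not the real classes).
sealed trait DType { def name: String }
case object MapT extends DType { val name = "map" }
case object ArrayT extends DType { val name = "array" }
case object StructT extends DType { val name = "struct" }
case class DecimalT(precision: Int, scale: Int) extends DType { val name = "decimal" }
case class PrimitiveT(name: String) extends DType

// Mirrors the decision flow of the hunk: block Map renames up front, treat a decimal
// precision/scale difference as a datatype change, and reject incompatible complex changes.
def isDataTypeChange(oldType: DType,
                     newTypeName: String,
                     newPrecision: Int,
                     newScale: Int,
                     isColumnRename: Boolean): Boolean = {
  if (isColumnRename && (oldType == MapT || newTypeName.equalsIgnoreCase("map"))) {
    throw new UnsupportedOperationException("Alter rename is unsupported for Map datatype column")
  }
  if (oldType.name.equalsIgnoreCase(newTypeName)) {
    oldType match {
      // Same name "decimal": only a precision/scale difference counts as a datatype change.
      case DecimalT(p, s) => p != newPrecision || s != newScale
      // Same name otherwise; array/struct children are validated separately
      // (validateComplexStructure in the real code).
      case _ => false
    }
  } else if (oldType == ArrayT || oldType == StructT || oldType == MapT ||
             newTypeName.equalsIgnoreCase("array") || newTypeName.equalsIgnoreCase("struct")) {
    throw new UnsupportedOperationException(
      "Old and new complex columns are not compatible in structure")
  } else {
    // Different primitive datatype names: a genuine datatype change.
    true
  }
}
```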

##########
File path: integration/spark/src/main/scala/org/apache/spark/sql/execution/command/schema/CarbonAlterTableColRenameDataTypeChangeCommand.scala
##########
@@ -143,27 +143,49 @@ private[sql] case class CarbonAlterTableColRenameDataTypeChangeCommand(
       val newColumnPrecision = alterTableColRenameAndDataTypeChangeModel.dataTypeInfo.precision
       val newColumnScale = alterTableColRenameAndDataTypeChangeModel.dataTypeInfo.scale
       // set isDataTypeChange flag
-      if (oldCarbonColumn.head.getDataType.getName
-        .equalsIgnoreCase(alterTableColRenameAndDataTypeChangeModel.dataTypeInfo.dataType)) {
+      val oldDatatype = oldCarbonColumn.head.getDataType
+      val newDatatype = alterTableColRenameAndDataTypeChangeModel.dataTypeInfo.dataType
+      if (isColumnRename && (DataTypes.isMapType(oldDatatype) ||
+                             newDatatype.equalsIgnoreCase(CarbonCommonConstants.MAP))) {
+        throw new UnsupportedOperationException(
+          "Alter rename is unsupported for Map datatype column")

Review comment:
       The check should be on isColumnRename too, because comments can still be changed in the case of maps.
   And it depends on oldCarbonColumn, which is instantiated right above it, so I think the exception is thrown at the right place. You can suggest a different place if required.
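   To make this concrete, a small hedged sketch of the guard condition in isolation (hypothetical helper, not the actual command code): with isColumnRename in the condition, a comment-only change on a map column falls through to the normal path, while a rename is rejected.

```scala
// Hypothetical helper mirroring just the guard condition from the hunk above.
def mapRenameBlocked(isColumnRename: Boolean,
                     oldColumnIsMap: Boolean,
                     newDataTypeName: String): Boolean =
  isColumnRename && (oldColumnIsMap || newDataTypeName.equalsIgnoreCase("map"))

// Rename of a map column -> blocked.
assert(mapRenameBlocked(isColumnRename = true, oldColumnIsMap = true, newDataTypeName = "map"))
// Comment-only change on a map column (no rename) -> allowed to proceed.
assert(!mapRenameBlocked(isColumnRename = false, oldColumnIsMap = true, newDataTypeName = "map"))
```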

##########
File path: integration/spark/src/main/scala/org/apache/spark/util/AlterTableUtil.scala
##########
@@ -357,18 +358,23 @@ object AlterTableUtil {
   /**
    * This method create a new SchemaEvolutionEntry and adds to SchemaEvolutionEntry List
    *
-   * @param addColumnSchema          added new column schema
-   * @param deletedColumnSchema      old column schema which is deleted
+   * @param addedColumnsList    list of added column schemas
+   * @param deletedColumnsList  list of deleted column schemas
    * @return
    */
   def addNewSchemaEvolutionEntry(
-      timeStamp: Long,
-      addColumnSchema: org.apache.carbondata.format.ColumnSchema,
-      deletedColumnSchema: org.apache.carbondata.format.ColumnSchema): SchemaEvolutionEntry = {
-    val schemaEvolutionEntry = new SchemaEvolutionEntry(timeStamp)
-    schemaEvolutionEntry.setAdded(List(addColumnSchema).asJava)
-    schemaEvolutionEntry.setRemoved(List(deletedColumnSchema).asJava)
-    schemaEvolutionEntry
+      schemaEvolutionEntry: SchemaEvolutionEntry,
+      addedColumnsList: List[org.apache.carbondata.format.ColumnSchema],
+      deletedColumnsList: List[org.apache.carbondata.format.ColumnSchema]): SchemaEvolutionEntry = {
+    val timeStamp = System.currentTimeMillis()
+    var newSchemaEvolutionEntry = schemaEvolutionEntry;
+    if (newSchemaEvolutionEntry == null) {
+      newSchemaEvolutionEntry = new SchemaEvolutionEntry(timeStamp)
+    }
+    newSchemaEvolutionEntry.setTime_stamp(timeStamp)

Review comment:
       done
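A hedged usage sketch of the reworked helper (assuming the thrift-generated classes in org.apache.carbondata.format, which the setTime_stamp call in the hunk suggests; the lists are left empty only to keep the snippet self-contained):

```scala
import org.apache.carbondata.format.{ColumnSchema, SchemaEvolutionEntry}
import org.apache.spark.util.AlterTableUtil

val added: List[ColumnSchema] = List.empty     // schemas of the added (renamed/changed) columns
val deleted: List[ColumnSchema] = List.empty   // schemas of the columns they replace

// No entry exists yet for this command: pass null and the helper creates one with a fresh timestamp.
val entry: SchemaEvolutionEntry =
  AlterTableUtil.addNewSchemaEvolutionEntry(null, added, deleted)

// A later call in the same command can hand the entry back instead of creating a new one,
// presumably so the whole alter operation is recorded under a single evolution entry.
val reused: SchemaEvolutionEntry =
  AlterTableUtil.addNewSchemaEvolutionEntry(entry, added, deleted)
```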

##########
File path: integration/spark/src/main/scala/org/apache/spark/sql/execution/command/schema/CarbonAlterTableColRenameDataTypeChangeCommand.scala
##########
@@ -178,35 +200,51 @@ private[sql] case class CarbonAlterTableColRenameDataTypeChangeCommand(
             }
         }
       }
-      if (alterTableColRenameAndDataTypeChangeModel.isColumnRename) {
+
+      // read the latest schema file
+      val tableInfo: TableInfo = metaStore.getThriftTableInfo(carbonTable)
+      val columnSchemaList = tableInfo.fact_table.table_columns.asScala.filter(!_.isInvisible)
+      if (!alteredColumnNamesMap.isEmpty) {
         // validate the columns to be renamed
-        validColumnsForRenaming(carbonColumns, oldCarbonColumn.head, carbonTable)
+        validColumnsForRenaming(columnSchemaList, alteredColumnNamesMap, carbonTable)

Review comment:
       done
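For illustration, a minimal sketch of the kind of check a rename validation over alteredColumnNamesMap might perform (hypothetical: the real validColumnsForRenaming in AlterTableUtil may differ, and the map type here is just a plain Scala Map of old name to new name):

```scala
// Hypothetical stand-in for the rename validation: a new name must not collide with an
// existing visible column of the table.
def validateRenamedColumns(visibleColumnNames: Seq[String],
                           alteredColumnNamesMap: Map[String, String]): Unit = {
  val existing = visibleColumnNames.map(_.toLowerCase).toSet
  alteredColumnNamesMap.foreach { case (oldName, newName) =>
    if (!oldName.equalsIgnoreCase(newName) && existing.contains(newName.toLowerCase)) {
      throw new UnsupportedOperationException(
        s"Column rename operation failed: $newName already exists in the table")
    }
  }
}
```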



