[GitHub] [carbondata] akkio-97 opened a new pull request #4121: [CARBONDATA-4170] Support dropping of single and multi-level complex columns(array/struct)


GitBox

akkio-97 opened a new pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121


    ### Why is this PR needed?
    This PR supports dropping complex columns (single and multi-level) from a carbon table. Dropping a parent column will, in turn, drop all of its child columns too.
   
    ### What changes were proposed in this PR?
    Test cases have been written for up to three levels of nesting only.
       
    ### Does this PR introduce any user interface change?
    - No
   
    ### Is any new testcase added?
    - Yes
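
A minimal sketch of the documented behavior (plain Scala stand-in, not the actual CarbonData code; column names are hypothetical): child columns of a complex type are stored flattened with a `parent.child` name prefix, so dropping the parent also drops its children.

```scala
// Sketch only: models how dropping a parent complex column removes its
// flattened children, which carry a "parent." name prefix.
// The real logic lives in CarbonAlterTableDropColumnCommand.
def dropColumn(schema: Seq[String], column: String): Seq[String] =
  schema.filterNot { name =>
    name.equalsIgnoreCase(column) ||
    name.toLowerCase.startsWith(column.toLowerCase + ".")
  }

val schema = Seq("intfield", "struct1", "struct1.a", "struct1.b", "arr1", "arr1.val")
println(dropColumn(schema, "struct1").mkString(", "))  // intfield, arr1, arr1.val
```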
   
       
   


--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[hidden email]



[GitHub] [carbondata] Indhumathi27 commented on a change in pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct)

GitBox

Indhumathi27 commented on a change in pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#discussion_r617595919



##########
File path: integration/spark/src/main/scala/org/apache/spark/sql/execution/command/schema/CarbonAlterTableDropColumnCommand.scala
##########
@@ -147,19 +143,28 @@ private[sql] case class CarbonAlterTableDropColumnCommand(
         metastore.getThriftTableInfo(carbonTable)
       // maintain the deleted columns for schema evolution history
       var deletedColumnSchema = ListBuffer[org.apache.carbondata.format.ColumnSchema]()
+      var deletedParentColumns = ListBuffer[org.apache.carbondata.format.ColumnSchema]()
       val columnSchemaList = tableInfo.fact_table.table_columns.asScala
       alterTableDropColumnModel.columns.foreach { column =>
         columnSchemaList.foreach { columnSchema =>
-          if (!columnSchema.invisible && column.equalsIgnoreCase(columnSchema.column_name)) {
-            deletedColumnSchema += columnSchema.deepCopy
-            columnSchema.invisible = true
+          if (!columnSchema.invisible) {
+            if (column.equalsIgnoreCase(columnSchema.column_name)) {
+              deletedParentColumns += columnSchema.deepCopy

Review comment:
       Please reuse the column schema copy object.
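
The suggestion above (copy the schema once and append the same object to both lists) can be sketched in isolation like this; the case class is a simplified stand-in for the thrift `ColumnSchema`, used here for illustration only.

```scala
import scala.collection.mutable.ListBuffer

// Simplified stand-in for org.apache.carbondata.format.ColumnSchema.
case class Column(name: String, var invisible: Boolean = false) {
  def deepCopy: Column = this.copy()
}

val deletedColumnSchema = ListBuffer[Column]()
val deletedParentColumns = ListBuffer[Column]()

val columnSchema = Column("struct1")
val copied = columnSchema.deepCopy   // copy once ...
deletedParentColumns += copied       // ... and reuse the same object
deletedColumnSchema += copied
columnSchema.invisible = true
println(deletedParentColumns.head eq deletedColumnSchema.head)  // true
```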

##########
File path: integration/spark/src/test/scala/org/apache/spark/carbondata/restructure/vectorreader/DropColumnTestCases.scala
##########
@@ -97,13 +102,127 @@ class DropColumnTestCases extends QueryTest with BeforeAndAfterAll {
     test_drop_and_compaction()
   }
 
-  test("test dropping of complex column should throw exception") {
-    sql("drop table if exists maintbl")
-    sql("create table maintbl (a string, b string, c struct<si:int>) STORED AS carbondata")
-    assert(intercept[ProcessMetaDataException] {
-      sql("alter table maintbl drop columns(b,c)").collect()
-    }.getMessage.contains("Complex column cannot be dropped"))
-    sql("drop table if exists maintbl")
+  def checkSchemaSize(): Unit = {
+    val schema = sql("describe alter_com").collect()
+    assert(schema.size == 1)
+  }
+
+  def droppedColumnsInSchemaEvolutionEntry(tableName: String): Seq[ColumnSchema] = {
+    val carbonTable = CarbonEnv.getCarbonTable(None, tableName)(sqlContext.sparkSession)
+    val schemaEvolutionList = carbonTable.getTableInfo
+      .getFactTable
+      .getSchemaEvolution()
+      .getSchemaEvolutionEntryList()
+    var droppedColumns = Seq[ColumnSchema]()
+    for (i <- 0 until schemaEvolutionList.size()) {
+      droppedColumns ++=
+      JavaConverters
+        .asScalaIteratorConverter(schemaEvolutionList.get(i).getRemoved.iterator())
+        .asScala
+        .toSeq
+    }
+    droppedColumns
+  }
+
+  test("test dropping of array of all primitive types") {
+    import scala.collection.mutable.WrappedArray.make
+    sql("DROP TABLE IF EXISTS alter_com")
+    sql("CREATE TABLE alter_com(intfield int, arr1 array<short>, arr2 array<int>, arr3 " +
+        "array<long>, arr4 array<double>, arr5 array<decimal(8,2)>, arr6 array<string>, arr7 " +
+        "array<char(5)>, arr8 array<varchar(50)>, arr9 array<boolean>, arr10 " +
+        "array<date>, arr11 array<timestamp>) STORED AS carbondata")
+    sql(
+      "insert into alter_com values(1,array(1,5),array(1,2),array(1,2,3),array(1.2d,2.3d),array" +
+      "(4.5,6.7),array('hello','world'),array('a','bcd'),array('abcd','efg'),array(true,false)," +
+      "array('2017-02-01','2018-09-11'),array('2017-02-01 00:01:00','2018-02-01 02:21:00') )")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr1,arr2,arr3,arr4,arr5,arr6) ")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr7,arr8,arr9) ")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr10,arr11) ")

Review comment:
       Please add a check for dropping a duplicate complex column.
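
The duplicate-drop check requested here could look like this case-insensitive sketch (hypothetical helper name; the actual validation belongs in the alter-table command):

```scala
// Sketch: detect a DROP COLUMNS list that names the same column twice,
// case-insensitively (e.g. "arr10, arr10" or "Arr10, arr10").
def findDuplicateColumns(columns: Seq[String]): Seq[String] =
  columns
    .map(_.toLowerCase)
    .groupBy(identity)
    .collect { case (name, occurrences) if occurrences.size > 1 => name }
    .toSeq

println(findDuplicateColumns(Seq("arr10", "arr10")))  // List(arr10)
println(findDuplicateColumns(Seq("arr10", "arr11")))  // List()
```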

##########
File path: integration/spark/src/test/scala/org/apache/spark/carbondata/restructure/vectorreader/DropColumnTestCases.scala
##########
@@ -97,13 +102,127 @@ class DropColumnTestCases extends QueryTest with BeforeAndAfterAll {
     test_drop_and_compaction()
   }
 
-  test("test dropping of complex column should throw exception") {
-    sql("drop table if exists maintbl")
-    sql("create table maintbl (a string, b string, c struct<si:int>) STORED AS carbondata")
-    assert(intercept[ProcessMetaDataException] {
-      sql("alter table maintbl drop columns(b,c)").collect()
-    }.getMessage.contains("Complex column cannot be dropped"))
-    sql("drop table if exists maintbl")
+  def checkSchemaSize(): Unit = {
+    val schema = sql("describe alter_com").collect()
+    assert(schema.size == 1)
+  }
+
+  def droppedColumnsInSchemaEvolutionEntry(tableName: String): Seq[ColumnSchema] = {
+    val carbonTable = CarbonEnv.getCarbonTable(None, tableName)(sqlContext.sparkSession)
+    val schemaEvolutionList = carbonTable.getTableInfo
+      .getFactTable
+      .getSchemaEvolution()
+      .getSchemaEvolutionEntryList()
+    var droppedColumns = Seq[ColumnSchema]()
+    for (i <- 0 until schemaEvolutionList.size()) {
+      droppedColumns ++=
+      JavaConverters
+        .asScalaIteratorConverter(schemaEvolutionList.get(i).getRemoved.iterator())
+        .asScala
+        .toSeq
+    }
+    droppedColumns
+  }
+
+  test("test dropping of array of all primitive types") {

Review comment:
       Please also add testcases for dropping a Map complex column.

##########
File path: integration/spark/src/main/scala/org/apache/spark/sql/execution/command/schema/CarbonAlterTableDropColumnCommand.scala
##########
@@ -147,19 +143,28 @@ private[sql] case class CarbonAlterTableDropColumnCommand(
         metastore.getThriftTableInfo(carbonTable)
       // maintain the deleted columns for schema evolution history
       var deletedColumnSchema = ListBuffer[org.apache.carbondata.format.ColumnSchema]()
+      var deletedParentColumns = ListBuffer[org.apache.carbondata.format.ColumnSchema]()
       val columnSchemaList = tableInfo.fact_table.table_columns.asScala
       alterTableDropColumnModel.columns.foreach { column =>
         columnSchemaList.foreach { columnSchema =>
-          if (!columnSchema.invisible && column.equalsIgnoreCase(columnSchema.column_name)) {
-            deletedColumnSchema += columnSchema.deepCopy
-            columnSchema.invisible = true
+          if (!columnSchema.invisible) {
+            if (column.equalsIgnoreCase(columnSchema.column_name)) {
+              deletedParentColumns += columnSchema.deepCopy
+              deletedColumnSchema += columnSchema.deepCopy
+              columnSchema.invisible = true
+            }
+            // complex children are prefixed with -> parent_name + '.'
+            if (columnSchema.column_name.contains(CarbonCommonConstants.POINT)) {

Review comment:
       Please validate that the child column name starts with `parentname.`. Also add a testcase with more than one complex column in the table, then drop any one complex column.
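
The review point can be seen with a small sketch (column names are hypothetical): `contains(".")` flags every flattened child column in the table, while `startsWith(parent + ".")` matches only children of the parent actually being dropped.

```scala
// With two complex columns str1 and str11 in the table, dropping str1 must
// not touch str11's children. A bare contains(".") check cannot tell them
// apart; a startsWith(parent + ".") check can.
def isChildOf(columnName: String, parent: String): Boolean =
  columnName.toLowerCase.startsWith(parent.toLowerCase + ".")

println(isChildOf("str1.a", "str1"))   // true  -> child of dropped parent
println(isChildOf("str11.a", "str1"))  // false -> belongs to str11, keep it
println("str11.a".contains("."))       // true  -> too coarse a test
```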

##########
File path: integration/spark/src/test/scala/org/apache/spark/carbondata/restructure/vectorreader/DropColumnTestCases.scala
##########
@@ -97,13 +102,127 @@ class DropColumnTestCases extends QueryTest with BeforeAndAfterAll {
     test_drop_and_compaction()
   }
 
-  test("test dropping of complex column should throw exception") {
-    sql("drop table if exists maintbl")
-    sql("create table maintbl (a string, b string, c struct<si:int>) STORED AS carbondata")
-    assert(intercept[ProcessMetaDataException] {
-      sql("alter table maintbl drop columns(b,c)").collect()
-    }.getMessage.contains("Complex column cannot be dropped"))
-    sql("drop table if exists maintbl")
+  def checkSchemaSize(): Unit = {
+    val schema = sql("describe alter_com").collect()
+    assert(schema.size == 1)
+  }
+
+  def droppedColumnsInSchemaEvolutionEntry(tableName: String): Seq[ColumnSchema] = {
+    val carbonTable = CarbonEnv.getCarbonTable(None, tableName)(sqlContext.sparkSession)
+    val schemaEvolutionList = carbonTable.getTableInfo
+      .getFactTable
+      .getSchemaEvolution()
+      .getSchemaEvolutionEntryList()
+    var droppedColumns = Seq[ColumnSchema]()
+    for (i <- 0 until schemaEvolutionList.size()) {
+      droppedColumns ++=
+      JavaConverters
+        .asScalaIteratorConverter(schemaEvolutionList.get(i).getRemoved.iterator())
+        .asScala
+        .toSeq
+    }
+    droppedColumns
+  }
+
+  test("test dropping of array of all primitive types") {
+    import scala.collection.mutable.WrappedArray.make
+    sql("DROP TABLE IF EXISTS alter_com")
+    sql("CREATE TABLE alter_com(intfield int, arr1 array<short>, arr2 array<int>, arr3 " +
+        "array<long>, arr4 array<double>, arr5 array<decimal(8,2)>, arr6 array<string>, arr7 " +
+        "array<char(5)>, arr8 array<varchar(50)>, arr9 array<boolean>, arr10 " +
+        "array<date>, arr11 array<timestamp>) STORED AS carbondata")
+    sql(
+      "insert into alter_com values(1,array(1,5),array(1,2),array(1,2,3),array(1.2d,2.3d),array" +
+      "(4.5,6.7),array('hello','world'),array('a','bcd'),array('abcd','efg'),array(true,false)," +
+      "array('2017-02-01','2018-09-11'),array('2017-02-01 00:01:00','2018-02-01 02:21:00') )")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr1,arr2,arr3,arr4,arr5,arr6) ")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr7,arr8,arr9) ")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr10,arr11) ")
+
+    checkSchemaSize
+    checkAnswer(sql("select * from alter_com"), Seq(Row(1)))
+    val droppedColumns = droppedColumnsInSchemaEvolutionEntry("alter_com")
+    assert(droppedColumns.size == 11)
+    // check adding columns with same names again
+    sql(
+      "ALTER TABLE alter_com ADD COLUMNS(arr1 array<short>, arr2 array<int>, arr3 " +
+      "array<long>, arr4 array<double>, arr5 array<decimal(8,2)>, arr6 array<string>, arr7 " +
+      "array<char(5)>, arr8 array<varchar(50)>, arr9 array<boolean>, arr10 array<date>, arr11 " +
+      "array<timestamp> )")
+    val columns = sql("desc table alter_com").collect()
+    assert(columns.size == 12)
+    sql(
+      "insert into alter_com values(2,array(2,5),array(2,2),array(2,2,3),array(2.2d,2.3d),array" +
+      "(2.5,6.7),array('hello2','world'),array('a2','bcd'),array('abcd2','efg'),array(true,false)" +
+      ",array('2017-02-01','2018-09-11'),array('2017-02-01 00:01:00','2018-02-01 02:21:00') )")
+    checkAnswer(sql(
+      "select * from alter_com"),
+      Seq(Row(1, null, null, null, null, null, null, null, null, null, null, null), Row(2,
+        make(Array(2, 5)),
+        make(Array(2, 2)),
+        make(Array(2, 2, 3)),
+        make(Array(2.2, 2.3)),
+        make(Array(java.math.BigDecimal.valueOf(2.5).setScale(2),
+          java.math.BigDecimal.valueOf(6.7).setScale(2))),
+        make(Array("hello2", "world")),
+        make(Array("a2", "bcd")),
+        make(Array("abcd2", "efg")),
+        make(Array(true, false)),
+        make(Array(Date.valueOf("2017-02-01"),
+          Date.valueOf("2018-09-11"))),
+        make(Array(Timestamp.valueOf("2017-02-01 00:01:00"),
+          Timestamp.valueOf("2018-02-01 02:21:00")))
+      )))
+  }
+
+  test("test dropping of struct of all primitive types") {
+    sql("DROP TABLE IF EXISTS alter_com")
+    sql("CREATE TABLE alter_com(intField INT,struct1 struct<a:short,b:int,c:long,d:double," +
+        "e:decimal(8,2),f:string,g:char(5),h:varchar(50),i:boolean,j:date,k:timestamp>) " +
+        "STORED AS carbondata")
+    sql("insert into alter_com values(1, named_struct('a',1,'b',2,'c',3,'d',1.23,'e',2.34,'f'," +
+        "'hello','g','abc','h','def','i',true,'j','2017-02-01','k','2018-02-01 02:00:00.0') ) ")
+    sql("ALTER TABLE alter_com DROP COLUMNS(struct1) ")
+    var columns = sql("desc table alter_com").collect()

Review comment:
       Please reuse the checkSchemaSize method.







[GitHub] [carbondata] CarbonDataQA2 commented on pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct)

GitBox
In reply to this post by GitBox

CarbonDataQA2 commented on pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#issuecomment-824164030


   Build failed with Spark 2.3.4. Please check CI: http://121.244.95.60:12602/job/ApacheCarbonPRBuilder2.3/5221/
   





[GitHub] [carbondata] CarbonDataQA2 commented on pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct)

GitBox
In reply to this post by GitBox

CarbonDataQA2 commented on pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#issuecomment-824166657


   Build failed with Spark 2.4.5. Please check CI: http://121.244.95.60:12602/job/ApacheCarbon_PR_Builder_2.4.5/3473/
   





[GitHub] [carbondata] akkio-97 commented on a change in pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct)

GitBox
In reply to this post by GitBox

akkio-97 commented on a change in pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#discussion_r618154029



##########
File path: integration/spark/src/test/scala/org/apache/spark/carbondata/restructure/vectorreader/DropColumnTestCases.scala
##########
@@ -97,13 +102,127 @@ class DropColumnTestCases extends QueryTest with BeforeAndAfterAll {
     test_drop_and_compaction()
   }
 
-  test("test dropping of complex column should throw exception") {
-    sql("drop table if exists maintbl")
-    sql("create table maintbl (a string, b string, c struct<si:int>) STORED AS carbondata")
-    assert(intercept[ProcessMetaDataException] {
-      sql("alter table maintbl drop columns(b,c)").collect()
-    }.getMessage.contains("Complex column cannot be dropped"))
-    sql("drop table if exists maintbl")
+  def checkSchemaSize(): Unit = {
+    val schema = sql("describe alter_com").collect()
+    assert(schema.size == 1)
+  }
+
+  def droppedColumnsInSchemaEvolutionEntry(tableName: String): Seq[ColumnSchema] = {
+    val carbonTable = CarbonEnv.getCarbonTable(None, tableName)(sqlContext.sparkSession)
+    val schemaEvolutionList = carbonTable.getTableInfo
+      .getFactTable
+      .getSchemaEvolution()
+      .getSchemaEvolutionEntryList()
+    var droppedColumns = Seq[ColumnSchema]()
+    for (i <- 0 until schemaEvolutionList.size()) {
+      droppedColumns ++=
+      JavaConverters
+        .asScalaIteratorConverter(schemaEvolutionList.get(i).getRemoved.iterator())
+        .asScala
+        .toSeq
+    }
+    droppedColumns
+  }
+
+  test("test dropping of array of all primitive types") {
+    import scala.collection.mutable.WrappedArray.make
+    sql("DROP TABLE IF EXISTS alter_com")
+    sql("CREATE TABLE alter_com(intfield int, arr1 array<short>, arr2 array<int>, arr3 " +
+        "array<long>, arr4 array<double>, arr5 array<decimal(8,2)>, arr6 array<string>, arr7 " +
+        "array<char(5)>, arr8 array<varchar(50)>, arr9 array<boolean>, arr10 " +
+        "array<date>, arr11 array<timestamp>) STORED AS carbondata")
+    sql(
+      "insert into alter_com values(1,array(1,5),array(1,2),array(1,2,3),array(1.2d,2.3d),array" +
+      "(4.5,6.7),array('hello','world'),array('a','bcd'),array('abcd','efg'),array(true,false)," +
+      "array('2017-02-01','2018-09-11'),array('2017-02-01 00:01:00','2018-02-01 02:21:00') )")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr1,arr2,arr3,arr4,arr5,arr6) ")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr7,arr8,arr9) ")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr10,arr11) ")

Review comment:
       done







[GitHub] [carbondata] akkio-97 commented on a change in pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct)

GitBox
In reply to this post by GitBox

akkio-97 commented on a change in pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#discussion_r618154169



##########
File path: integration/spark/src/test/scala/org/apache/spark/carbondata/restructure/vectorreader/DropColumnTestCases.scala
##########
@@ -97,13 +102,127 @@ class DropColumnTestCases extends QueryTest with BeforeAndAfterAll {
     test_drop_and_compaction()
   }
 
-  test("test dropping of complex column should throw exception") {
-    sql("drop table if exists maintbl")
-    sql("create table maintbl (a string, b string, c struct<si:int>) STORED AS carbondata")
-    assert(intercept[ProcessMetaDataException] {
-      sql("alter table maintbl drop columns(b,c)").collect()
-    }.getMessage.contains("Complex column cannot be dropped"))
-    sql("drop table if exists maintbl")
+  def checkSchemaSize(): Unit = {
+    val schema = sql("describe alter_com").collect()
+    assert(schema.size == 1)
+  }
+
+  def droppedColumnsInSchemaEvolutionEntry(tableName: String): Seq[ColumnSchema] = {
+    val carbonTable = CarbonEnv.getCarbonTable(None, tableName)(sqlContext.sparkSession)
+    val schemaEvolutionList = carbonTable.getTableInfo
+      .getFactTable
+      .getSchemaEvolution()
+      .getSchemaEvolutionEntryList()
+    var droppedColumns = Seq[ColumnSchema]()
+    for (i <- 0 until schemaEvolutionList.size()) {
+      droppedColumns ++=
+      JavaConverters
+        .asScalaIteratorConverter(schemaEvolutionList.get(i).getRemoved.iterator())
+        .asScala
+        .toSeq
+    }
+    droppedColumns
+  }
+
+  test("test dropping of array of all primitive types") {

Review comment:
       done

##########
File path: integration/spark/src/main/scala/org/apache/spark/sql/execution/command/schema/CarbonAlterTableDropColumnCommand.scala
##########
@@ -147,19 +143,28 @@ private[sql] case class CarbonAlterTableDropColumnCommand(
         metastore.getThriftTableInfo(carbonTable)
       // maintain the deleted columns for schema evolution history
       var deletedColumnSchema = ListBuffer[org.apache.carbondata.format.ColumnSchema]()
+      var deletedParentColumns = ListBuffer[org.apache.carbondata.format.ColumnSchema]()
       val columnSchemaList = tableInfo.fact_table.table_columns.asScala
       alterTableDropColumnModel.columns.foreach { column =>
         columnSchemaList.foreach { columnSchema =>
-          if (!columnSchema.invisible && column.equalsIgnoreCase(columnSchema.column_name)) {
-            deletedColumnSchema += columnSchema.deepCopy
-            columnSchema.invisible = true
+          if (!columnSchema.invisible) {
+            if (column.equalsIgnoreCase(columnSchema.column_name)) {
+              deletedParentColumns += columnSchema.deepCopy
+              deletedColumnSchema += columnSchema.deepCopy
+              columnSchema.invisible = true
+            }
+            // complex children are prefixed with -> parent_name + '.'
+            if (columnSchema.column_name.contains(CarbonCommonConstants.POINT)) {

Review comment:
       done

##########
File path: integration/spark/src/main/scala/org/apache/spark/sql/execution/command/schema/CarbonAlterTableDropColumnCommand.scala
##########
@@ -147,19 +143,28 @@ private[sql] case class CarbonAlterTableDropColumnCommand(
         metastore.getThriftTableInfo(carbonTable)
       // maintain the deleted columns for schema evolution history
       var deletedColumnSchema = ListBuffer[org.apache.carbondata.format.ColumnSchema]()
+      var deletedParentColumns = ListBuffer[org.apache.carbondata.format.ColumnSchema]()
       val columnSchemaList = tableInfo.fact_table.table_columns.asScala
       alterTableDropColumnModel.columns.foreach { column =>
         columnSchemaList.foreach { columnSchema =>
-          if (!columnSchema.invisible && column.equalsIgnoreCase(columnSchema.column_name)) {
-            deletedColumnSchema += columnSchema.deepCopy
-            columnSchema.invisible = true
+          if (!columnSchema.invisible) {
+            if (column.equalsIgnoreCase(columnSchema.column_name)) {
+              deletedParentColumns += columnSchema.deepCopy

Review comment:
       done

##########
File path: integration/spark/src/test/scala/org/apache/spark/carbondata/restructure/vectorreader/DropColumnTestCases.scala
##########
@@ -97,13 +102,127 @@ class DropColumnTestCases extends QueryTest with BeforeAndAfterAll {
     test_drop_and_compaction()
   }
 
-  test("test dropping of complex column should throw exception") {
-    sql("drop table if exists maintbl")
-    sql("create table maintbl (a string, b string, c struct<si:int>) STORED AS carbondata")
-    assert(intercept[ProcessMetaDataException] {
-      sql("alter table maintbl drop columns(b,c)").collect()
-    }.getMessage.contains("Complex column cannot be dropped"))
-    sql("drop table if exists maintbl")
+  def checkSchemaSize(): Unit = {
+    val schema = sql("describe alter_com").collect()
+    assert(schema.size == 1)
+  }
+
+  def droppedColumnsInSchemaEvolutionEntry(tableName: String): Seq[ColumnSchema] = {
+    val carbonTable = CarbonEnv.getCarbonTable(None, tableName)(sqlContext.sparkSession)
+    val schemaEvolutionList = carbonTable.getTableInfo
+      .getFactTable
+      .getSchemaEvolution()
+      .getSchemaEvolutionEntryList()
+    var droppedColumns = Seq[ColumnSchema]()
+    for (i <- 0 until schemaEvolutionList.size()) {
+      droppedColumns ++=
+      JavaConverters
+        .asScalaIteratorConverter(schemaEvolutionList.get(i).getRemoved.iterator())
+        .asScala
+        .toSeq
+    }
+    droppedColumns
+  }
+
+  test("test dropping of array of all primitive types") {
+    import scala.collection.mutable.WrappedArray.make
+    sql("DROP TABLE IF EXISTS alter_com")
+    sql("CREATE TABLE alter_com(intfield int, arr1 array<short>, arr2 array<int>, arr3 " +
+        "array<long>, arr4 array<double>, arr5 array<decimal(8,2)>, arr6 array<string>, arr7 " +
+        "array<char(5)>, arr8 array<varchar(50)>, arr9 array<boolean>, arr10 " +
+        "array<date>, arr11 array<timestamp>) STORED AS carbondata")
+    sql(
+      "insert into alter_com values(1,array(1,5),array(1,2),array(1,2,3),array(1.2d,2.3d),array" +
+      "(4.5,6.7),array('hello','world'),array('a','bcd'),array('abcd','efg'),array(true,false)," +
+      "array('2017-02-01','2018-09-11'),array('2017-02-01 00:01:00','2018-02-01 02:21:00') )")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr1,arr2,arr3,arr4,arr5,arr6) ")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr7,arr8,arr9) ")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr10,arr11) ")
+
+    checkSchemaSize
+    checkAnswer(sql("select * from alter_com"), Seq(Row(1)))
+    val droppedColumns = droppedColumnsInSchemaEvolutionEntry("alter_com")
+    assert(droppedColumns.size == 11)
+    // check adding columns with same names again
+    sql(
+      "ALTER TABLE alter_com ADD COLUMNS(arr1 array<short>, arr2 array<int>, arr3 " +
+      "array<long>, arr4 array<double>, arr5 array<decimal(8,2)>, arr6 array<string>, arr7 " +
+      "array<char(5)>, arr8 array<varchar(50)>, arr9 array<boolean>, arr10 array<date>, arr11 " +
+      "array<timestamp> )")
+    val columns = sql("desc table alter_com").collect()
+    assert(columns.size == 12)
+    sql(
+      "insert into alter_com values(2,array(2,5),array(2,2),array(2,2,3),array(2.2d,2.3d),array" +
+      "(2.5,6.7),array('hello2','world'),array('a2','bcd'),array('abcd2','efg'),array(true,false)" +
+      ",array('2017-02-01','2018-09-11'),array('2017-02-01 00:01:00','2018-02-01 02:21:00') )")
+    checkAnswer(sql(
+      "select * from alter_com"),
+      Seq(Row(1, null, null, null, null, null, null, null, null, null, null, null), Row(2,
+        make(Array(2, 5)),
+        make(Array(2, 2)),
+        make(Array(2, 2, 3)),
+        make(Array(2.2, 2.3)),
+        make(Array(java.math.BigDecimal.valueOf(2.5).setScale(2),
+          java.math.BigDecimal.valueOf(6.7).setScale(2))),
+        make(Array("hello2", "world")),
+        make(Array("a2", "bcd")),
+        make(Array("abcd2", "efg")),
+        make(Array(true, false)),
+        make(Array(Date.valueOf("2017-02-01"),
+          Date.valueOf("2018-09-11"))),
+        make(Array(Timestamp.valueOf("2017-02-01 00:01:00"),
+          Timestamp.valueOf("2018-02-01 02:21:00")))
+      )))
+  }
+
+  test("test dropping of struct of all primitive types") {
+    sql("DROP TABLE IF EXISTS alter_com")
+    sql("CREATE TABLE alter_com(intField INT,struct1 struct<a:short,b:int,c:long,d:double," +
+        "e:decimal(8,2),f:string,g:char(5),h:varchar(50),i:boolean,j:date,k:timestamp>) " +
+        "STORED AS carbondata")
+    sql("insert into alter_com values(1, named_struct('a',1,'b',2,'c',3,'d',1.23,'e',2.34,'f'," +
+        "'hello','g','abc','h','def','i',true,'j','2017-02-01','k','2018-02-01 02:00:00.0') ) ")
+    sql("ALTER TABLE alter_com DROP COLUMNS(struct1) ")
+    var columns = sql("desc table alter_com").collect()

Review comment:
       done







[GitHub] [carbondata] akkio-97 commented on a change in pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct/map)

GitBox
In reply to this post by GitBox

akkio-97 commented on a change in pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#discussion_r618154199



##########
File path: integration/spark/src/main/scala/org/apache/spark/sql/execution/command/schema/CarbonAlterTableDropColumnCommand.scala
##########
@@ -147,19 +143,28 @@ private[sql] case class CarbonAlterTableDropColumnCommand(
         metastore.getThriftTableInfo(carbonTable)
       // maintain the deleted columns for schema evolution history
       var deletedColumnSchema = ListBuffer[org.apache.carbondata.format.ColumnSchema]()
+      var deletedParentColumns = ListBuffer[org.apache.carbondata.format.ColumnSchema]()
       val columnSchemaList = tableInfo.fact_table.table_columns.asScala
       alterTableDropColumnModel.columns.foreach { column =>
         columnSchemaList.foreach { columnSchema =>
-          if (!columnSchema.invisible && column.equalsIgnoreCase(columnSchema.column_name)) {
-            deletedColumnSchema += columnSchema.deepCopy
-            columnSchema.invisible = true
+          if (!columnSchema.invisible) {
+            if (column.equalsIgnoreCase(columnSchema.column_name)) {
+              deletedParentColumns += columnSchema.deepCopy
+              deletedColumnSchema += columnSchema.deepCopy
+              columnSchema.invisible = true
+            }
+            // complex children are prefixed with -> parent_name + '.'
+            if (columnSchema.column_name.contains(CarbonCommonConstants.POINT)) {

Review comment:
       done







[GitHub] [carbondata] CarbonDataQA2 commented on pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct/map)

GitBox
In reply to this post by GitBox

CarbonDataQA2 commented on pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#issuecomment-824676733


   Build succeeded with Spark 2.4.5. Please check CI: http://121.244.95.60:12602/job/ApacheCarbon_PR_Builder_2.4.5/3475/
   





[GitHub] [carbondata] CarbonDataQA2 commented on pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct/map)

GitBox
In reply to this post by GitBox

CarbonDataQA2 commented on pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#issuecomment-824677561


   Build succeeded with Spark 2.3.4. Please check CI: http://121.244.95.60:12602/job/ApacheCarbonPRBuilder2.3/5223/
   





[GitHub] [carbondata] Indhumathi27 commented on a change in pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct/map)

GitBox
In reply to this post by GitBox

Indhumathi27 commented on a change in pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#discussion_r618252820



##########
File path: integration/spark/src/test/scala/org/apache/spark/carbondata/restructure/vectorreader/DropColumnTestCases.scala
##########
@@ -97,13 +102,158 @@ class DropColumnTestCases extends QueryTest with BeforeAndAfterAll {
     test_drop_and_compaction()
   }
 
-  test("test dropping of complex column should throw exception") {
-    sql("drop table if exists maintbl")
-    sql("create table maintbl (a string, b string, c struct<si:int>) STORED AS carbondata")
-    assert(intercept[ProcessMetaDataException] {
-      sql("alter table maintbl drop columns(b,c)").collect()
-    }.getMessage.contains("Complex column cannot be dropped"))
-    sql("drop table if exists maintbl")
+  def checkSchemaSize(value: Integer): Unit = {
+    val schema = sql("describe alter_com").collect()
+    assert(schema.size.equals(value))
+  }
+
+  def checkDroppedColumnsInSchemaEvolutionEntry(tableName: String, value: Integer): Unit = {
+    val carbonTable = CarbonEnv.getCarbonTable(None, tableName)(sqlContext.sparkSession)
+    val schemaEvolutionList = carbonTable.getTableInfo
+      .getFactTable
+      .getSchemaEvolution()
+      .getSchemaEvolutionEntryList()
+    var droppedColumns = Seq[ColumnSchema]()
+    for (i <- 0 until schemaEvolutionList.size()) {
+      droppedColumns ++=
+      JavaConverters
+        .asScalaIteratorConverter(schemaEvolutionList.get(i).getRemoved.iterator())
+        .asScala
+        .toSeq
+    }
+    assert(droppedColumns.size.equals(value))
+  }
+
+  test("test dropping of array of all primitive types") {
+    import scala.collection.mutable.WrappedArray.make
+    sql("DROP TABLE IF EXISTS alter_com")
+    sql("CREATE TABLE alter_com(intfield int, arr array<int>, arr1 array<short>, " +
+        "arr2 array<int>, arr3 array<long>, arr4 array<double>, arr5 array<decimal(8,2)>, " +
+        "arr6 array<string>, arr7 array<char(5)>, arr8 array<varchar(50)>, arr9 array<boolean>, " +
+        "arr10 array<date>, arr11 array<timestamp>) STORED AS carbondata")
+    sql("insert into alter_com values(1,array(1,5),array(1,5),array(1,2),array(1,2,3)," +
+        "array(1.2d,2.3d),array(4.5,6.7),array('hello','world'),array('a','bcd')," +
+        "array('abcd','efg'),array(true,false),array('2017-02-01','2018-09-11')," +
+        "array('2017-02-01 00:01:00','2018-02-01 02:21:00') )")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr1,arr2,arr3,arr4,arr5,arr6) ")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr7,arr8,arr9) ")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr10,arr11) ")
+    val exception = intercept[Exception] {
+      sql("ALTER TABLE alter_com DROP COLUMNS(arr10,arr10) ")
+    }
+    val exceptionMessage =
+      "arr10 is duplicate. Duplicate columns not allowed"
+    assert(exception.getMessage.contains(exceptionMessage))
+
+    checkSchemaSize(2)
+    checkAnswer(sql("select * from alter_com"), Seq(Row(1, make(Array(1, 5)))))
+    checkDroppedColumnsInSchemaEvolutionEntry("alter_com", 11)
+    // check adding columns with same names again
+    sql(
+      "ALTER TABLE alter_com ADD COLUMNS(arr1 array<short>, arr2 array<int>, arr3 " +
+      "array<long>, arr4 array<double>, arr5 array<decimal(8,2)>, arr6 array<string>, arr7 " +
+      "array<char(5)>, arr8 array<varchar(50)>, arr9 array<boolean>, arr10 array<date>, arr11 " +
+      "array<timestamp> )")
+    val columns = sql("desc table alter_com").collect()
+    assert(columns.size == 13)
+    sql(
+      "insert into alter_com values(2,array(2,5),array(2,5),array(2,2),array(2,2,3),array(2.2d," +
+      "2.3d),array(2.5,6.7),array('hello2','world'),array('a2','bcd'),array('abcd2','efg'),array" +
+      "(true,false), array('2017-02-01','2018-09-11'),array('2017-02-01 00:01:00','2018-02-01 " +
+      "02:21:00') )")
+    checkAnswer(sql(
+      "select * from alter_com"),
+      Seq(Row(1, make(Array(1, 5)), null, null, null, null, null, null, null, null, null, null,
+        null), Row(2,
+        make(Array(2, 5)),
+        make(Array(2, 5)),
+        make(Array(2, 2)),
+        make(Array(2, 2, 3)),
+        make(Array(2.2, 2.3)),
+        make(Array(java.math.BigDecimal.valueOf(2.5).setScale(2),
+          java.math.BigDecimal.valueOf(6.7).setScale(2))),
+        make(Array("hello2", "world")),
+        make(Array("a2", "bcd")),
+        make(Array("abcd2", "efg")),
+        make(Array(true, false)),
+        make(Array(Date.valueOf("2017-02-01"),
+          Date.valueOf("2018-09-11"))),
+        make(Array(Timestamp.valueOf("2017-02-01 00:01:00"),
+          Timestamp.valueOf("2018-02-01 02:21:00")))
+      )))
+  }
+
+  test("test dropping of struct of all primitive types") {

Review comment:
       You could add a drop-complex-column scenario to the existing testcase for complex array with SI.
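
A minimal sketch of such a combined scenario, following the SQL patterns already used in this diff. The table name `alter_si_com` and index name `si_on_name` are hypothetical, not part of the PR:

```scala
  // Sketch only: drop a complex column on a table that also has a secondary
  // index on a primitive column; schema and names are illustrative.
  test("test dropping complex column on table with SI on primitive column") {
    sql("DROP TABLE IF EXISTS alter_si_com")
    sql("CREATE TABLE alter_si_com(id int, name string, arr array<int>) STORED AS carbondata")
    sql("insert into alter_si_com values(1, 'a', array(1, 2))")
    sql("create index si_on_name on table alter_si_com(name) as 'carbondata'")
    // dropping a complex column that is not part of the index should succeed
    sql("ALTER TABLE alter_si_com DROP COLUMNS(arr)")
    checkAnswer(sql("select id, name from alter_si_com"), Seq(Row(1, "a")))
    sql("DROP TABLE IF EXISTS alter_si_com")
  }
```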







[GitHub] [carbondata] akkio-97 commented on a change in pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct/map)

GitBox
In reply to this post by GitBox

akkio-97 commented on a change in pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#discussion_r618357602



##########
File path: integration/spark/src/test/scala/org/apache/spark/carbondata/restructure/vectorreader/DropColumnTestCases.scala
##########
@@ -97,13 +102,158 @@ class DropColumnTestCases extends QueryTest with BeforeAndAfterAll {
     test_drop_and_compaction()
   }
 
-  test("test dropping of complex column should throw exception") {
-    sql("drop table if exists maintbl")
-    sql("create table maintbl (a string, b string, c struct<si:int>) STORED AS carbondata")
-    assert(intercept[ProcessMetaDataException] {
-      sql("alter table maintbl drop columns(b,c)").collect()
-    }.getMessage.contains("Complex column cannot be dropped"))
-    sql("drop table if exists maintbl")
+  def checkSchemaSize(value: Integer): Unit = {
+    val schema = sql("describe alter_com").collect()
+    assert(schema.size.equals(value))
+  }
+
+  def checkDroppedColumnsInSchemaEvolutionEntry(tableName: String, value: Integer): Unit = {
+    val carbonTable = CarbonEnv.getCarbonTable(None, tableName)(sqlContext.sparkSession)
+    val schemaEvolutionList = carbonTable.getTableInfo
+      .getFactTable
+      .getSchemaEvolution()
+      .getSchemaEvolutionEntryList()
+    var droppedColumns = Seq[ColumnSchema]()
+    for (i <- 0 until schemaEvolutionList.size()) {
+      droppedColumns ++=
+      JavaConverters
+        .asScalaIteratorConverter(schemaEvolutionList.get(i).getRemoved.iterator())
+        .asScala
+        .toSeq
+    }
+    assert(droppedColumns.size.equals(value))
+  }
+
+  test("test dropping of array of all primitive types") {
+    import scala.collection.mutable.WrappedArray.make
+    sql("DROP TABLE IF EXISTS alter_com")
+    sql("CREATE TABLE alter_com(intfield int, arr array<int>, arr1 array<short>, " +
+        "arr2 array<int>, arr3 array<long>, arr4 array<double>, arr5 array<decimal(8,2)>, " +
+        "arr6 array<string>, arr7 array<char(5)>, arr8 array<varchar(50)>, arr9 array<boolean>, " +
+        "arr10 array<date>, arr11 array<timestamp>) STORED AS carbondata")
+    sql("insert into alter_com values(1,array(1,5),array(1,5),array(1,2),array(1,2,3)," +
+        "array(1.2d,2.3d),array(4.5,6.7),array('hello','world'),array('a','bcd')," +
+        "array('abcd','efg'),array(true,false),array('2017-02-01','2018-09-11')," +
+        "array('2017-02-01 00:01:00','2018-02-01 02:21:00') )")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr1,arr2,arr3,arr4,arr5,arr6) ")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr7,arr8,arr9) ")
+    sql("ALTER TABLE alter_com DROP COLUMNS(arr10,arr11) ")
+    val exception = intercept[Exception] {
+      sql("ALTER TABLE alter_com DROP COLUMNS(arr10,arr10) ")
+    }
+    val exceptionMessage =
+      "arr10 is duplicate. Duplicate columns not allowed"
+    assert(exception.getMessage.contains(exceptionMessage))
+
+    checkSchemaSize(2)
+    checkAnswer(sql("select * from alter_com"), Seq(Row(1, make(Array(1, 5)))))
+    checkDroppedColumnsInSchemaEvolutionEntry("alter_com", 11)
+    // check adding columns with same names again
+    sql(
+      "ALTER TABLE alter_com ADD COLUMNS(arr1 array<short>, arr2 array<int>, arr3 " +
+      "array<long>, arr4 array<double>, arr5 array<decimal(8,2)>, arr6 array<string>, arr7 " +
+      "array<char(5)>, arr8 array<varchar(50)>, arr9 array<boolean>, arr10 array<date>, arr11 " +
+      "array<timestamp> )")
+    val columns = sql("desc table alter_com").collect()
+    assert(columns.size == 13)
+    sql(
+      "insert into alter_com values(2,array(2,5),array(2,5),array(2,2),array(2,2,3),array(2.2d," +
+      "2.3d),array(2.5,6.7),array('hello2','world'),array('a2','bcd'),array('abcd2','efg'),array" +
+      "(true,false), array('2017-02-01','2018-09-11'),array('2017-02-01 00:01:00','2018-02-01 " +
+      "02:21:00') )")
+    checkAnswer(sql(
+      "select * from alter_com"),
+      Seq(Row(1, make(Array(1, 5)), null, null, null, null, null, null, null, null, null, null,
+        null), Row(2,
+        make(Array(2, 5)),
+        make(Array(2, 5)),
+        make(Array(2, 2)),
+        make(Array(2, 2, 3)),
+        make(Array(2.2, 2.3)),
+        make(Array(java.math.BigDecimal.valueOf(2.5).setScale(2),
+          java.math.BigDecimal.valueOf(6.7).setScale(2))),
+        make(Array("hello2", "world")),
+        make(Array("a2", "bcd")),
+        make(Array("abcd2", "efg")),
+        make(Array(true, false)),
+        make(Array(Date.valueOf("2017-02-01"),
+          Date.valueOf("2018-09-11"))),
+        make(Array(Timestamp.valueOf("2017-02-01 00:01:00"),
+          Timestamp.valueOf("2018-02-01 02:21:00")))
+      )))
+  }
+
+  test("test dropping of struct of all primitive types") {

Review comment:
       done







[GitHub] [carbondata] CarbonDataQA2 commented on pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct/map)

GitBox
In reply to this post by GitBox

CarbonDataQA2 commented on pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#issuecomment-824874939


   Build Failed  with Spark 2.3.4, Please check CI http://121.244.95.60:12602/job/ApacheCarbonPRBuilder2.3/5233/
   





[GitHub] [carbondata] CarbonDataQA2 commented on pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct/map)

GitBox
In reply to this post by GitBox

CarbonDataQA2 commented on pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#issuecomment-824885591


   Build Failed  with Spark 2.4.5, Please check CI http://121.244.95.60:12602/job/ApacheCarbon_PR_Builder_2.4.5/3485/
   





[GitHub] [carbondata] CarbonDataQA2 commented on pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct/map)

GitBox
In reply to this post by GitBox

CarbonDataQA2 commented on pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#issuecomment-825083829


   Build Failed  with Spark 2.3.4, Please check CI http://121.244.95.60:12602/job/ApacheCarbonPRBuilder2.3/5236/
   





[GitHub] [carbondata] CarbonDataQA2 commented on pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct/map)

GitBox
In reply to this post by GitBox

CarbonDataQA2 commented on pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#issuecomment-825085182


   Build Failed  with Spark 2.4.5, Please check CI http://121.244.95.60:12602/job/ApacheCarbon_PR_Builder_2.4.5/3488/
   





[GitHub] [carbondata] CarbonDataQA2 commented on pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct/map)

GitBox
In reply to this post by GitBox

CarbonDataQA2 commented on pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#issuecomment-825163338


   Build Failed  with Spark 2.3.4, Please check CI http://121.244.95.60:12602/job/ApacheCarbonPRBuilder2.3/5237/
   





[GitHub] [carbondata] CarbonDataQA2 commented on pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct/map)

GitBox
In reply to this post by GitBox

CarbonDataQA2 commented on pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#issuecomment-825167862


   Build Failed  with Spark 2.4.5, Please check CI http://121.244.95.60:12602/job/ApacheCarbon_PR_Builder_2.4.5/3489/
   





[GitHub] [carbondata] akkio-97 commented on pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct/map)

GitBox
In reply to this post by GitBox

akkio-97 commented on pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#issuecomment-825382199


   retest this please





[GitHub] [carbondata] Indhumathi27 commented on a change in pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct/map)

GitBox
In reply to this post by GitBox

Indhumathi27 commented on a change in pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#discussion_r618936572



##########
File path: index/secondary-index/src/test/scala/org/apache/carbondata/spark/testsuite/secondaryindex/TestSIWithSecondaryIndex.scala
##########
@@ -62,17 +62,6 @@ class TestSIWithSecondaryIndex extends QueryTest with BeforeAndAfterAll {
     }
   }
 
-  test ("test alter drop all columns of the SI table") {
-    sql("create table table_drop_columns (" +
-        "name string, id string, country string) stored as carbondata")
-    sql("insert into table_drop_columns select 'xx', '1', 'china'")
-    sql("create index tdc_index_1 on table table_drop_columns(id, country) as 'carbondata'")

Review comment:
       Why is this testcase removed? You can modify the existing one by also adding a complex column and verifying, instead of adding a new testcase.
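
For reference, a hedged sketch of how the removed test could be extended rather than deleted. It reuses the original table and index names from the deleted lines; the added complex column and the expected error message (taken from the new listener code in this PR) are assumptions:

```scala
  // Sketch only: extends the original testcase with a complex column to
  // exercise the new error path for SI-indexed columns.
  test("test alter drop all columns of the SI table") {
    sql("create table table_drop_columns (" +
        "name string, id string, country string, arr array<int>) stored as carbondata")
    sql("insert into table_drop_columns select 'xx', '1', 'china', array(1)")
    sql("create index tdc_index_1 on table table_drop_columns(id, country) as 'carbondata'")
    // dropping an indexed column should now fail, asking to drop the index first
    val ex = intercept[Exception] {
      sql("alter table table_drop_columns drop columns(id, country)")
    }
    assert(ex.getMessage.contains("please drop index table & retry"))
    // dropping the complex column, which is not indexed, should succeed
    sql("alter table table_drop_columns drop columns(arr)")
  }
```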







[GitHub] [carbondata] Indhumathi27 commented on a change in pull request #4121: [CARBONDATA-4170] Support dropping of parent complex columns(array/struct/map)

GitBox
In reply to this post by GitBox

Indhumathi27 commented on a change in pull request #4121:
URL: https://github.com/apache/carbondata/pull/4121#discussion_r618941521



##########
File path: integration/spark/src/main/scala/org/apache/spark/sql/secondaryindex/events/AlterTableDropColumnEventListener.scala
##########
@@ -74,15 +74,10 @@ class AlterTableDropColumnEventListener extends OperationEventListener {
       val indexColumns = indexTable._2.asScala(CarbonCommonConstants.INDEX_COLUMNS).split(",")
       val colSize = alterTableDropColumnModel.columns.intersect(indexColumns).size
       if (colSize > 0) {
-        if (colSize == indexColumns.size) {
-          indexTableToDrop ++= Seq(indexTable._1)
-        } else {
-          sys
-            .error(s"Index Table [${
-              indexTable._1
-            }] exists with combination of provided drop column(s) and other columns, drop " +
-                   s"index table & retry")
-        }
+        sys
+          .error(s"Index Table [${
+            indexTable._1
+          }] contains provided drop column(s), please drop index table & retry")
       }
     })
     indexTableToDrop.foreach { indexTable =>

Review comment:
       This variable is no longer used. Please remove the unused code changes and update the method name accordingly.
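
A minimal sketch of the cleanup being requested, based on the listener code visible in this diff. The collection name `indexTables` is an assumption about the surrounding code, not the final API:

```scala
    // Sketch of the requested cleanup: the listener now only validates that
    // no dropped column belongs to a secondary index, so the unused
    // indexTableToDrop accumulator and the drop loop can be removed.
    indexTables.foreach { indexTable =>
      val indexColumns = indexTable._2.asScala(CarbonCommonConstants.INDEX_COLUMNS).split(",")
      if (alterTableDropColumnModel.columns.intersect(indexColumns).nonEmpty) {
        sys.error(s"Index Table [${ indexTable._1 }] contains provided drop column(s), " +
                  "please drop index table & retry")
      }
    }
```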





