[GitHub] carbondata pull request #2175: [CARBONDATA-2352] Added SDV test cases for Pa...


[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4631/



---

[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user kunal642 commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    retest this please


---

[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4412/



---

[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5573/



---

[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user kunal642 commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    retest sdv please


---

[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user kunal642 commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    @praveenmeenakshi56 Please write proper description


---

[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4782/



---

[GitHub] carbondata pull request #2175: [CARBONDATA-2352] Added SDV test cases for Pa...

Github user kunal642 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2175#discussion_r186616811
 
    --- Diff: integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/PartitionWithPreAggregateTestCase.scala ---
    @@ -0,0 +1,129 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.cluster.sdv.generated
    +import org.apache.carbondata.core.constants.CarbonCommonConstants
    +import org.apache.carbondata.core.util.CarbonProperties
    +import org.apache.spark.sql.Row
    +import org.apache.spark.sql.common.util._
    +import org.scalatest.BeforeAndAfterAll
    +/**
    +  * Test Class for Support of Partition with PreAggregate table
    +  */
    +class PartitionWithPreAggregateTestCase extends QueryTest with BeforeAndAfterAll {
    +  override def beforeAll = {
    +    CarbonProperties.getInstance()
    +      .addProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT, "yyyy/MM/dd HH:mm:ss")
    +      .addProperty(CarbonCommonConstants.CARBON_DATE_FORMAT, "yyyy/MM/dd")
    +  }
    +  //Loading data into partitioned table with SORT_SCOPE=LOCAL_SORT
    +  test("Partition-With-PreAggregate_TC001", Include) {
    +    sql(s"""drop table if exists partition_table""")
    +    sql(s"""CREATE TABLE partition_table(shortField SHORT, intField INT, bigintField LONG, doubleField DOUBLE, timestamp TIMESTAMP, decimalField DECIMAL(18,2),dateField DATE, charField CHAR(5), floatField FLOAT ) PARTITIONED BY (stringField STRING) STORED BY 'carbondata' TBLPROPERTIES('SORT_SCOPE'='LOCAL_SORT')""")
    +    sql(s"""load data inpath '$resourcesPath/Data/partition/list_partition_table.csv' into table partition_table""")
    +    sql("create datamap ag1 on table partition_table using 'preaggregate' as select shortField, sum(intField) from partition_table group by shortField")
    +    checkAnswer(sql(s"""select count(*) from partition_table"""), Seq(Row(10)))
    --- End diff --
   
    Please compare actual results instead of count
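
A minimal sketch of the change the reviewer is asking for, assuming the same
QueryTest helpers (sql, checkAnswer, Row) shown in the diff above; the expected
Row values are placeholders only, since the contents of
list_partition_table.csv are not shown in this thread:

    // Compare the grouped rows themselves rather than a bare count(*).
    // The expected values below are hypothetical and must come from the CSV.
    checkAnswer(
      sql("select shortField, sum(intField) from partition_table group by shortField"),
      Seq(Row(1, 100), Row(2, 200)))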


---

[GitHub] carbondata pull request #2175: [CARBONDATA-2352] Added SDV test cases for Pa...

Github user kunal642 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2175#discussion_r186616874
 
    --- Diff: integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/PartitionWithPreAggregateTestCase.scala ---
    @@ -0,0 +1,129 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.cluster.sdv.generated
    +import org.apache.carbondata.core.constants.CarbonCommonConstants
    +import org.apache.carbondata.core.util.CarbonProperties
    +import org.apache.spark.sql.Row
    +import org.apache.spark.sql.common.util._
    +import org.scalatest.BeforeAndAfterAll
    +/**
    +  * Test Class for Support of Partition with PreAggregate table
    +  */
    +class PartitionWithPreAggregateTestCase extends QueryTest with BeforeAndAfterAll {
    +  override def beforeAll = {
    +    CarbonProperties.getInstance()
    +      .addProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT, "yyyy/MM/dd HH:mm:ss")
    +      .addProperty(CarbonCommonConstants.CARBON_DATE_FORMAT, "yyyy/MM/dd")
    +  }
    +  //Loading data into partitioned table with SORT_SCOPE=LOCAL_SORT
    +  test("Partition-With-PreAggregate_TC001", Include) {
    +    sql(s"""drop table if exists partition_table""")
    +    sql(s"""CREATE TABLE partition_table(shortField SHORT, intField INT, bigintField LONG, doubleField DOUBLE, timestamp TIMESTAMP, decimalField DECIMAL(18,2),dateField DATE, charField CHAR(5), floatField FLOAT ) PARTITIONED BY (stringField STRING) STORED BY 'carbondata' TBLPROPERTIES('SORT_SCOPE'='LOCAL_SORT')""")
    +    sql(s"""load data inpath '$resourcesPath/Data/partition/list_partition_table.csv' into table partition_table""")
    +    sql("create datamap ag1 on table partition_table using 'preaggregate' as select shortField, sum(intField) from partition_table group by shortField")
    +    checkAnswer(sql(s"""select count(*) from partition_table"""), Seq(Row(10)))
    +  }
    +  //Loading data into partitioned table with SORT_SCOPE=GLOBAL_SORT
    +  test("Partition-With-PreAggregate_TC002", Include) {
    +    sql(s"""drop table if exists partition_table""")
    +    sql(s"""CREATE TABLE partition_table(shortField SHORT, intField INT, bigintField LONG, doubleField DOUBLE, timestamp TIMESTAMP, decimalField DECIMAL(18,2),dateField DATE, charField CHAR(5), floatField FLOAT ) PARTITIONED BY (stringField STRING) STORED BY 'carbondata' TBLPROPERTIES('SORT_SCOPE'='GLOBAL_SORT')""")
    +    sql(s"""load data inpath '$resourcesPath/Data/partition/list_partition_table.csv' into table partition_table""")
    +    sql("create datamap ag1 on table partition_table using 'preaggregate' as select shortField, sum(intField) from partition_table group by shortField")
    +    checkAnswer(sql(s"""select count(*) from partition_table"""), Seq(Row(10)))
    +  }
    +  //Loading data into partitioned table with SORT_SCOPE=BATCH_SORT
    +  test("Partition-With-PreAggregate_TC003", Include) {
    +    sql(s"""drop table if exists partition_table""")
    +    sql(s"""CREATE TABLE partition_table(shortField SHORT, intField INT, bigintField LONG, doubleField DOUBLE, timestamp TIMESTAMP, decimalField DECIMAL(18,2),dateField DATE, charField CHAR(5), floatField FLOAT ) PARTITIONED BY (stringField STRING) STORED BY 'carbondata' TBLPROPERTIES('SORT_SCOPE'='BATCH_SORT')""")
    +    sql(s"""load data inpath '$resourcesPath/Data/partition/list_partition_table.csv' into table partition_table""")
    +    sql("create datamap ag1 on table partition_table using 'preaggregate' as select shortField, sum(intField) from partition_table group by shortField")
    +    checkAnswer(sql(s"""select count(*) from partition_table"""), Seq(Row(10)))
    +  }
    +  //Loading data into partitioned table with SORT_SCOPE=NO_SORT
    +  test("Partition-With-PreAggregate_TC004", Include) {
    +    sql(s"""drop table if exists partition_table""")
    +    sql(s"""CREATE TABLE partition_table(shortField SHORT, intField INT, bigintField LONG, doubleField DOUBLE, timestamp TIMESTAMP, decimalField DECIMAL(18,2),dateField DATE, charField CHAR(5), floatField FLOAT ) PARTITIONED BY (stringField STRING) STORED BY 'carbondata' TBLPROPERTIES('SORT_SCOPE'='NO_SORT')""")
    +    sql(s"""load data inpath '$resourcesPath/Data/partition/list_partition_table.csv' into table partition_table""")
    +    sql("create datamap ag1 on table partition_table using 'preaggregate' as select shortField, sum(intField) from partition_table group by shortField")
    +    checkAnswer(sql(s"""select count(*) from partition_table"""), Seq(Row(10)))
    +  }
    +  //Verify Aggregation query on Loaded Partition table with Pre-Aggregate table
    +  test("Partition-With-PreAggregate_TC005", Include) {
    +    sql(s"""drop table if exists partition_table""")
    +    sql(s"""CREATE TABLE partition_table(shortField SHORT, intField INT, bigintField LONG, doubleField DOUBLE, timestamp TIMESTAMP, decimalField DECIMAL(18,2),dateField DATE, charField CHAR(5), floatField FLOAT ) PARTITIONED BY (stringField STRING) STORED BY 'carbondata'""")
    +    sql("create datamap ag1 on table partition_table using 'preaggregate' as select shortField, sum(intField) from partition_table group by shortField")
    +    sql(s"""load data inpath '$resourcesPath/Data/partition/list_partition_table.csv' into table partition_table""")
    +    assert(sql("explain select shortField, sum(intField) from partition_table group by shortField").collect().head.get(0).toString.contains("partition_table_ag1"))
    +  }
    +  //Verify load with Pre_Aggregate table on Partition table after compaction
    +  test("Partition-With-PreAggregate_TC006", Include) {
    +    sql(s"""drop table if exists uniqdata""")
    +    sql(s"""CREATE TABLE uniqdata (CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int, DOJ timestamp) PARTITIONED BY (CUST_ID int) STORED BY 'carbondata'""")
    +    sql(s"""LOAD DATA INPATH  '$resourcesPath/Data/partition/2000_UniqData_partition.csv' into table uniqdata partition(CUST_ID='1') OPTIONS('DELIMITER'=',','BAD_RECORDS_ACTION'='FORCE','QUOTECHAR'='"','FILEHEADER'='CUST_NAME,ACTIVE_EMUI_VERSION,DOB,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1,DOJ,CUST_ID')""")
    +    sql(s"""LOAD DATA INPATH  '$resourcesPath/Data/partition/2000_UniqData_partition.csv' into table uniqdata partition(CUST_ID='1') OPTIONS('DELIMITER'=',','BAD_RECORDS_ACTION'='FORCE','QUOTECHAR'='"','FILEHEADER'='CUST_NAME,ACTIVE_EMUI_VERSION,DOB,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1,DOJ,CUST_ID')""")
    +    sql(s"""LOAD DATA INPATH  '$resourcesPath/Data/partition/2000_UniqData_partition.csv' into table uniqdata partition(CUST_ID='1') OPTIONS('DELIMITER'=',','BAD_RECORDS_ACTION'='FORCE','QUOTECHAR'='"','FILEHEADER'='CUST_NAME,ACTIVE_EMUI_VERSION,DOB,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1,DOJ,CUST_ID')""")
    +    sql(s"""LOAD DATA INPATH  '$resourcesPath/Data/partition/2000_UniqData_partition.csv' into table uniqdata partition(CUST_ID='1') OPTIONS('DELIMITER'=',','BAD_RECORDS_ACTION'='FORCE','QUOTECHAR'='"','FILEHEADER'='CUST_NAME,ACTIVE_EMUI_VERSION,DOB,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1,DOJ,CUST_ID')""")
    +    sql("create datamap ag1 on table uniqdata using 'preaggregate' as select cust_id, sum(Double_COLUMN1) from uniqdata group by cust_id")
    +    sql(s"""alter table uniqdata compact 'minor'""")
    +    assert(sql("show segments for table uniqdata").collect().tail(2).get(0).toString.equals("0.1"))
    +  }
    +  //Verify load with Pre_Aggregate table on Partition table after two-level compaction
    +  test("Partition-With-PreAggregate_TC007", Include) {
    +    sql(s"""drop table if exists uniqdata""")
    +    sql(s"""CREATE TABLE uniqdata (CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int, DOJ timestamp) PARTITIONED BY (CUST_ID int) STORED BY 'carbondata'""")
    +    sql(s"""LOAD DATA INPATH  '$resourcesPath/Data/partition/2000_UniqData_partition.csv' into table uniqdata partition(CUST_ID='1') OPTIONS('DELIMITER'=',','BAD_RECORDS_ACTION'='FORCE','QUOTECHAR'='"','FILEHEADER'='CUST_NAME,ACTIVE_EMUI_VERSION,DOB,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1,DOJ,CUST_ID')""")
    +    sql(s"""LOAD DATA INPATH  '$resourcesPath/Data/partition/2000_UniqData_partition.csv' into table uniqdata partition(CUST_ID='1') OPTIONS('DELIMITER'=',','BAD_RECORDS_ACTION'='FORCE','QUOTECHAR'='"','FILEHEADER'='CUST_NAME,ACTIVE_EMUI_VERSION,DOB,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1,DOJ,CUST_ID')""")
    +    sql(s"""LOAD DATA INPATH  '$resourcesPath/Data/partition/2000_UniqData_partition.csv' into table uniqdata partition(CUST_ID='1') OPTIONS('DELIMITER'=',','BAD_RECORDS_ACTION'='FORCE','QUOTECHAR'='"','FILEHEADER'='CUST_NAME,ACTIVE_EMUI_VERSION,DOB,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1,DOJ,CUST_ID')""")
    +    sql(s"""LOAD DATA INPATH  '$resourcesPath/Data/partition/2000_UniqData_partition.csv' into table uniqdata partition(CUST_ID='1') OPTIONS('DELIMITER'=',','BAD_RECORDS_ACTION'='FORCE','QUOTECHAR'='"','FILEHEADER'='CUST_NAME,ACTIVE_EMUI_VERSION,DOB,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1,DOJ,CUST_ID')""")
    +    sql(s"""LOAD DATA INPATH  '$resourcesPath/Data/partition/2000_UniqData_partition.csv' into table uniqdata partition(CUST_ID='1') OPTIONS('DELIMITER'=',','BAD_RECORDS_ACTION'='FORCE','QUOTECHAR'='"','FILEHEADER'='CUST_NAME,ACTIVE_EMUI_VERSION,DOB,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1,DOJ,CUST_ID')""")
    +    sql(s"""LOAD DATA INPATH  '$resourcesPath/Data/partition/2000_UniqData_partition.csv' into table uniqdata partition(CUST_ID='1') OPTIONS('DELIMITER'=',','BAD_RECORDS_ACTION'='FORCE','QUOTECHAR'='"','FILEHEADER'='CUST_NAME,ACTIVE_EMUI_VERSION,DOB,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1,DOJ,CUST_ID')""")
    +    sql(s"""LOAD DATA INPATH  '$resourcesPath/Data/partition/2000_UniqData_partition.csv' into table uniqdata partition(CUST_ID='1') OPTIONS('DELIMITER'=',','BAD_RECORDS_ACTION'='FORCE','QUOTECHAR'='"','FILEHEADER'='CUST_NAME,ACTIVE_EMUI_VERSION,DOB,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1,DOJ,CUST_ID')""")
    +    sql(s"""LOAD DATA INPATH  '$resourcesPath/Data/partition/2000_UniqData_partition.csv' into table uniqdata partition(CUST_ID='1') OPTIONS('DELIMITER'=',','BAD_RECORDS_ACTION'='FORCE','QUOTECHAR'='"','FILEHEADER'='CUST_NAME,ACTIVE_EMUI_VERSION,DOB,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1,DOJ,CUST_ID')""")
    +    sql("create datamap ag1 on table uniqdata using 'preaggregate' as select cust_id, sum(Double_COLUMN1) from uniqdata group by cust_id")
    +    sql(s"""alter table uniqdata compact 'minor'""")
    +    sql(s"""alter table uniqdata compact 'major'""")
    +    assert(sql("show segments for table uniqdata").collect().tail(7).get(0).toString.equals("0.2"))
    +  }
    +  //Verify join operation on Partition with Pre-Aggregate table
    +  test("Partition-With-PreAggregate_TC008", Include) {
    +    sql(s"""drop table if exists uniqdata""")
    +    sql(s"""drop table if exists uniqdata1""")
    +    sql(s"""CREATE TABLE uniqdata (CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int, DOJ timestamp) PARTITIONED BY (CUST_ID int) STORED BY 'carbondata'""")
    +    sql(s"""CREATE TABLE uniqdata1 (CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int, DOJ timestamp) PARTITIONED BY (CUST_ID int) STORED BY 'carbondata'""")
    +    sql(s"""LOAD DATA INPATH  '$resourcesPath/Data/partition/2000_UniqData_partition.csv' into table uniqdata partition(CUST_ID='1') OPTIONS('DELIMITER'=',','BAD_RECORDS_ACTION'='FORCE','QUOTECHAR'='"','FILEHEADER'='CUST_NAME,ACTIVE_EMUI_VERSION,DOB,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1,DOJ,CUST_ID')""")
    +    sql(s"""LOAD DATA INPATH  '$resourcesPath/Data/partition/2000_UniqData_partition.csv' into table uniqdata1 partition(CUST_ID='1') OPTIONS('DELIMITER'=',','BAD_RECORDS_ACTION'='FORCE','QUOTECHAR'='"','FILEHEADER'='CUST_NAME,ACTIVE_EMUI_VERSION,DOB,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1,DOJ,CUST_ID')""")
    +    sql("create datamap ag1 on table uniqdata using 'preaggregate' as select cust_id, sum(Double_COLUMN1) from uniqdata group by cust_id")
    +    sql("create datamap ag1 on table uniqdata1 using 'preaggregate' as select cust_id, sum(Double_COLUMN1) from uniqdata1 group by cust_id")
    +    checkAnswer(sql(s""" select a.cust_id, b.cust_id from uniqdata a, uniqdata1 b where a.cust_id >= b.cust_id limit 1"""),Seq(Row(1,1)))
    +    sql(s"""drop table if exists uniqdata1""")
    --- End diff --
   
    drop uniqdata1 in afterAll also
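
A minimal sketch of the requested cleanup, assuming the suite keeps the
QueryTest/BeforeAndAfterAll structure from the diff above; dropping every table
the tests create (including uniqdata1) keeps reruns on a shared cluster from
hitting leftover state:

    override def afterAll = {
      // Drop all tables created by this suite so repeated runs start clean.
      sql("drop table if exists partition_table")
      sql("drop table if exists uniqdata")
      sql("drop table if exists uniqdata1")
    }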


---

[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user praveenmeenakshi56 commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    retest this please


---

[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4582/



---

[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5746/



---

[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4588/



---

[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user praveenmeenakshi56 commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    retest SDV please


---

[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4821/



---

[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user praveenmeenakshi56 commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    retest SDV please


---

[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user praveenmeenakshi56 commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    retest SDV please


---

[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4831/



---

[GitHub] carbondata issue #2175: [CARBONDATA-2352] Added SDV test cases for Partition...

Github user kunal642 commented on the issue:

    https://github.com/apache/carbondata/pull/2175
 
    LGTM


---

[GitHub] carbondata pull request #2175: [CARBONDATA-2352] Added SDV test cases for Pa...

Github user asfgit closed the pull request at:

    https://github.com/apache/carbondata/pull/2175


---