> Exception when loading data using static partition for uniqdata table
> ---------------------------------------------------------------------
>
> Key: CARBONDATA-1974
> URL: https://issues.apache.org/jira/browse/CARBONDATA-1974
> Project: CarbonData
> Issue Type: Bug
> Components: spark-integration
> Affects Versions: 1.3.0
> Environment: Spark 2.1, Hadoop 2.7
> Reporter: anubhav tarar
> Priority: Major
> Fix For: 1.3.0
>
>
> 1. CREATE TABLE uniqdata_string(CUST_ID int,CUST_NAME String,DOB timestamp,DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10),DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) PARTITIONED BY(ACTIVE_EMUI_VERSION string) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES ('TABLE_BLOCKSIZE'= '256 MB');
> 2. jdbc:hive2://localhost:10000/default> LOAD DATA INPATH 'hdfs://localhost:54311/2000_UniqData.csv' into table uniqdata_string partition(ACTIVE_EMUI_VERSION='abc') OPTIONS('FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ, BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1, Double_COLUMN2,INTEGER_COLUMN1','BAD_RECORDS_ACTION'='FORCE');
> Error: org.apache.spark.sql.AnalysisException: Cannot insert into table `default`.`uniqdata_string` because the number of columns are different: need 11 columns, but query has 12 columns.; (state=,code=0)
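>
> The column counts in the error are consistent with the FILEHEADER option still naming the static partition column: the query side carries 12 CSV columns, while the insert target expects only the 11 non-partition columns, since ACTIVE_EMUI_VERSION is already fixed by the PARTITION clause. As a hedged workaround sketch (assuming a CSV that does not contain the ACTIVE_EMUI_VERSION column, so the remaining columns still align positionally), dropping the partition column from FILEHEADER should make the counts match:
>
> LOAD DATA INPATH 'hdfs://localhost:54311/2000_UniqData.csv' INTO TABLE uniqdata_string PARTITION(ACTIVE_EMUI_VERSION='abc') OPTIONS('FILEHEADER'='CUST_ID,CUST_NAME,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1','BAD_RECORDS_ACTION'='FORCE');
>
> The expected behavior implied by this report is presumably that the load succeeds even when the partition column appears in FILEHEADER, with the static PARTITION value taking precedence over the CSV data.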