[ https://issues.apache.org/jira/browse/CARBONDATA-1829?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Ayushi Sharma resolved CARBONDATA-1829.
---------------------------------------
Resolution: Fixed
Fix Version/s: 1.3.0
BAD_RECORDS_ACTION is set to FAIL by default.
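With FAIL as the default, a load or insert carrying a value that cannot be converted to the target column type is rejected instead of silently inserted. A hypothetical reproduction against the tables from the steps below (the report does not show the actual insert statement used, so the statement here is an assumption):

```sql
-- Hypothetical insert-select from the Hive table into the Carbon table.
-- Sell_price is FLOAT in Hive3 but INT in Carbon3; with the default
-- BAD_RECORDS_ACTION=FAIL, the single-precision decimal value in the
-- load file makes this statement fail rather than pass.
INSERT INTO TABLE Carbon3 SELECT * FROM Hive3;
```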
> Carbon 1.3.0 - Spark 2.2: Insert is passing when Hive is having Float and Carbon is having INT value and load file is having single precision decimal value
> -----------------------------------------------------------------------------------------------------------------------------------------------------------
>
> Key: CARBONDATA-1829
> URL: https://issues.apache.org/jira/browse/CARBONDATA-1829
> Project: CarbonData
> Issue Type: Bug
> Reporter: Ayushi Sharma
> Fix For: 1.3.0
>
> Attachments: Hive3.csv
>
>
> Steps:
> 1. create table Hive3(Sell_price FLOAT, Item_code STRING, Qty_total Double,Profit Decimal(4,3), Update_time TIMESTAMP )row format delimited fields terminated by ',' collection items terminated by '$'
> 2. create table Carbon3(Sell_price INT, Item_code STRING, Qty_total DECIMAL(3,1),Profit DECIMAL(3,2), Update_time TIMESTAMP ) STORED BY 'org.apache.carbondata.format'
> 3. load data LOCAL INPATH '/opt/csv/Data/InsertData/Hive3.csv' overwrite into table Hive3
> Issue:
> The insert passes when the Hive column is FLOAT, the Carbon column is INT, and the load file contains a single-precision decimal value.
> Expected:
> The insert should fail.
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)