Chetan Bhat created CARBONDATA-1034:
---------------------------------------
Summary: FilterUnsupportedException thrown when a select query's '=' filter on an int column uses a negative value outside the int range
Key: CARBONDATA-1034
URL: https://issues.apache.org/jira/browse/CARBONDATA-1034
Project: CarbonData
Issue Type: Bug
Components: data-query
Affects Versions: 1.1.0
Environment: 3 node cluster
Reporter: Chetan Bhat
Attachments: file.csv
In Beeline, the user creates a table and loads data into it.
The user then executes a select query whose '=' filter on an int column uses a negative value whose magnitude exceeds the int range.
0: jdbc:hive2://172.168.100.199:23040> CREATE table mycube21 (column1 STRING, column2 STRING,column3 INT, column4 INT,column5 INT, column6 INT) stored by 'org.apache.carbondata.format' TBLPROPERTIES("columnproperties.column1.shared_column"="shared.column1","columnproperties.column2.shared_column"="shared.column2");
+---------+--+
| Result |
+---------+--+
+---------+--+
No rows selected (0.059 seconds)
0: jdbc:hive2://172.168.100.199:23040> LOAD DATA INPATH 'hdfs://hacluster/chetan/file.csv' INTO TABLE mycube21 OPTIONS('DELIMITER'=',','QUOTECHAR'='"','BAD_RECORDS_ACTION'='FORCE','FILEHEADER'='');
+---------+--+
| Result |
+---------+--+
+---------+--+
No rows selected (1.198 seconds)
0: jdbc:hive2://172.168.100.199:23040> select * from mycube21 where column4=-9223372036854775808;
Actual Result : FilterUnsupportedException is thrown when the '=' filter on the int column uses a negative value outside the int range.
0: jdbc:hive2://172.168.100.199:23040> select * from mycube21 where column4=-9223372036854775808;
Error: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 936.0 failed 4 times, most recent failure: Lost task 0.3 in stage 936.0 (TID 42603, linux-53, executor 1): org.apache.spark.util.TaskCompletionListenerException: java.util.concurrent.ExecutionException: org.apache.carbondata.core.scan.expression.exception.FilterUnsupportedException: java.lang.Long cannot be cast to java.lang.Integer
at org.apache.spark.TaskContextImpl.markTaskCompleted(TaskContextImpl.scala:105)
at org.apache.spark.scheduler.Task.run(Task.scala:112)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Driver stacktrace: (state=,code=0)
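The root cause appears to be the type of the filter literal: -9223372036854775808 does not fit in an int, so it is parsed as a Long, and the filter evaluation then seems to cast that literal directly to Integer to match the column's data type. A minimal standalone sketch of the failing cast (not CarbonData code, just an illustration of the reported ClassCastException):

public class CastRepro {
  public static void main(String[] args) {
    // The literal is below Integer.MIN_VALUE, so it can only be boxed as a Long.
    Object filterLiteral = -9223372036854775808L;   // java.lang.Long

    // Casting the boxed Long to Integer, as the filter evaluation appears to do,
    // throws "java.lang.Long cannot be cast to java.lang.Integer".
    int filterValue = (Integer) filterLiteral;
    System.out.println(filterValue);
  }
}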
Expected Result : No exception should be thrown. The select query should return the correct result set (0 rows).
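One possible handling, sketched below with a hypothetical helper (not an existing CarbonData method): if the equality literal lies outside the int range, the filter can never match, so it should short-circuit to an empty result instead of attempting the cast.

/** Hypothetical helper, for illustration only: evaluate column = literal for an INT column. */
static boolean intColumnEquals(int columnValue, long literalValue) {
  // A literal outside the int range cannot equal any int column value,
  // so the filter should match no rows rather than attempt a Long -> Integer cast.
  if (literalValue < Integer.MIN_VALUE || literalValue > Integer.MAX_VALUE) {
    return false;
  }
  return columnValue == (int) literalValue;
}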