[jira] [Updated] (CARBONDATA-4061) Empty values for date and timestamp columns are read as null when using SDK. If we pass an empty value to date and timestamp columns, it gives a NullPointerException

Akash R Nilugal (Jira)

     [ https://issues.apache.org/jira/browse/CARBONDATA-4061?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Chetan Bhat updated CARBONDATA-4061:
------------------------------------
    Summary: Empty values for date and timestamp columns are read as null when using SDK. If we pass an empty value to date and timestamp columns, it gives a NullPointerException  (was: Empty value for date and timestamp columns are reading as null . if we pass empty value to data and timestamp columns ,it gives null pointer exception)

> Empty values for date and timestamp columns are read as null when using SDK. If we pass an empty value to date and timestamp columns, it gives a NullPointerException
> --------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: CARBONDATA-4061
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-4061
>             Project: CarbonData
>          Issue Type: Bug
>          Components: data-query, other
>    Affects Versions: 2.1.0
>         Environment: Spark 2.4.5
>            Reporter: Chetan Bhat
>            Priority: Minor
>
> Empty values for date and timestamp columns are read as null when using the SDK. If we pass an empty value to date and timestamp columns, CarbonReader throws a NullPointerException (a reproduction sketch follows the quoted log below).
>  
> 2020-11-27 13:44:20 INFO ResultCollectorFactory:78 - Vector based dictionary collector is used to scan and collect the data
> 2020-11-27 13:44:20 INFO DictionaryBasedVectorResultCollector:73 - Direct page-wise vector fill collector is used to scan and collect the data
> java.lang.NullPointerException
>  at org.apache.carbondata.sdk.file.CarbonReader.formatDateAndTimeStamp(CarbonReader.java:153)
>  at org.apache.carbondata.sdk.file.CarbonReader.readNextRow(CarbonReader.java:126)
>  at com.apache.spark.SDKReaderTest.testSDKRederAll_data_types2(SDKReaderTest.java:239)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498)
>  at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
>  at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>  at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
>  at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>  at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>  at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>  at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
>  at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
>  at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
>  at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
>  at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
>  at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
>  at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
>  at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
>  at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>  at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>  at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
>  at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
>  at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:69)
>  at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
>  at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:220)
>  at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:53)
> 2020-11-27 13:44:20 INFO SparkContext:54 - Invoking stop() from shutdown hook
> 2020-11-27 13:44:20 INFO AbstractConnector:343 - Stopped Spark@22175d4f{HTTP/1.1,[http/1.1]}{10.19.36.215:4040}
> 2020-11-27 13:44:20 INFO session:158 - node0 Stopped scavenging
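
For reference, a minimal sketch of how this scenario can be reproduced with the CarbonData SDK: it writes one row whose date and timestamp values are empty strings, then reads it back with CarbonReader.readNextRow(), which is where the reported stack trace points (CarbonReader.formatDateAndTimeStamp). The output path, table name ("_temp") and column names below are illustrative assumptions, not taken from the original test (SDKReaderTest.testSDKRederAll_data_types2).

{code:java}
import org.apache.carbondata.core.metadata.datatype.DataTypes;
import org.apache.carbondata.sdk.file.CarbonReader;
import org.apache.carbondata.sdk.file.CarbonWriter;
import org.apache.carbondata.sdk.file.Field;
import org.apache.carbondata.sdk.file.Schema;

public class EmptyDateTimestampRepro {
  public static void main(String[] args) throws Exception {
    // Hypothetical output location; the original test's path is not given in the report.
    String path = "./carbon_sdk_output";

    // Schema with one date and one timestamp column.
    Field[] fields = new Field[2];
    fields[0] = new Field("dateCol", DataTypes.DATE);
    fields[1] = new Field("tsCol", DataTypes.TIMESTAMP);

    // Write a single row with empty strings for both columns.
    CarbonWriter writer = CarbonWriter.builder()
        .outputPath(path)
        .withCsvInput(new Schema(fields))
        .writtenBy("EmptyDateTimestampRepro")
        .build();
    writer.write(new String[]{"", ""});
    writer.close();

    // Read the data back. Per the report, readNextRow() hits a
    // NullPointerException in CarbonReader.formatDateAndTimeStamp
    // instead of returning null for the empty date/timestamp values.
    CarbonReader reader = CarbonReader.builder(path, "_temp").build();
    while (reader.hasNext()) {
      Object[] row = (Object[]) reader.readNextRow();
      System.out.println(row[0] + ", " + row[1]);
    }
    reader.close();
  }
}
{code}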



--
This message was sent by Atlassian Jira
(v8.3.4#803005)