[Discussion] Change Default TimeStampFormat to yyyy-mm-dd hh:mm:ss.SSS

[Discussion] Change Default TimeStampFormat to yyyy-mm-dd hh:mm:ss.SSS

haomarch
Spark's default TimeStampFormat is yyyy-mm-dd hh:mm:ss.SSS.
CarbonData should stay consistent with Spark.
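
For reference, a minimal sketch of how the default can be overridden today,
assuming CarbonData's carbon.timestamp.format property and the CarbonProperties
singleton (in Java SimpleDateFormat terms a millisecond-precision pattern would
be written yyyy-MM-dd HH:mm:ss.SSS, with uppercase MM/HH):

  import org.apache.carbondata.core.util.CarbonProperties

  // Override the default timestamp format for this JVM so that parsing accepts
  // millisecond precision, matching Spark's string output.
  CarbonProperties.getInstance()
    .addProperty("carbon.timestamp.format", "yyyy-MM-dd HH:mm:ss.SSS")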




Re: [Discussion] Change Default TimeStampFormat to yyyy-mm-dd hh:mm:ss.SSS

Ajantha Bhat
Hi,
I need to check the points below before concluding on this. If you already have
this information, please share it.

1. About the Hive and Spark default formats: in some places they mention up to 9
digits of fractional-second precision, while you mention 3. Which file in Hive
and Spark defines this default value? (A quick probe of Spark's behaviour is
sketched after this list.)
2. Why are the current test cases not failing when we compare query results
between Hive and Carbon for timestamp columns?
3. After we change the default, how many test cases will need to be modified?
(The expected values may no longer match.)
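
A quick way to probe Spark's string rendering of timestamps (a sketch, assuming
a running SparkSession named spark; not a definitive statement of the defaults):

  // Spark stores TimestampType values with microsecond precision internally;
  // casting to string prints as many fractional digits as the value carries.
  spark.sql(
    "SELECT CAST(CAST('2020-07-14 14:05:00.123456' AS TIMESTAMP) AS STRING)"
  ).show(false)
  // Typically prints 2020-07-14 14:05:00.123456, i.e. up to 6 fractional
  // digits rather than a fixed 3.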

*As this value is configurable*, I am *neutral* about the proposed change
if the effort is high.
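
For reference, one way the format can already be overridden per load (a sketch;
the path and table name are hypothetical, and it assumes the TIMESTAMPFORMAT
load option described in CarbonData's DML documentation):

  // Per-load override of the timestamp parsing format, so a caller who needs a
  // different pattern is not forced to change the global default.
  spark.sql(
    """LOAD DATA INPATH 'hdfs://path/to/data.csv' INTO TABLE sample_table
      |OPTIONS('TIMESTAMPFORMAT'='yyyy-MM-dd HH:mm:ss.SSS')""".stripMargin)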

Thanks,
Ajantha

On Tue, Jul 14, 2020 at 2:05 PM haomarch <[hidden email]> wrote:

> Spark's default TimeStampFormat is yyyy-mm-dd hh:mm:ss.SSS.
> CarbonData should stay consistent with Spark.