
Re: [DISCUSSION] Refactory on spark related modules

Posted by ravipesala on Dec 06, 2017; 7:26am
URL: http://apache-carbondata-dev-mailing-list-archive.168.s1.nabble.com/DISCUSSION-Refactory-on-spark-related-modules-tp29488p29519.html

Hi Jacky,

I don't think it's a good idea to create new modules for the spark2.1 and
spark2.2 versions. We should not create a module for every Spark minor
version. Earlier we had separate spark and spark2 modules because of a
major version change, and a lot of interfaces changed along with it. For a
major version we can create a module, but not for every minor version. And
regarding the IDE issue, it is just a matter of developers understanding
how to switch versions, so we can add an FAQ for that, and we can also look
into a solution for the IDE problem.
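
For illustration only, version switching without per-version modules is often done with Maven profiles; the profile ids, property name, and version numbers below are hypothetical, not taken from the actual CarbonData build:

```xml
<!-- Hypothetical sketch: a single spark module whose Spark dependency
     version is selected by a Maven profile instead of a separate module. -->
<profiles>
  <profile>
    <id>spark-2.1</id>
    <properties>
      <spark.version>2.1.0</spark.version>
    </properties>
  </profile>
  <profile>
    <id>spark-2.2</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <properties>
      <spark.version>2.2.1</spark.version>
    </properties>
  </profile>
</profiles>
```

A developer would then switch versions with something like `mvn clean install -Pspark-2.1`, and an FAQ entry could explain the matching IDE profile setting.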

And about merging the spark2 and spark-common modules: there is no harm in
keeping all the RDDs in the common package, because supporting a future
major Spark version such as Spark 3.0 may require separating the modules
again.

Regards,
Ravindra.

On 6 December 2017 at 11:00, wyphao.2007 <[hidden email]> wrote:

> +1
>
> On 6 December 2017 at 11:44, "Yuhai Cen" <[hidden email]> wrote:
>
> +1
>
> Best regards!
> Yuhai Cen
>
> On 6 December 2017 at 11:43, David CaiQiang <[hidden email]> wrote:
> +1
>
> -----
> Best Regards
> David Cai
> --
> Sent from: http://apache-carbondata-dev-mailing-list-archive.1130556.n5.nabble.com/
>