Re: [DISCUSSION] Refactory on spark related modules
Posted by ravipesala on Dec 06, 2017; 7:26am
URL: http://apache-carbondata-dev-mailing-list-archive.168.s1.nabble.com/DISCUSSION-Refactory-on-spark-related-modules-tp29488p29519.html
Hi Jacky,
I don't think it's a good idea to create new modules for the spark2.1 and
spark2.2 versions. We should not create a module for every Spark minor
version. Earlier we had two modules, spark and spark2, because of a major
version change in which a lot of interfaces changed as well. For a major
version change we can create a module, but not for every minor version.
Regarding the IDE issue, it is just a matter of developers understanding
how to switch versions, so we can add an FAQ for that, and we can also
look for a proper solution to the IDE problem.
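
For illustration, minor-version differences can often be handled inside a
single module with a small runtime version check. A rough sketch only;
SparkVersionUtil and isSparkVersion are hypothetical names, not existing
CarbonData code:

import org.apache.spark.SPARK_VERSION

object SparkVersionUtil {
  // Reduce a full version string such as "2.2.1" to "2.2"
  private def majorMinor(version: String): String =
    version.split("\\.").take(2).mkString(".")

  // True when the running Spark matches the given major.minor prefix
  def isSparkVersion(prefix: String): Boolean =
    majorMinor(SPARK_VERSION) == prefix
}

// Usage inside one module, instead of splitting modules per minor version:
// if (SparkVersionUtil.isSparkVersion("2.2")) { ... } else { ... }

This keeps the version-specific branches in one place, so switching the
build between 2.1 and 2.2 does not change the module layout.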
As for merging the spark2 and spark-common modules: there is no harm in
keeping all the RDDs in the common package, because supporting a future
major Spark version such as Spark 3.0 may require separating the modules
again.
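
To sketch why the RDDs can live in the common module: an RDD that uses
only the stable RDD contract compiles unchanged across Spark 2.x releases.
The class below is illustrative, not actual CarbonData code:

import org.apache.spark.{Partition, SparkContext, TaskContext}
import org.apache.spark.rdd.RDD

// Depends only on the stable RDD API, so it is version-agnostic.
class CommonScanRDD(sc: SparkContext, rows: Seq[String])
  extends RDD[String](sc, Nil) {

  // A single trivial partition for the sketch
  override def getPartitions: Array[Partition] =
    Array(new Partition { override def index: Int = 0 })

  // Return the captured rows for the given partition
  override def compute(split: Partition, context: TaskContext): Iterator[String] =
    rows.iterator
}

Only when such code starts needing version-specific APIs (as a Spark 3.0
port might) would splitting the modules again be justified.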
Regards,
Ravindra.
--
Thanks & Regards,
Ravi