[DISCUSSION] Refactoring of Spark-related modules

[DISCUSSION] Refactoring of Spark-related modules

Jacky Li
Hi community,

Currently we have the spark-common and spark2 modules, and inside the spark2 module there are spark2.1 and spark2.2 folders.
This creates problems in the IDE for developers, and spark-common is no longer required since we have dropped Spark 1.5 support.
So I suggest we:
1. Merge the spark2 and spark-common modules, and name the result spark2
2. Move the spark2.1 folder to a new module called spark2.1 (it mainly holds CarbonSessionState)
3. Move the spark2.2 folder to a new module called spark2.2 (it mainly holds CarbonSessionState)

When building CarbonData, users can specify a profile to select which modules to include in the build.
And since CarbonSessionState is created by reflection in the spark2 module, this approach will not have a cyclic dependency problem.
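To illustrate why reflection avoids the cyclic dependency: the common spark2 module can pick a version-specific implementation class by name at runtime, so it never needs a compile-time dependency on the spark2.1/spark2.2 modules. The class names and version-dispatch logic below are only a sketch, not CarbonData's actual code.

```java
// Illustrative sketch: runtime dispatch to a version-specific session state
// class without any compile-time dependency on the version modules.
public class SessionStateLoader {

    // Choose the implementation class name from the running Spark version.
    // Both class names here are hypothetical.
    static String implClassName(String sparkVersion) {
        return sparkVersion.startsWith("2.2")
                ? "org.apache.spark.sql.hive.CarbonSessionState22"
                : "org.apache.spark.sql.hive.CarbonSessionState21";
    }

    // Instantiate reflectively; the caller only sees the Spark-provided
    // parent type, never the concrete subclass, so no cycle is introduced.
    static Object load(String sparkVersion) throws Exception {
        Class<?> clazz = Class.forName(implClassName(sparkVersion));
        return clazz.getDeclaredConstructor().newInstance();
    }
}
```

Because only the module that actually ships the chosen class needs to be on the classpath, each Spark-version module can depend on spark2 while spark2 depends on neither.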

Regards,
Jacky
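For reference, profile-based module selection could look roughly like the following parent-pom fragment; the profile ids and module paths are illustrative, not the actual CarbonData build files.

```xml
<!-- Illustrative parent-pom sketch: each Spark minor version gets a
     profile that activates the common spark2 module plus one
     version-specific module. -->
<profiles>
  <profile>
    <id>spark-2.1</id>
    <modules>
      <module>integration/spark2</module>
      <module>integration/spark2.1</module>
    </modules>
  </profile>
  <profile>
    <id>spark-2.2</id>
    <modules>
      <module>integration/spark2</module>
      <module>integration/spark2.2</module>
    </modules>
  </profile>
</profiles>
```

A user would then build with, for example, `mvn clean package -Pspark-2.2`.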


Re: [DISCUSSION] Refactoring of Spark-related modules

David CaiQiang
+1



-----
Best Regards
David Cai
--
Sent from: http://apache-carbondata-dev-mailing-list-archive.1130556.n5.nabble.com/

Re: [DISCUSSION] Refactoring of Spark-related modules

cenyuhai11
+1






Best regards!
Yuhai Cen



Re: [DISCUSSION] Refactoring of Spark-related modules

wyphao.2007
+1







Re: [DISCUSSION] Refactoring of Spark-related modules

ravipesala
Hi Jacky,

I don't think it's a good idea to create new modules for spark2.1 and
spark2.2. We should not create a module for every Spark minor version.
Earlier we had the spark and spark2 modules because of a major version
change, and a lot of interfaces changed along with it. We can create a
module for a major version, but not for every minor version. As for the
IDE issue, it is just a matter of developers understanding how to switch
versions, so we can add an FAQ for that and also look for a solution to
the IDE problem.

Regarding merging the spark2 and spark-common modules: there is no harm in
keeping all the RDDs in the common package, because supporting a future
major Spark version such as Spark 3.0 may require separating the modules
again.

Regards,
Ravindra.




--
Thanks & Regards,
Ravi

Re: [DISCUSSION] Refactoring of Spark-related modules

Sangeeta Gulia
+1 to Ravindra's suggestion.


--
Warm Regards,

Sangeeta Gulia
Software Consultant
Knoldus Software LLP

Re: [DISCUSSION] Refactoring of Spark-related modules

phalodi
+1, agreed with Ravindra

-Sandeep


Re: [DISCUSSION] Refactoring of Spark-related modules

xubo245
+1





Re: [DISCUSSION] Refactoring of Spark-related modules

xuchuanyin
In reply to this post by Jacky Li

Re: [DISCUSSION] Refactoring of Spark-related modules

xubo245
In reply to this post by Jacky Li
Who will do the refactoring of the Spark-related modules?




Re: [DISCUSSION] Refactoring of Spark-related modules

sraghunandan
I feel it is not required now. Let's wait for breaking changes in Spark
before we do the refactoring.
