Should we use Spark 2.2.1 as the default version for Spark-2.2 support?

xm_zzc
Hi dev:
  As everyone knows, Spark 2.2.1 is the maintenance release for the
Spark 2.2 branch. It contains many stability fixes, and the Spark
community strongly recommends that all 2.2.x users upgrade to this
stable release. So should we use Spark 2.2.1 as the default version
for Spark-2.2 support?
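For concreteness, the change itself should just be a one-line bump of
the spark.version property inside the spark-2.2 profile of the parent
pom.xml, roughly like this (a sketch only; the exact property names and
profile layout in our pom may differ):

    <profile>
      <id>spark-2.2</id>
      <properties>
        <!-- bump the default from 2.2.0 to the 2.2.1 maintenance release -->
        <spark.version>2.2.1</spark.version>
        <!-- Spark 2.2.x builds against Scala 2.11 by default -->
        <scala.binary.version>2.11</scala.binary.version>
      </properties>
    </profile>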




Re: Should we use Spark 2.2.1 as the default version for Spark-2.2 support?

Jacky Li
+1
Yes, I think changing the default spark.version to 2.2.1 for the
spark-2.2 profile is better.

Regards,
Jacky


Re: Should we use Spark 2.2.1 as the default version for Spark-2.2 support?

Liang Chen
Administrator
In reply to this post by xm_zzc
Hi

+1 from my side. I just checked Spark 2.2.1: more than 200 issues were
fixed.
One thing to note, though: if the community accepts this proposal,
2.2.1 will replace 2.2.0 as the Spark version of the integration
module; *we would not support both.*
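To be clear, users would still select the build the same way as before,
for example with something like the following (an illustrative command;
check the project build docs for the exact flags):

    mvn clean package -DskipTests -Pspark-2.2

Only the Spark version that the profile resolves to would change, from
2.2.0 to 2.2.1.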

Regards
Liang


Re: Should we use Spark 2.2.1 as the default version for Spark-2.2 support?

xm_zzc
In reply to this post by xm_zzc
Hi dev:
  Any suggestions on this? @ravipesala @David CaiQiang or others.

Re: Should we use Spark 2.2.1 as the default version for Spark-2.2 support?

sraghunandan
+1

Re: Should we use Spark 2.2.1 as the default version for Spark-2.2 support?

sounak
In reply to this post by xm_zzc
+1

I agree with Liang: if 2.2.1 is accepted, it should replace 2.2.0, and
we should not maintain both.

--
Thanks
Sounak

Re: Should we use Spark 2.2.1 as the default version for Spark-2.2 support?

xm_zzc
Thanks for your replies, Jacky, Liang, Raghunandan S, and Sounak.
I will raise a JIRA task and a PR to upgrade the Spark version to 2.2.1.


