Re: Should we use Spark 2.2.1 as default version for Spark-2.2 supported
Posted by Liang Chen on Dec 26, 2017; 2:20pm
URL: http://apache-carbondata-dev-mailing-list-archive.168.s1.nabble.com/Should-we-use-Spark-2-2-1-as-default-version-for-Spark-2-2-supported-tp32713p32793.html
Hi
+1 from my side. I just checked Spark 2.2.1: more than 200 issues were
fixed.
But one thing should be noted: if the proposal is accepted by the community,
2.2.1 will replace 2.2.0 as the Spark version of the integration module; *we
would not support both.*
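
For reference, a minimal sketch of what the change could look like, assuming
the integration module resolves its Spark dependency through a Maven property
(the spark.version property name and the dependency shown here are assumptions
for illustration, not a confirmed description of CarbonData's build):

    <!-- hypothetical parent pom.xml: bump the Spark 2.2.x line
         from 2.2.0 to 2.2.1 in one place -->
    <properties>
      <spark.version>2.2.1</spark.version>
    </properties>

    <!-- the integration module would then pick up 2.2.1 via the property;
         Spark 2.2.x is built against Scala 2.11 -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <version>${spark.version}</version>
    </dependency>

Keeping a single property means there is only one Spark version per build,
which matches the point above that both versions would not be supported at
the same time.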
Regards
Liang
xm_zzc wrote
> Hi dev:
> As everyone knows, Spark 2.2.1 is the maintenance release for the Spark
> 2.2 branch; it contains many stability fixes, and the Spark community
> strongly recommends that all 2.2.x users upgrade to this stable release.
> So should we use Spark 2.2.1 as the default version for Spark 2.2
> support?