Re: Should we use Spark 2.2.1 as default version for Spark-2.2 supported
Posted by Jacky Li on Dec 26, 2017; 2:06pm
URL: http://apache-carbondata-dev-mailing-list-archive.1130556.n5.nabble.com/Should-we-use-Spark-2-2-1-as-default-version-for-Spark-2-2-supported-tp32713p32791.html
+1
Yes, I think changing the default spark.version to 2.2.1 for the spark-2.2 profile is better.
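For concreteness, a minimal sketch of what the change might look like in the parent pom.xml; the exact profile layout and the scala.binary.version property below are assumptions based on a typical Maven setup, not a quote of the CarbonData build:

    <!-- Hypothetical excerpt: the spark-2.2 profile with its default
         spark.version bumped to the 2.2.1 maintenance release. -->
    <profile>
      <id>spark-2.2</id>
      <properties>
        <spark.version>2.2.1</spark.version>
        <scala.binary.version>2.11</scala.binary.version>
      </properties>
    </profile>

Since this only changes the default, users who still need 2.2.0 could keep overriding it on the command line, e.g. mvn clean package -Pspark-2.2 -Dspark.version=2.2.0.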
Regards,
Jacky
> On Dec 26, 2017, at 12:20 AM, xm_zzc <[hidden email]> wrote:
>
> Hi dev:
> As everyone knows, Spark 2.2.1 is the maintenance release for the Spark 2.2
> branch. It contains many stability fixes, and the Spark community also
> strongly recommends that all 2.2.x users upgrade to this stable release.
> So should we use Spark 2.2.1 as the default version for Spark-2.2 support?
>
>
> --
> Sent from: http://apache-carbondata-dev-mailing-list-archive.1130556.n5.nabble.com/