Should we use Spark 2.2.1 as default version for Spark-2.2 supported
Posted by xm_zzc on Dec 25, 2017; 4:20pm
URL: http://apache-carbondata-dev-mailing-list-archive.168.s1.nabble.com/Should-we-use-Spark-2-2-1-as-default-version-for-Spark-2-2-supported-tp32713.html
Hi dev:
As everyone knows, Spark 2.2.1 is the maintenance release for the Spark 2.2 branch. It contains many stability fixes, and the Spark community strongly recommends that all 2.2.x users upgrade to this stable release. So should we use Spark 2.2.1 as the default version for Spark-2.2 support?
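Switching the default would amount to bumping the Spark version property in the build. As a minimal sketch (the property and profile names here are assumptions based on a typical Maven profile layout, not verified against CarbonData's actual pom.xml):

```xml
<!-- Hypothetical excerpt from the parent pom.xml: bump the spark-2.2
     profile's default Spark version from 2.2.0 to 2.2.1. -->
<profile>
  <id>spark-2.2</id>
  <properties>
    <spark.version>2.2.1</spark.version>
    <scala.binary.version>2.11</scala.binary.version>
  </properties>
</profile>
```

Since 2.2.1 is a maintenance release on the same branch, the change should be source- and binary-compatible for modules built against the Spark 2.2 APIs.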