CatalystAnalysy


rajeevkumar
Can anybody tell me why I am getting the following exception when I try to run a CREATE TABLE query on the cluster?
My Spark version is 1.6, and I have also compiled CarbonData with -Pspark1.6.

 java.lang.AbstractMethodError: org.apache.spark.sql.CarbonContext$$anon$1.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$_setter_$org$apache$spark$sql$catalyst$analysis$OverrideCatalog$$overrides_$eq(Ljava/util/concurrent/ConcurrentHashMap;)V
  at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.$init$(Catalog.scala:132)
  at org.apache.spark.sql.CarbonContext$$anon$1.<init>(CarbonContext.scala:70)
  at org.apache.spark.sql.CarbonContext.catalog$lzycompute(CarbonContext.scala:70)
  at org.apache.spark.sql.CarbonContext.catalog(CarbonContext.scala:67)
  at org.apache.spark.sql.CarbonContext$$anon$2.<init>(CarbonContext.scala:75)
  at org.apache.spark.sql.CarbonContext.analyzer$lzycompute(CarbonContext.scala:75)
  at org.apache.spark.sql.CarbonContext.analyzer(CarbonContext.scala:74)
  at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:34)
  at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:133)
  at org.apache.spark.sql.CarbonContext.sql(CarbonContext.scala:137)
  ...

Re: CatalystAnalysy

David CaiQiang
In the carbondata-parent pom.xml you can see that spark.version is set to 1.6.2 for the spark-1.6 profile.

So if your Spark version is not 1.6.2, you can pass an extra parameter (-Dspark.version=<your spark version>) to the mvn command.

For example, if your Spark version is 1.6.0:
-Dspark.version=1.6.0
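Putting the profile and the version override together, the full build command might look like the following. This is a sketch, not the definitive invocation: the -Pspark-1.6 profile name comes from the pom.xml discussion above, and -DskipTests is an optional assumption to speed up the rebuild.

```shell
# Rebuild CarbonData against Spark 1.6.0 (example; adjust
# spark.version to match the Spark runtime on your cluster)
mvn clean package -Pspark-1.6 -Dspark.version=1.6.0 -DskipTests
```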


Best Regards
David Cai

Re: CatalystAnalysy

rahul_kumar
Hey QiangCai,
thank you for your reply. I have Spark 1.6.2 and also tried with -Dspark.version=1.6.2, but the result is the same: I still get the same exception.

Is this exception possible if I have a different Scala version?

Re: CatalystAnalysy

ravipesala
Have you used 'mvn clean'?

On 28 December 2016 at 07:18, rahulforallp <[hidden email]> wrote:

> hey QiangCai,
> thank you for your reply . i have spark 1.6.2. and also tried with
> -Dspark.version=1.6.2 . But result is same . Still i am getting same
> exception.
>
> Is this exception possibe if i have different scala version?



--
Thanks & Regards,
Ravi

Re: CatalystAnalysy

David CaiQiang
In reply to this post by rahul_kumar
You can try -Dscala.version=<your scala version>
Best Regards
David Cai
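Combining the advice in this thread (a clean build with both the Spark and Scala versions pinned to match the cluster), the command might look like the sketch below. The version values are examples only, not confirmed for this poster's environment; 2.10.5 is the Scala version that Spark 1.6.2 ships with by default, so it is a reasonable starting point.

```shell
# Hedged sketch: pin both spark.version and scala.version to
# the versions running on the cluster (values are examples)
mvn clean package -Pspark-1.6 \
    -Dspark.version=1.6.2 \
    -Dscala.version=2.10.5 \
    -DskipTests
```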

Re: CatalystAnalysy

rajeevkumar
I have done mvn clean.
I also changed the Scala version, but the exception is the same.