java.lang.AbstractMethodError: org.apache.spark.sql.CarbonContext

java.lang.AbstractMethodError: org.apache.spark.sql.CarbonContext

liu_hong_zhu@163.com

Dear Dev Team,
    While getting started with Apache CarbonData, this error blocks me. Why does it happen?

scala> cc.sql("create table if not exists test_table (id string, name string, city string, age Int) STORED BY 'carbondata'")
INFO  01-12 10:53:33,462 - main Query [CREATE TABLE IF NOT EXISTS TEST_TABLE (ID STRING, NAME STRING, CITY STRING, AGE INT) STORED BY 'CARBONDATA']
INFO  01-12 10:53:33,635 - Parsing command: create table if not exists test_table (id string, name string, city string, age Int) STORED BY 'carbondata'
INFO  01-12 10:53:34,481 - Parse Completed
java.lang.AbstractMethodError: org.apache.spark.sql.CarbonContext$$anon$1.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$_setter_$org$apache$spark$sql$catalyst$analysis$OverrideCatalog$$overrides_$eq(Ljava/util/concurrent/ConcurrentHashMap;)V
at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.$init$(Catalog.scala:132)
at org.apache.spark.sql.CarbonContext$$anon$1.<init>(CarbonContext.scala:71)
at org.apache.spark.sql.CarbonContext.catalog$lzycompute(CarbonContext.scala:71)
at org.apache.spark.sql.CarbonContext.catalog(CarbonContext.scala:68)
at org.apache.spark.sql.CarbonContext$$anon$2.<init>(CarbonContext.scala:76)
at org.apache.spark.sql.CarbonContext.analyzer$lzycompute(CarbonContext.scala:76)
at org.apache.spark.sql.CarbonContext.analyzer(CarbonContext.scala:75)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:34)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:133)
at org.apache.spark.sql.CarbonContext.sql(CarbonContext.scala:138)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:41)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:50)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:52)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:54)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:56)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:58)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:60)
at $iwC$$iwC$$iwC.<init>(<console>:62)
at $iwC$$iwC.<init>(<console>:64)
at $iwC.<init>(<console>:66)
at <init>(<console>:68)
at .<init>(<console>:72)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.carbon.Main$.main(Main.scala:31)
at org.apache.spark.repl.carbon.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
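
For reference, the context in the Spark 1.x getting-started flow is usually created along these lines. This is only a sketch: the app name and store path below are placeholders, not values taken from this post.

// Typical CarbonContext setup for Spark 1.x (paths are illustrative placeholders).
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.CarbonContext

val conf = new SparkConf().setAppName("CarbonDataGettingStarted")
val sc = new SparkContext(conf)

// The second argument is the Carbon store location; "/opt/carbon/store" is just an example path.
val cc = new CarbonContext(sc, "/opt/carbon/store")

cc.sql("create table if not exists test_table (id string, name string, city string, age Int) STORED BY 'carbondata'")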


Re: java.lang.AbstractMethodError: org.apache.spark.sql.CarbonContext

ZhuWilliam
Did you build the package from source? If so, please check your Spark version: make sure the Spark assembly is 1.6 and build CarbonData with the matching profile specified, e.g.

mvn package -Pspark-1.6
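
If in doubt about the running version, it can also be checked directly from the shell; a one-line sketch (sc is the SparkContext already available in the REPL):

// Prints the Spark version of the running assembly, e.g. "1.6.2".
println(sc.version)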