the storepath in carbon.properties seems not to work


the storepath in carbon.properties seems not to work

李寅威
Hi all:


    I'm now configuring CarbonData in cluster mode, and some of the configurations in the file carbon.properties are as below:


carbon.storelocation=hdfs://master:9000/carbondata
carbon.ddl.base.hdfs.url=hdfs://master:9000/carbondata/data
carbon.kettle.home=/opt/spark-1.6.0/carbonlib/carbonplugins



but when I create a table using the command:
cc.sql("create table if not exists test_table (id string, name string, city string, age Int) STORED BY 'carbondata'")


the output in the spark shell says the tablePath is a local one:
tablePath=/home/hadoop/carbondata/bin/carbonshellstore/default/test_table


and the storePath is also shown as:


scala> print(cc.storePath)
/home/hadoop/carbondata/bin/carbonshellstore



The file carbon.properties has been distributed to all the nodes in the cluster. I'm not sure where I can modify the config; looking forward to your help, thanks!

Re: the storepath in carbon.properties seems not to work

David CaiQiang
Please see the cwiki guide and pay attention to the -Dcarbon.properties.filepath configuration.

https://cwiki.apache.org/confluence/display/CARBONDATA/Cluster+deployment+guide

For spark-shell, CarbonContext needs a proper storePath parameter,

e.g. new CarbonContext(sc, storePath)

But new CarbonContext(sc) will just use the default local path.
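A minimal spark-shell sketch of the difference, assuming the CarbonContext(sc, storePath) constructor described above and reusing the HDFS store location from the original carbon.properties (adjust to your cluster):

```scala
// In spark-shell (Spark 1.6.x with the CarbonData assembly jar on the classpath):
import org.apache.spark.sql.CarbonContext

// Pass the store path explicitly so it does not fall back to a local directory.
// "hdfs://master:9000/carbondata" matches carbon.storelocation in carbon.properties.
val cc = new CarbonContext(sc, "hdfs://master:9000/carbondata")

cc.sql("create table if not exists test_table " +
  "(id string, name string, city string, age Int) STORED BY 'carbondata'")

// cc.storePath should now print the HDFS path instead of a local one.
print(cc.storePath)
```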

Best Regards
David Cai

Re: the storepath in carbon.properties seems not to work

杰
In reply to this post by 李寅威
Hi,
  Did you configure the correct carbon.properties file path in spark-defaults.conf?
  Or please share your spark-defaults.conf file.
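For reference, a sketch of the relevant spark-defaults.conf entries; the path to carbon.properties is illustrative, so substitute the actual location on your nodes:

```properties
# Point both the driver and the executors at the cluster-wide carbon.properties file.
# /opt/spark-1.6.0/conf/carbon.properties is an assumed location; adjust as needed.
spark.driver.extraJavaOptions    -Dcarbon.properties.filepath=/opt/spark-1.6.0/conf/carbon.properties
spark.executor.extraJavaOptions  -Dcarbon.properties.filepath=/opt/spark-1.6.0/conf/carbon.properties
```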


regards
Jay




------------------ Original ------------------
From:  "251469031";<[hidden email]>;
Date:  Mon, Dec 26, 2016 05:44 PM
To:  "dev"<[hidden email]>;

Subject:  the storepath in carbon.properties seems not to work


