store location can't be found

store location can't be found

Mars Xu
Hello All,
  I ran into a "file does not exist" problem; it looks like the store location can't be found. I have already set carbon.store.location=hdfs://localhost:9000/carbon/store in $SPARK_HOME/conf/carbon.properties, but when I start spark-shell with the following command and run a few statements, this error appears:
spark-shell --master spark://localhost:7077 --jars ~/carbonlib/carbondata_2.11-1.0.0-incubating-shade-hadoop2.7.2.jar --conf spark.carbon.storepath=hdfs://localhost:9000/carbon/store

scala> import org.apache.spark.sql.SparkSession
scala> import org.apache.spark.sql.CarbonSession._
scala> val carbon = SparkSession.builder().config(sc.getConf).getOrCreateCarbonSession()
scala> carbon.sql("CREATE TABLE IF NOT EXISTS test_table(id string, name string, city string, age Int) STORED BY 'carbondata’")
scala> carbon.sql("load data inpath 'hdfs://localhost:9000/resources/sample.csv' into table test_table”)

scala> carbon.sql("select * from test_table").show()
java.io.FileNotFoundException: File /private/var/carbon.store/default/test_table/Fact/Part0/Segment_0 does not exist.
  at org.apache.hadoop.hdfs.DistributedFileSystem$DirListingIterator.<init>(DistributedFileSystem.java:948)
  at org.apache.hadoop.hdfs.DistributedFileSystem$DirListingIterator.<init>(DistributedFileSystem.java:927)

My CarbonData version is 1.0 and my Spark version is 2.1.
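
A note on the stack trace above: the path it reports (/private/var/carbon.store/...) is a local default store location rather than the HDFS path set in carbon.properties, which suggests the configured store location was not picked up by the session. A quick way to confirm that the HDFS path itself is reachable from spark-shell is a minimal check with the standard Hadoop FileSystem API (a sketch; the URI and directory are the ones from the post above):

// Sanity check from spark-shell: is the configured HDFS store path reachable?
import org.apache.hadoop.fs.{FileSystem, Path}
val fs = FileSystem.get(new java.net.URI("hdfs://localhost:9000"), sc.hadoopConfiguration)
fs.exists(new Path("/carbon/store"))  // false means the store directory does not exist yet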

Re: store location can't be found

ravipesala
Hi Mars,

Please try creating the CarbonSession with the store path, as follows:

val carbon = SparkSession.builder().config(sc.getConf).
  getOrCreateCarbonSession("hdfs://localhost:9000/carbon/store")
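
For reference, a sketch of how this fits into the full spark-shell session from the original post (same table, CSV path, and store path as above; only the store-path argument is new relative to the first post):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.CarbonSession._

// Pass the store path explicitly instead of relying on carbon.properties.
val carbon = SparkSession.builder().config(sc.getConf)
  .getOrCreateCarbonSession("hdfs://localhost:9000/carbon/store")

carbon.sql("CREATE TABLE IF NOT EXISTS test_table(id string, name string, city string, age Int) STORED BY 'carbondata'")
carbon.sql("LOAD DATA INPATH 'hdfs://localhost:9000/resources/sample.csv' INTO TABLE test_table")
carbon.sql("SELECT * FROM test_table").show()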


Regards,
Ravindra.

Re: store location can't be found

Liang Chen
Administrator
In reply to this post by Mars Xu
Hi

Have you configured it as per the guide:
https://github.com/apache/incubator-carbondata/blob/master/docs/installation-guide.md

Regards
Liang

Re: store location can't be found

Mars Xu
In reply to this post by ravipesala
Hi Ravindra,

      I followed your suggestion and created the CarbonSession with the store path, but when loading data it throws another error, as follows:

scala> carbon.sql("LOAD DATA INPATH 'hdfs://localhost:9000/resources/sample.csv' INTO TABLE test_table")
org.apache.spark.sql.AnalysisException: LOAD DATA is not supported for datasource tables: `default`.`test_table`;
  at org.apache.spark.sql.execution.command.LoadDataCommand.run(tables.scala:194)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)

Do you know what is wrong here?
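
The stack trace shows the statement was handled by Spark's own LoadDataCommand, i.e. Spark treated `default`.`test_table` as a plain datasource table rather than a CarbonData table. One possible cause (an assumption, not confirmed in this thread) is that test_table left over from the earlier session is still registered in the metastore in a form Spark does not recognise as a Carbon table. A hedged sketch of a way to rule that out: drop the old table, recreate it through the CarbonSession, and retry the load.

// Sketch (assumes the stale table from the earlier session is the culprit):
carbon.sql("DROP TABLE IF EXISTS test_table")
carbon.sql("CREATE TABLE IF NOT EXISTS test_table(id string, name string, city string, age Int) STORED BY 'carbondata'")
carbon.sql("LOAD DATA INPATH 'hdfs://localhost:9000/resources/sample.csv' INTO TABLE test_table")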

Re: store location can't be found

Mars Xu
In reply to this post by ravipesala
Hello Ravindra,

      I have solved this problem. Thanks.
 

Re: store location can't be found

Liang Chen
Administrator
Hi Mars

Can you share how you solved this issue?

Regards
Liang

Re: store location can't be found

Mars Xu
Hi Liang,
 
        As Ravindra suggested, I created the CarbonSession with the store path as follows:
   val carbon = SparkSession.builder().config(sc.getConf).getOrCreateCarbonSession("hdfs://localhost:9000/carbon/store")
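
With the store path passed explicitly, the segment files should be written under hdfs://localhost:9000/carbon/store rather than under the local default /private/var/carbon.store seen in the first error, so the load and query from the original post go through. A brief continuation of the snippet above (same paths as earlier in the thread):

carbon.sql("LOAD DATA INPATH 'hdfs://localhost:9000/resources/sample.csv' INTO TABLE test_table")
carbon.sql("SELECT * FROM test_table").show()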


