
Re: Dictionary file is locked for updation

Posted by ravipesala on Dec 27, 2016; 8:15am
URL: http://apache-carbondata-dev-mailing-list-archive.168.s1.nabble.com/Dictionary-file-is-locked-for-updation-tp5076p5087.html

Hi,

It seems the store path is falling back to the default location. Did you set
the store location properly? Which Spark version are you using?
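
For example, on Spark 1.x the store location can be passed explicitly when the
CarbonContext is created. A minimal sketch, assuming the store should live at
hdfs://master:9000/carbon/store (adjust the path for your cluster):

  import org.apache.spark.sql.CarbonContext

  // pass the store path explicitly so CarbonData does not fall back to the default location
  val cc = new CarbonContext(sc, "hdfs://master:9000/carbon/store")

The same path should also be set as carbon.storelocation in carbon.properties
on every node, so the executors resolve dictionary and lock files under the
intended HDFS directory.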

Regards,
Ravindra

On Tue, Dec 27, 2016, 1:38 PM 251469031 <[hidden email]> wrote:

> Hi Kumar,
>
>
>   thanks for your reply, the full log is as follows:
>
>
> 16/12/27 12:30:17 INFO locks.HdfsFileLock: Executor task launch worker-0
> HDFS lock
> path:hdfs://master:9000../carbon.store/default/test_table/2e9b7efa-2934-463a-9280-ff50c5129268.lock
> 16/12/27 12:30:17 INFO storage.ShuffleBlockFetcherIterator: Getting 1
> non-empty blocks out of 1 blocks
> 16/12/27 12:30:17 INFO storage.ShuffleBlockFetcherIterator: Started 1
> remote fetches in 1 ms
> 16/12/27 12:30:32 ERROR rdd.CarbonGlobalDictionaryGenerateRDD: Executor
> task launch worker-0
> java.lang.RuntimeException: Dictionary file name is locked for updation.
> Please try after some time
>         at scala.sys.package$.error(package.scala:27)
>         at
> org.apache.carbondata.spark.rdd.CarbonGlobalDictionaryGenerateRDD$$anon$1.<init>(CarbonGlobalDictionaryRDD.scala:364)
>         at
> org.apache.carbondata.spark.rdd.CarbonGlobalDictionaryGenerateRDD.compute(CarbonGlobalDictionaryRDD.scala:302)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
>         at
> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
>         at org.apache.spark.scheduler.Task.run(Task.scala:89)
>         at
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
>         at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>         at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>         at java.lang.Thread.run(Thread.java:745)
>
>
>
> as you can see, the lock file path is:
> hdfs://master:9000../carbon.store/default/test_table/2e9b7efa-2934-463a-9280-ff50c5129268.lock
>
>
>
>
> ------------------ Original Message ------------------
> From: "Kumar Vishal" <[hidden email]>;
> Sent: Tuesday, December 27, 2016, 3:25 PM
> To: "dev" <[hidden email]>;
>
> Subject: Re: Dictionary file is locked for updation
>
>
>
> Hi,
> can you please find the *"HDFS lock path"* string in the executor log and
> let me know the complete log message.
>
> -Regards
> Kumar Vishal
>
> On Tue, Dec 27, 2016 at 12:45 PM, 251469031 <[hidden email]> wrote:
>
> > Hi all,
> >
> >
> > when I run the following script:
> > scala> cc.sql(s"load data inpath 'hdfs://master:9000/carbondata/sample.csv' into table test_table")
> >
> >
> > it turns out that:
> > WARN  27-12 12:37:58,044 - Lost task 1.3 in stage 2.0 (TID 13, slave1):
> > java.lang.RuntimeException: Dictionary file name is locked for updation.
> > Please try after some time
> >
> >
> > What I have done so far:
> > 1. In carbon.properties, set carbon.lock.type=HDFSLOCK
> > 2. Sent carbon.properties & spark-defaults.conf to all nodes of the cluster
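> >
> > For reference, a minimal carbon.properties sketch along these lines (the
> > store location value is only an example and should match the store path
> > the CarbonContext was created with):
> >
> >   carbon.lock.type=HDFSLOCK
> >   carbon.storelocation=hdfs://master:9000/carbon/store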
> >
> >
> > If any of you have any idea, I am looking forward to your reply, thanks~