Re: Error while loading - Table is locked for updation. Please try after some time ( Spark 1.6.2 )

Posted by Sanoj M George on
URL: http://apache-carbondata-dev-mailing-list-archive.168.s1.nabble.com/Error-while-loading-Table-is-locked-for-updation-Please-try-after-some-time-Spark-1-6-2-tp7335p7367.html

Hi Manish,

Could not find any .lock files in the carbon store.

I am getting the error while running spark-shell; I did not try the thrift server. However, as you can see from the attached logs, it is using the default store location (not the one from carbon.properties):

scala> cc.storePath
res0: String = /home/cduser/carbon.store
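
One workaround, if the driver JVM keeps falling back to the local default: in the Spark 1.x API the CarbonContext constructor also accepts an explicit store path as a second argument. A minimal sketch, assuming a running spark-shell with the CarbonData jar on the classpath and using the HDFS store path from my carbon.properties below:

```scala
import org.apache.spark.sql.CarbonContext

// Pass the store location explicitly rather than relying on
// carbon.properties being picked up by the driver JVM.
val cc = new CarbonContext(sc, "hdfs://localhost:9000/opt/CarbonStore")

// Confirm which store the context will actually use.
println(cc.storePath)
```

I have not yet verified whether this changes the behaviour on my setup.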


Thanks,
Sanoj




On Mon, Feb 6, 2017 at 1:23 PM, manish gupta <[hidden email]> wrote:
Hi Sanoj,

Please check if there is any file with .lock extension in the carbon store.
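
For example, something like the following can be pasted into spark-shell to scan for leftover .lock files. This is just an illustrative sketch in plain Scala (for a local store; `findLocks` is a hypothetical helper, not a CarbonData API, and the path should be replaced with your actual store location):

```scala
import java.io.File

// Recursively collect any files ending in .lock under the store directory.
// listFiles can return null for unreadable paths, hence the Option wrapper.
def findLocks(dir: File): Seq[File] =
  Option(dir.listFiles).toSeq.flatten.flatMap { f =>
    if (f.isDirectory) findLocks(f)
    else if (f.getName.endsWith(".lock")) Seq(f)
    else Seq.empty
  }

findLocks(new File("/home/cduser/carbon.store")).foreach(println)
```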

Also, when you start the thrift server, the carbon store location will be printed in
the thrift server logs. Please validate whether there is any mismatch between the
store location you provided and the store location printed in the thrift server
logs.

Also please provide the complete logs for failure.

Regards
Manish Gupta

On Mon, Feb 6, 2017 at 2:18 PM, Sanoj M George <[hidden email]>
wrote:

> Not yet resolved, still getting same error.
>
> On Mon, Feb 6, 2017 at 12:41 PM Raghunandan S <[hidden email]> wrote:
>
> > You mean the issue is resolved?
> >
> > Regards
> > Raghunandan
> >
> > On 06-Feb-2017 1:36 PM, "Sanoj M George" <[hidden email]> wrote:
> >
> > Thanks Raghunandan.  Checked the thread but it seems this error is due to
> > something else.
> >
> > Below are the parameters that I changed:
> >
> > **** carbon.properties :
> > carbon.storelocation=hdfs://localhost:9000/opt/CarbonStore
> > carbon.ddl.base.hdfs.url=hdfs://localhost:9000/opt/data
> > carbon.kettle.home=/home/cduser/spark/carbonlib/carbonplugins
> >
> > **** spark-defaults.conf  :
> > carbon.kettle.home
> > /home/cduser/spark/carbonlib/carbonplugins
> > spark.driver.extraJavaOptions
> > -Dcarbon.properties.filepath=/home/cduser/spark/conf/carbon.properties
> > spark.executor.extraJavaOptions
> > -Dcarbon.properties.filepath=/home/cduser/spark/conf/carbon.properties
> >
> > Although store location is specified in carbon.properties, spark-shell
> was
> > using "/home/cduser/carbon.store" as store location.
> >
> > Regards
> >
> > On Sun, Feb 5, 2017 at 4:49 PM, Raghunandan S <[hidden email]> wrote:
> >
> > > Dear sanoj,
> > > Please refer to
> > > http://apache-carbondata-mailing-list-archive.1130556.n5.nabble.com/Dictionary-file-is-locked-for-updation-td5076.html
> > >
> > > Let me know if this thread didn't address your problem.
> > >
> > > Regards
> > >
> > >
> > > On 05-Feb-2017 5:22 PM, "Sanoj M George" <[hidden email]> wrote:
> > >
> > > Hi All,
> > >
> > > I am getting below error while trying out Carbondata with Spark 1.6.2 /
> > > Hadoop 2.6.5 / Carbondata 1.
> > >
> > > ./bin/spark-shell --jars carbonlib/carbondata_2.10-1.1.0-incubating-SNAPSHOT-shade-hadoop2.2.0.jar
> > > scala> import org.apache.spark.sql.CarbonContext
> > > scala> val cc = new CarbonContext(sc)
> > > scala> cc.sql("CREATE TABLE IF NOT EXISTS t1 (id string, name string, city string, age Int) STORED BY 'carbondata'")
> > > scala> cc.sql("LOAD DATA INPATH '/home/cduser/spark/sample.csv' INTO TABLE t1")
> > > INFO  05-02 14:57:22,346 - main Query [LOAD DATA INPATH '/HOME/CDUSER/SPARK/SAMPLE.CSV' INTO TABLE T1]
> > > INFO  05-02 14:57:37,411 - Table MetaData Unlocked Successfully after data load
> > > java.lang.RuntimeException: Table is locked for updation. Please try after some time
> > >         at scala.sys.package$.error(package.scala:27)
> > >         at org.apache.spark.sql.execution.command.LoadTable.run(carbonTableSchema.scala:360)
> > >         at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
> > >         at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
> > >
> > >
> > > I followed the docs at
> > > https://github.com/apache/incubator-carbondata/blob/master/docs/installation-guide.md#installing-and-configuring-carbondata-on-standalone-spark-cluster
> > > and
> > > https://github.com/apache/incubator-carbondata/blob/master/docs/quick-start-guide.md
> > > to install carbondata.
> > >
> > > While creating the table, I observed the below WARN message in the log:
> > >
> > > main Query [CREATE TABLE DEFAULT.T1 USING CARBONDATA OPTIONS (TABLENAME "DEFAULT.T1", TABLEPATH "/HOME/CDUSER/CARBON.STORE/DEFAULT/T1") ]
> > >
> > > WARN  05-02 14:34:30,656 - Couldn't find corresponding Hive SerDe for data source provider carbondata. Persisting data source relation `default`.`t1` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
> > > INFO  05-02 14:34:30,755 - 0: create_table: Table(tableName:t1, dbName:default, owner:cduser, createTime:1486290870, lastAccessTime:0, retention:0, sd:StorageDescriptor(cols:[FieldSchema(name:col, type:array<string>, comment:from deserializer)], location:null, inputFormat:org.apache.hadoop.mapred.SequenceFileInputFormat, outputFormat:org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat, compressed:false, numBuckets:-1, serdeInfo:SerDeInfo(name:null, serializationLib:org.apache.hadoop.hive.serde2.MetadataTypedColumnsetSerDe, parameters:{tablePath=/home/cduser/carbon.store/default/t1, serialization.format=1, tableName=default.t1}), bucketCols:[], sortCols:[], parameters:{}, skewedInfo:SkewedInfo(skewedColNames:[], skewedColValues:[], skewedColValueLocationMaps:{})), partitionKeys:[], parameters:{EXTERNAL=TRUE, spark.sql.sources.provider=carbondata}, viewOriginalText:null, viewExpandedText:null, tableType:MANAGED_TABLE, privileges:PrincipalPrivilegeSet(userPrivileges:{}, groupPrivileges:null, rolePrivileges:null))
> > >
> > >
> > > Appreciate any help in resolving this.
> > >
> > > Thanks,
> > > Sanoj
> > >
> >
> --
> Sent from my iPhone
>


Attachment: spark-shell-log.txt (49K)