[jira] [Resolved] (CARBONDATA-3329) DeadLock is observed when a query fails.


Akash R Nilugal (Jira)

     [ https://issues.apache.org/jira/browse/CARBONDATA-3329?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala resolved CARBONDATA-3329.
-----------------------------------------
       Resolution: Fixed
    Fix Version/s: 1.5.3

> DeadLock is observed when a query fails.
> ----------------------------------------
>
>                 Key: CARBONDATA-3329
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-3329
>             Project: CarbonData
>          Issue Type: New Feature
>            Reporter: Kunal Kapoor
>            Assignee: Kunal Kapoor
>            Priority: Major
>             Fix For: 1.5.3
>
>          Time Spent: 40m
>  Remaining Estimate: 0h
>
> "HiveServer2-Handler-Pool: Thread-303" #303 prio=5 os_prio=0 tid=0x00007fcfe129f800 nid=0x59eb9 waiting for monitor entry [0x00007fcfd3c42000]
>  java.lang.Thread.State: BLOCKED (on object monitor)
>  at org.apache.log4j.Category.callAppenders(Category.java:204)
>  - waiting to lock <0x00007fd046f9ed60> (a org.apache.log4j.spi.RootLogger)
>  at org.apache.log4j.Category.forcedLog(Category.java:391)
>  at org.apache.log4j.Category.log(Category.java:856)
>  at org.slf4j.impl.Log4jLoggerAdapter.log(Log4jLoggerAdapter.java:581)
>  at org.apache.commons.logging.impl.SLF4JLocationAwareLog.info(SLF4JLocationAwareLog.java:155)
>  at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:622)
>  at sun.reflect.GeneratedMethodAccessor33.invoke(Unknown Source)
>  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498)
>  at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
>  at com.sun.proxy.$Proxy28.close(Unknown Source)
>  at sun.reflect.GeneratedMethodAccessor33.invoke(Unknown Source)
>  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498)
>  at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2107)
>  - locked <0x00007fd05611ef38> (a org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler)
>  at com.sun.proxy.$Proxy28.close(Unknown Source)
>  at org.apache.hadoop.hive.ql.metadata.Hive.close(Hive.java:294)
>  at org.apache.hadoop.hive.ql.metadata.Hive.access$000(Hive.java:141)
>  at org.apache.hadoop.hive.ql.metadata.Hive$1.remove(Hive.java:161)
>  - locked <0x00007fd051ba0bb0> (a org.apache.hadoop.hive.ql.metadata.Hive$1)
>  at org.apache.hadoop.hive.ql.metadata.Hive.closeCurrent(Hive.java:264)
>  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:294)
>  at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:246)
>  at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:245)
>  - locked <0x00007fd04cbc4c78> (a org.apache.spark.sql.hive.client.hiveClientObject)
>  at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:292)
>  at org.apache.spark.sql.hive.client.HiveClientImpl.databaseExists(HiveClientImpl.scala:388)
>  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:178)
>  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:178)
>  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:178)
>  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
>  - locked <0x00007fd04ce5ff48> (a org.apache.spark.sql.hive.HiveExternalCatalog)
>  at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:177)
>  at org.apache.spark.sql.catalyst.catalog.SessionCatalog.databaseExists(SessionCatalog.scala:198)
>  at org.apache.spark.sql.catalyst.catalog.SessionCatalog.org$apache$spark$sql$catalyst$catalog$SessionCatalog$$requireDbExists(SessionCatalog.scala:138)
>  at org.apache.spark.sql.catalyst.catalog.SessionCatalog.getDatabaseMetadata(SessionCatalog.scala:192)
>  at org.apache.spark.sql.getDB$.getDBLocation(CarbonCatalystOperators.scala:107)
>  at org.apache.spark.sql.hive.CarbonMetastore$$anonfun$loadMetadata$1.apply(CarbonMetastore.scala:253)
>  at org.apache.spark.sql.hive.CarbonMetastore$$anonfun$loadMetadata$1.apply(CarbonMetastore.scala:251)
>  at scala.collection.Iterator$class.foreach(Iterator.scala:893)
>  at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
>  at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>  at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>  at org.apache.spark.sql.hive.CarbonMetastore.loadMetadata(CarbonMetastore.scala:251)
>  at org.apache.spark.sql.hive.CarbonMetastore.<init>(CarbonMetastore.scala:134)
>  at org.apache.spark.sql.CarbonEnv.init(CarbonEnv.scala:83)
>  at org.apache.spark.sql.hive.CarbonSessionCatalog.carbonEnv$lzycompute(CarbonSessionState.scala:72)
>  - locked <0x00007fd058434528> (a org.apache.spark.sql.hive.CarbonSessionCatalog)
>  at org.apache.spark.sql.hive.CarbonSessionCatalog.carbonEnv(CarbonSessionState.scala:70)
>  at org.apache.spark.sql.CarbonEnv$.getInstance(CarbonEnv.scala:123)
>  at org.apache.spark.sql.CarbonSession$.updateSessionInfoToCurrentThread(CarbonSession.scala:192)
>  at org.apache.spark.sql.parser.CarbonSparkSqlParser.parsePlan(CarbonSparkSqlParser.scala:46)
>  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:600)
>  at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:699)
>  at org.apache.spark.sql.hive.thriftserver.SparkSQLSessionManager.openSession(SparkSQLSessionManager.scala:100)
>  at org.apache.hive.service.cli.CLIService.openSession(CLIService.java:194)
>  at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:652)
>  at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:473)
>  at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1253)
>  at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1238)
>  at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>  at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>  at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:690)
>  at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
>  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>  at java.lang.Thread.run(Thread.java:748)
> "pool-26-thread-3" #302 prio=5 os_prio=0 tid=0x00007fd03b231800 nid=0x59dc8 waiting for monitor entry [0x00007fcfd282d000]
>  java.lang.Thread.State: BLOCKED (on object monitor)
>  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
>  - waiting to lock <0x00007fd04ce5ff48> (a org.apache.spark.sql.hive.HiveExternalCatalog)
>  at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:177)
>  at org.apache.spark.sql.catalyst.catalog.SessionCatalog.databaseExists(SessionCatalog.scala:198)
>  at org.apache.spark.sql.catalyst.catalog.SessionCatalog.org$apache$spark$sql$catalyst$catalog$SessionCatalog$$requireDbExists(SessionCatalog.scala:138)
>  at org.apache.spark.sql.catalyst.catalog.SessionCatalog.getTableMetadata(SessionCatalog.scala:312)
>  at org.apache.spark.sql.hive.CarbonMetastore.indexInfoFromHive(CarbonMetastore.scala:363)
>  at org.apache.spark.sql.hive.CarbonMetastore.refreshIndexInfo(CarbonMetastore.scala:303)
>  at org.apache.spark.sql.hive.CarbonMetastore.lookupRelation(CarbonMetastore.scala:162)
>  at org.apache.spark.sql.hive.CarbonMetastore.lookupRelation(CarbonMetastore.scala:147)
>  at org.apache.spark.sql.CarbonDatasourceHadoopRelation.carbonRelation$lzycompute(CarbonDatasourceHadoopRelation.scala:65)
>  - locked <0x00007fd056400268> (a org.apache.spark.sql.CarbonDatasourceHadoopRelation)
>  at org.apache.spark.sql.CarbonDatasourceHadoopRelation.carbonRelation(CarbonDatasourceHadoopRelation.scala:63)
>  at org.apache.spark.sql.CarbonDatasourceHadoopRelation.carbonTable$lzycompute(CarbonDatasourceHadoopRelation.scala:62)
>  - locked <0x00007fd056400268> (a org.apache.spark.sql.CarbonDatasourceHadoopRelation)
>  at org.apache.spark.sql.CarbonDatasourceHadoopRelation.carbonTable(CarbonDatasourceHadoopRelation.scala:62)
>  at org.apache.spark.sql.CarbonDatasourceHadoopRelation.toString(CarbonDatasourceHadoopRelation.scala:88)
>  at java.lang.String.valueOf(String.java:2994)
>  at java.lang.StringBuilder.append(StringBuilder.java:131)
>  at scala.StringContext.standardInterpolator(StringContext.scala:125)
>  at scala.StringContext.s(StringContext.scala:95)
>  at org.apache.spark.sql.execution.datasources.LogicalRelation.simpleString(LogicalRelation.scala:106)
>  at org.apache.spark.sql.catalyst.plans.QueryPlan.verboseString(QueryPlan.scala:337)
>  at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:552)
>  at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:572)
>  at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:572)
>  at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:572)
>  at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:572)
>  at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:572)
>  at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:572)
>  at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:572)
>  at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:572)
>  at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:572)
>  at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:572)
>  at org.apache.spark.sql.catalyst.trees.TreeNode.generateTreeString(TreeNode.scala:572)
>  at org.apache.spark.sql.catalyst.trees.TreeNode.treeString(TreeNode.scala:476)
>  at org.apache.spark.sql.catalyst.trees.TreeNode.treeString(TreeNode.scala:473)
>  at org.apache.spark.sql.catalyst.trees.TreeNode.toString(TreeNode.scala:470)
>  at java.lang.String.valueOf(String.java:2994)
>  at java.lang.StringBuilder.append(StringBuilder.java:131)
>  at scala.StringContext.standardInterpolator(StringContext.scala:125)
>  at scala.StringContext.s(StringContext.scala:95)
>  at org.apache.spark.sql.AnalysisException$$anonfun$1.apply(AnalysisException.scala:46)
>  at org.apache.spark.sql.AnalysisException$$anonfun$1.apply(AnalysisException.scala:46)
>  at scala.Option.map(Option.scala:146)
>  at org.apache.spark.sql.AnalysisException.getMessage(AnalysisException.scala:46)
>  at java.lang.Throwable.getLocalizedMessage(Throwable.java:391)
>  at java.lang.Throwable.toString(Throwable.java:480)
>  at java.lang.String.valueOf(String.java:2994)
>  at java.io.PrintWriter.println(PrintWriter.java:754)
>  at java.lang.Throwable$WrappedPrintWriter.println(Throwable.java:764)
>  at java.lang.Throwable.printStackTrace(Throwable.java:655)
>  - locked <0x00007fd0582f3a50> (a java.io.PrintWriter)
>  at java.lang.Throwable.printStackTrace(Throwable.java:721)
>  at org.apache.log4j.DefaultThrowableRenderer.render(DefaultThrowableRenderer.java:60)
>  at org.apache.log4j.spi.ThrowableInformation.getThrowableStrRep(ThrowableInformation.java:87)
>  - locked <0x00007fd0582f3af8> (a org.apache.log4j.spi.ThrowableInformation)
>  at org.apache.log4j.spi.LoggingEvent.getThrowableStrRep(LoggingEvent.java:413)
>  at org.apache.log4j.WriterAppender.subAppend(WriterAppender.java:313)
>  at org.apache.log4j.RollingFileAppender.subAppend(RollingFileAppender.java:276)
>  at org.apache.log4j.WriterAppender.append(WriterAppender.java:162)
>  at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
>  - locked <0x00007fd046feecb0> (a org.apache.log4j.RollingFileAppender)
>  at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
>  at org.apache.log4j.Category.callAppenders(Category.java:206)
>  - locked <0x00007fd046f9ed60> (a org.apache.log4j.spi.RootLogger)
>  at org.apache.log4j.Category.forcedLog(Category.java:391)
>  at org.apache.log4j.Category.log(Category.java:856)
>  at org.slf4j.impl.Log4jLoggerAdapter.error(Log4jLoggerAdapter.java:576)
>  at org.apache.spark.internal.Logging$class.logError(Logging.scala:91)
>  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.logError(SparkExecuteStatementOperation.scala:51)
>  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:324)
>  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:183)
>  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:180)
>  at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:422)
>  
>  at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1.run(SparkExecuteStatementOperation.scala:193)
>  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>  at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>  at java.lang.Thread.run(Thread.java:748)
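The two dumps above show a classic lock-ordering inversion: the first thread holds the `HiveExternalCatalog` monitor (`0x00007fd04ce5ff48`) and blocks waiting for the log4j `RootLogger` monitor (`0x00007fd046f9ed60`) inside `Category.callAppenders`, while the second thread holds the `RootLogger` monitor and, while rendering an `AnalysisException` (whose `toString()` walks the plan and calls `CarbonDatasourceHadoopRelation.carbonTable`, which goes back through `HiveExternalCatalog.databaseExists`), blocks waiting for the catalog monitor. The sketch below is a minimal, hypothetical reproduction of that pattern (the lock objects and thread names are stand-ins, not the actual classes); it forces the same opposite-order acquisition and then confirms the JVM sees a monitor deadlock via `ThreadMXBean`.

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;
import java.util.concurrent.CountDownLatch;

public class DeadlockSketch {
    // Hypothetical stand-ins for the two monitors in the dumps:
    // "catalog" plays the role of HiveExternalCatalog,
    // "logger" plays the role of the log4j RootLogger.
    static final Object catalog = new Object();
    static final Object logger = new Object();

    /** Returns true if the JVM detects the resulting monitor deadlock. */
    static boolean reproduces() throws InterruptedException {
        CountDownLatch bothHoldFirstLock = new CountDownLatch(2);

        // Thread A: session-open path. Holds the catalog lock, then logs;
        // log4j 1.x's Category.callAppenders synchronizes on the root logger.
        Thread a = new Thread(() -> {
            synchronized (catalog) {
                bothHoldFirstLock.countDown();
                await(bothHoldFirstLock);
                synchronized (logger) { /* e.g. HiveMetaStoreClient.close() info log */ }
            }
        }, "HiveServer2-Handler-sketch");

        // Thread B: failed-query path. Holds the logger lock while rendering the
        // exception, whose toString() re-enters the catalog (databaseExists).
        Thread b = new Thread(() -> {
            synchronized (logger) {
                bothHoldFirstLock.countDown();
                await(bothHoldFirstLock);
                synchronized (catalog) { /* plan toString() -> catalog lookup */ }
            }
        }, "pool-thread-sketch");

        // Daemon threads: they stay blocked forever, but won't pin the JVM.
        a.setDaemon(true);
        b.setDaemon(true);
        a.start();
        b.start();
        Thread.sleep(500); // give both threads time to block on the second lock

        ThreadMXBean mx = ManagementFactory.getThreadMXBean();
        return mx.findDeadlockedThreads() != null;
    }

    static void await(CountDownLatch latch) {
        try {
            latch.await();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(reproduces() ? "deadlock detected" : "no deadlock");
    }
}
```

The usual fixes for this pattern are to stop doing work that can take other locks while holding the logger's monitor (for example, building the exception message eagerly instead of lazily walking the plan inside `getMessage()`), or to avoid logging while holding the catalog lock; either change breaks the cycle.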



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)