[jira] [Updated] (CARBONDATA-2480) Search mode RuntimeException: Error while resolving filter expression


Akash R Nilugal (Jira)

     [ https://issues.apache.org/jira/browse/CARBONDATA-2480?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

xubo245 updated CARBONDATA-2480:
--------------------------------
    Description:
Errors occur when running org.apache.carbondata.examples.SearchModeExample or org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase.

The first error:
{code:java}
java.lang.RuntimeException: Error while resolving filter expression
        at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:924)
        at org.apache.spark.rpc.Master.chooseFGDataMap(Master.scala:261)
        at org.apache.spark.rpc.Master.search(Master.scala:218)
        at org.apache.carbondata.store.SparkCarbonStore.search(SparkCarbonStore.scala:144)
        at org.apache.spark.sql.CarbonSession.runSearch(CarbonSession.scala:225)
        at org.apache.spark.sql.CarbonSession.org$apache$spark$sql$CarbonSession$$trySearchMode(CarbonSession.scala:180)
        at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:100)
        at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:97)
        at org.apache.spark.sql.CarbonSession.withProfiler(CarbonSession.scala:156)
        at org.apache.spark.sql.CarbonSession.sql(CarbonSession.scala:95)
        at org.apache.carbondata.examples.SearchModeExample$$anonfun$3$$anon$1.run(SearchModeExample.scala:168)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
        at java.util.concurrent.FutureTask.run(FutureTask.java)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NullPointerException
        at org.apache.carbondata.core.scan.filter.resolver.RowLevelFilterResolverImpl.resolve(RowLevelFilterResolverImpl.java:74)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:252)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:251)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolvertree(FilterExpressionProcessor.java:232)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolver(FilterExpressionProcessor.java:84)
        at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:921)
        ... 16 more
18/05/15 11:27:00 ERROR CarbonSession: Exception when executing search mode: Error while resolving filter expression, fallback to SparkSQL
search mode asynchronous query
java.lang.RuntimeException: Error while resolving filter expression
        at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:924)
        at org.apache.spark.rpc.Master.chooseFGDataMap(Master.scala:261)
        at org.apache.spark.rpc.Master.search(Master.scala:218)
        at org.apache.carbondata.store.SparkCarbonStore.search(SparkCarbonStore.scala:144)
        at org.apache.spark.sql.CarbonSession.runSearch(CarbonSession.scala:225)
        at org.apache.spark.sql.CarbonSession.org$apache$spark$sql$CarbonSession$$trySearchMode(CarbonSession.scala:180)
        at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:100)
        at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:97)
        at org.apache.spark.sql.CarbonSession.withProfiler(CarbonSession.scala:156)
        at org.apache.spark.sql.CarbonSession.sql(CarbonSession.scala:95)
        at org.apache.carbondata.examples.SearchModeExample$$anonfun$3$$anon$1.run(SearchModeExample.scala:168)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
        at java.util.concurrent.FutureTask.run(FutureTask.java)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NullPointerException
        at org.apache.carbondata.core.scan.filter.resolver.RowLevelFilterResolverImpl.resolve(RowLevelFilterResolverImpl.java:74)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:252)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:251)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolvertree(FilterExpressionProcessor.java:232)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolver(FilterExpressionProcessor.java:84)
        at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:921)
        ... 16 more
18/05/15 11:27:03 ERROR CarbonSession: Exception when executing search mode: Error while resolving filter expression, fallback to SparkSQL
java.lang.RuntimeException: Error while resolving filter expression
        at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:924)
        at org.apache.spark.rpc.Master.chooseFGDataMap(Master.scala:261)
        at org.apache.spark.rpc.Master.search(Master.scala:218)
        at org.apache.carbondata.store.SparkCarbonStore.search(SparkCarbonStore.scala:144)
        at org.apache.spark.sql.CarbonSession.runSearch(CarbonSession.scala:225)
        at org.apache.spark.sql.CarbonSession.org$apache$spark$sql$CarbonSession$$trySearchMode(CarbonSession.scala:180)
        at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:100)
        at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:97)
        at org.apache.spark.sql.CarbonSession.withProfiler(CarbonSession.scala:156)
        at org.apache.spark.sql.CarbonSession.sql(CarbonSession.scala:95)
        at org.apache.carbondata.examples.SearchModeExample$$anonfun$3$$anon$1.run(SearchModeExample.scala:168)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
        at java.util.concurrent.FutureTask.run(FutureTask.java)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NullPointerException
        at org.apache.carbondata.core.scan.filter.resolver.RowLevelFilterResolverImpl.resolve(RowLevelFilterResolverImpl.java:74)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:252)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:251)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolvertree(FilterExpressionProcessor.java:232)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolver(FilterExpressionProcessor.java:84)
        at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:921)
        ... 16 more
18/05/15 11:27:04 ERROR CarbonSession: Exception when executing search mode: Error while resolving filter expression, fallback to SparkSQL
java.lang.RuntimeException: Error while resolving filter expression
        at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:924)
        at org.apache.spark.rpc.Master.chooseFGDataMap(Master.scala:261)
        at org.apache.spark.rpc.Master.search(Master.scala:218)
        at org.apache.carbondata.store.SparkCarbonStore.search(SparkCarbonStore.scala:144)
        at org.apache.spark.sql.CarbonSession.runSearch(CarbonSession.scala:225)
        at org.apache.spark.sql.CarbonSession.org$apache$spark$sql$CarbonSession$$trySearchMode(CarbonSession.scala:180)
        at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:100)
        at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:97)
        at org.apache.spark.sql.CarbonSession.withProfiler(CarbonSession.scala:156)
        at org.apache.spark.sql.CarbonSession.sql(CarbonSession.scala:95)
        at org.apache.carbondata.examples.SearchModeExample$$anonfun$3$$anon$1.run(SearchModeExample.scala:168)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
        at java.util.concurrent.FutureTask.run(FutureTask.java)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NullPointerException
        at org.apache.carbondata.core.scan.filter.resolver.RowLevelFilterResolverImpl.resolve(RowLevelFilterResolverImpl.java:74)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:252)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:251)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolvertree(FilterExpressionProcessor.java:232)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolver(FilterExpressionProcessor.java:84)
        at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:921)
        ... 16 more
18/05/15 11:27:04 ERROR CarbonSession: Exception when executing search mode: Error while resolving filter expression, fallback to SparkSQL
java.lang.RuntimeException: Error while resolving filter expression
        at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:924)
        at org.apache.spark.rpc.Master.chooseFGDataMap(Master.scala:261)
        at org.apache.spark.rpc.Master.search(Master.scala:218)
        at org.apache.carbondata.store.SparkCarbonStore.search(SparkCarbonStore.scala:144)
        at org.apache.spark.sql.CarbonSession.runSearch(CarbonSession.scala:225)
        at org.apache.spark.sql.CarbonSession.org$apache$spark$sql$CarbonSession$$trySearchMode(CarbonSession.scala:180)
        at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:100)
        at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:97)
        at org.apache.spark.sql.CarbonSession.withProfiler(CarbonSession.scala:156)
        at org.apache.spark.sql.CarbonSession.sql(CarbonSession.scala:95)
        at org.apache.carbondata.examples.SearchModeExample$$anonfun$3$$anon$1.run(SearchModeExample.scala:168)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
        at java.util.concurrent.FutureTask.run(FutureTask.java)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NullPointerException
        at org.apache.carbondata.core.scan.filter.resolver.RowLevelFilterResolverImpl.resolve(RowLevelFilterResolverImpl.java:74)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:252)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:251)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolvertree(FilterExpressionProcessor.java:232)
        at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolver(FilterExpressionProcessor.java:84)
        at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:921)
        ... 16 more
{code}
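For reference, the trace shows the queries being submitted from a thread pool (SearchModeExample.scala:168). Below is a minimal sketch of that access pattern, assuming a local CarbonData build with search mode available; the store path, app name, and pool size are illustrative, while the table main and the query text are taken from the log of the second error. It is not the actual code of SearchModeExample.

{code:scala}
// Illustrative sketch of the access pattern in the trace above; not the
// actual code of org.apache.carbondata.examples.SearchModeExample.
import java.util.concurrent.Executors

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.CarbonSession._ // provides getOrCreateCarbonSession

object SearchModeNpeRepro {
  def main(args: Array[String]): Unit = {
    // The store path is an assumption for a local run.
    val spark = SparkSession.builder()
      .master("local")
      .appName("SearchModeNpeRepro")
      .getOrCreateCarbonSession("/tmp/carbon.store")

    // Enable search mode (the log of the second error later disables it
    // again with "set carbon.search.enabled = false").
    spark.sql("set carbon.search.enabled = true")

    // Run filter queries concurrently, as SearchModeExample does. Each query
    // goes through Master.search -> CarbonTable.resolveFilter, where the
    // NullPointerException above is raised.
    val pool = Executors.newFixedThreadPool(4)
    (1 to 4).foreach { _ =>
      pool.submit(new Runnable {
        override def run(): Unit =
          spark.sql("select id from main where id = '3' limit 10").show()
      })
    }
    pool.shutdown()
  }
}
{code}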


The second error:


{code:java}
18/05/14 20:51:20 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() on RPC id 6929993627477870468
java.io.InvalidClassException: scala.collection.convert.Wrappers$MutableSetWrapper; no valid constructor
        at java.io.ObjectStreamClass$ExceptionInfo.newInvalidClassException(ObjectStreamClass.java:157)
        at java.io.ObjectStreamClass.checkDeserialize(ObjectStreamClass.java:862)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2041)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
        at java.util.ArrayList.readObject(ArrayList.java:797)
        at sun.reflect.GeneratedMethodAccessor53.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1158)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2176)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
        at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:270)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
        at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:319)
        at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:269)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
        at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:268)
        at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:603)
        at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:654)
        at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:639)
        at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:157)
        at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:105)
        at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
        at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
        at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:652)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:575)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:489)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:451)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
        at java.lang.Thread.run(Thread.java:748)
org.apache.spark.SparkException: Exception thrown in awaitResult:
        at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
        at org.apache.spark.rpc.Master$$anonfun$search$1.apply(Master.scala:236)
        at org.apache.spark.rpc.Master$$anonfun$search$1.apply(Master.scala:231)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
        at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
        at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
        at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
        at org.apache.spark.rpc.Master.search(Master.scala:231)
        at org.apache.carbondata.store.SparkCarbonStore.search(SparkCarbonStore.scala:144)
        at org.apache.spark.sql.CarbonSession.runSearch(CarbonSession.scala:225)
        at org.apache.spark.sql.CarbonSession.org$apache$spark$sql$CarbonSession$$trySearchMode(CarbonSession.scala:186)
        at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:100)
        at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:97)
        at org.apache.spark.sql.CarbonSession.withProfiler(CarbonSession.scala:156)
        at org.apache.spark.sql.CarbonSession.sql(CarbonSession.scala:95)
        at org.apache.spark.sql.test.Spark2TestQueryExecutor.sql(Spark2TestQueryExecutor.scala:35)
        at org.apache.spark.sql.test.util.QueryTest.sql(QueryTest.scala:113)
        at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase.org$apache$carbondata$spark$testsuite$detailquery$SearchModeTestCase$$checkSearchAnswer(SearchModeTestCase.scala:58)
        at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase$$anonfun$10$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SearchModeTestCase.scala:108)
        at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase$$anonfun$10$$anonfun$apply$mcV$sp$1.apply(SearchModeTestCase.scala:105)
        at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase$$anonfun$10$$anonfun$apply$mcV$sp$1.apply(SearchModeTestCase.scala:105)
        at org.apache.spark.sql.catalyst.util.package$.benchmark(package.scala:129)
        at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase$$anonfun$10.apply$mcV$sp(SearchModeTestCase.scala:105)
        at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase$$anonfun$10.apply(SearchModeTestCase.scala:103)
        at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase$$anonfun$10.apply(SearchModeTestCase.scala:103)
        at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
        at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
        at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
        at org.scalatest.Transformer.apply(Transformer.scala:22)
        at org.scalatest.Transformer.apply(Transformer.scala:20)
        at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:166)
        at org.apache.spark.sql.test.util.CarbonFunSuite.withFixture(CarbonFunSuite.scala:41)
        at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:163)
        at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
        at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
        at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
        at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:175)
        at org.scalatest.FunSuite.runTest(FunSuite.scala:1555)
        at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
        at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
        at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
        at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
        at scala.collection.immutable.List.foreach(List.scala:381)
        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
        at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
        at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
        at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:208)
        at org.scalatest.FunSuite.runTests(FunSuite.scala:1555)
        at org.scalatest.Suite$class.run(Suite.scala:1424)
        at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1555)
        at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
        at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
        at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
        at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:212)
        at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase.org$scalatest$BeforeAndAfterAll$$super$run(SearchModeTestCase.scala:31)
        at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:257)
        at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:256)
        at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase.run(SearchModeTestCase.scala:31)
        at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
        at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2563)
        at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2557)
        at scala.collection.immutable.List.foreach(List.scala:381)
        at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:2557)
        at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1044)
        at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1043)
        at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:2722)
        at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1043)
        at org.scalatest.tools.Runner$.run(Runner.scala:883)
        at org.scalatest.tools.Runner.run(Runner.scala)
        at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2(ScalaTestRunner.java:131)
        at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:28)
Caused by: java.lang.RuntimeException: java.io.InvalidClassException: scala.collection.convert.Wrappers$MutableSetWrapper; no valid constructor
        at java.io.ObjectStreamClass$ExceptionInfo.newInvalidClassException(ObjectStreamClass.java:157)
        at java.io.ObjectStreamClass.checkDeserialize(ObjectStreamClass.java:862)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2041)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
        at java.util.ArrayList.readObject(ArrayList.java:797)
        at sun.reflect.GeneratedMethodAccessor53.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1158)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2176)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
        at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:270)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
        at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:319)
        at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:269)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
        at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:268)
        at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:603)
        at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:654)
        at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:639)
        at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:157)
        at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:105)
        at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
        at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
        at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:652)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:575)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:489)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:451)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
        at java.lang.Thread.run(Thread.java:748)

        at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:207)
        at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:120)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
        at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
        at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:652)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:575)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:489)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:451)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
        at java.lang.Thread.run(Thread.java:748)
18/05/14 20:51:20 ERROR CarbonSession: Exception when executing search mode: Exception thrown in awaitResult: , fallback to SparkSQL
18/05/14 20:51:20 INFO CarbonSparkSqlParser: Parsing command: select id from main where id = '3' limit 10
18/05/14 20:51:20 INFO HiveMetaStore: 0: get_table : db=default tbl=main
18/05/14 20:51:20 INFO audit: ugi=xubo ip=unknown-ip-addr cmd=get_table : db=default tbl=main
18/05/14 20:51:20 INFO CatalystSqlParser: Parsing command: array<string>
18/05/14 20:51:20 INFO CarbonLateDecodeRule: ScalaTest-run-running-SearchModeTestCase skip CarbonOptimizer
18/05/14 20:51:20 INFO CarbonLateDecodeRule: ScalaTest-run-running-SearchModeTestCase Skip CarbonOptimizer
18/05/14 20:51:20 INFO CodeGenerator: Code generated in 4.758886 ms
18/05/14 20:51:20 INFO DistributionUtil$: ScalaTest-run-running-SearchModeTestCase Executors configured : 1
18/05/14 20:51:20 INFO DistributionUtil$: ScalaTest-run-running-SearchModeTestCase Total Time taken to ensure the required executors : 1
18/05/14 20:51:20 INFO DistributionUtil$: ScalaTest-run-running-SearchModeTestCase Time elapsed to allocate the required executors: 0
18/05/14 20:51:20 INFO CarbonScanRDD:
 Identified no.of.blocks: 1,
 no.of.tasks: 1,
 no.of.nodes: 1,
 parallelism: 0
       
18/05/14 20:51:20 INFO SparkContext: Starting job: collect at SearchModeTestCase.scala:54
18/05/14 20:51:20 INFO DAGScheduler: Got job 17 (collect at SearchModeTestCase.scala:54) with 1 output partitions
18/05/14 20:51:20 INFO DAGScheduler: Final stage: ResultStage 23 (collect at SearchModeTestCase.scala:54)
18/05/14 20:51:20 INFO DAGScheduler: Parents of final stage: List()
18/05/14 20:51:20 INFO DAGScheduler: Missing parents: List()
18/05/14 20:51:20 INFO DAGScheduler: Submitting ResultStage 23 (MapPartitionsRDD[80] at collect at SearchModeTestCase.scala:54), which has no missing parents
18/05/14 20:51:20 INFO MemoryStore: Block broadcast_23 stored as values in memory (estimated size 32.1 KB, free 2004.0 MB)
18/05/14 20:51:20 INFO MemoryStore: Block broadcast_23_piece0 stored as bytes in memory (estimated size 26.5 KB, free 2004.0 MB)
18/05/14 20:51:20 INFO BlockManagerInfo: Added broadcast_23_piece0 in memory on 192.168.44.90:63398 (size: 26.5 KB, free: 2004.3 MB)
18/05/14 20:51:20 INFO SparkContext: Created broadcast 23 from broadcast at DAGScheduler.scala:1006
18/05/14 20:51:20 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 23 (MapPartitionsRDD[80] at collect at SearchModeTestCase.scala:54) (first 15 tasks are for partitions Vector(0))
18/05/14 20:51:20 INFO TaskSchedulerImpl: Adding task set 23.0 with 1 tasks
18/05/14 20:51:20 INFO TaskSetManager: Starting task 0.0 in stage 23.0 (TID 822, localhost, executor driver, partition 0, PROCESS_LOCAL, 5955 bytes)
18/05/14 20:51:20 INFO Executor: Running task 0.0 in stage 23.0 (TID 822)
18/05/14 20:51:20 INFO AbstractQueryExecutor: [Executor task launch worker for task 822][partitionID:main;queryID:65652430146138] Query will be executed on table: main
18/05/14 20:51:20 INFO ResultCollectorFactory: pool-36-thread-7 Row based dictionary collector is used to scan and collect the data
18/05/14 20:51:20 INFO UnsafeMemoryManager: [Executor task launch worker for task 822][partitionID:main;queryID:65652430146138] Total memory used after task 65652473133085 is 3687 Current tasks running now are : [65647314993698, 65639014191902, 65647481932757, 65644793091292, 65639048545043, 65641096541779]
18/05/14 20:51:20 INFO Executor: Finished task 0.0 in stage 23.0 (TID 822). 1162 bytes result sent to driver
18/05/14 20:51:20 INFO TaskSetManager: Finished task 0.0 in stage 23.0 (TID 822) in 27 ms on localhost (executor driver) (1/1)
18/05/14 20:51:20 INFO TaskSchedulerImpl: Removed TaskSet 23.0, whose tasks have all completed, from pool
18/05/14 20:51:20 INFO DAGScheduler: ResultStage 23 (collect at SearchModeTestCase.scala:54) finished in 0.027 s
18/05/14 20:51:20 INFO DAGScheduler: Job 17 finished: collect at SearchModeTestCase.scala:54, took 0.034065 s
18/05/14 20:51:20 INFO CarbonLateDecodeRule: ScalaTest-run-running-SearchModeTestCase skip CarbonOptimizer
18/05/14 20:51:20 INFO CarbonLateDecodeRule: ScalaTest-run-running-SearchModeTestCase Skip CarbonOptimizer
18/05/14 20:51:20 INFO DistributionUtil$: ScalaTest-run-running-SearchModeTestCase Executors configured : 1
18/05/14 20:51:20 INFO DistributionUtil$: ScalaTest-run-running-SearchModeTestCase Total Time taken to ensure the required executors : 0
18/05/14 20:51:20 INFO DistributionUtil$: ScalaTest-run-running-SearchModeTestCase Time elapsed to allocate the required executors: 0
18/05/14 20:51:20 INFO CarbonScanRDD:
 Identified no.of.blocks: 1,
 no.of.tasks: 1,
 no.of.nodes: 1,
 parallelism: 0
       
18/05/14 20:51:20 INFO SparkContext: Starting job: checkAnswer at SearchModeTestCase.scala:58
18/05/14 20:51:20 INFO DAGScheduler: Got job 18 (checkAnswer at SearchModeTestCase.scala:58) with 1 output partitions
18/05/14 20:51:20 INFO DAGScheduler: Final stage: ResultStage 24 (checkAnswer at SearchModeTestCase.scala:58)
18/05/14 20:51:20 INFO DAGScheduler: Parents of final stage: List()
18/05/14 20:51:20 INFO DAGScheduler: Missing parents: List()
18/05/14 20:51:20 INFO DAGScheduler: Submitting ResultStage 24 (MapPartitionsRDD[83] at checkAnswer at SearchModeTestCase.scala:58), which has no missing parents
18/05/14 20:51:20 INFO MemoryStore: Block broadcast_24 stored as values in memory (estimated size 32.1 KB, free 2003.9 MB)
18/05/14 20:51:20 INFO MemoryStore: Block broadcast_24_piece0 stored as bytes in memory (estimated size 26.5 KB, free 2003.9 MB)
18/05/14 20:51:20 INFO BlockManagerInfo: Added broadcast_24_piece0 in memory on 192.168.44.90:63398 (size: 26.5 KB, free: 2004.3 MB)
18/05/14 20:51:20 INFO SparkContext: Created broadcast 24 from broadcast at DAGScheduler.scala:1006
18/05/14 20:51:20 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 24 (MapPartitionsRDD[83] at checkAnswer at SearchModeTestCase.scala:58) (first 15 tasks are for partitions Vector(0))
18/05/14 20:51:20 INFO TaskSchedulerImpl: Adding task set 24.0 with 1 tasks
18/05/14 20:51:20 INFO TaskSetManager: Starting task 0.0 in stage 24.0 (TID 823, localhost, executor driver, partition 0, PROCESS_LOCAL, 5955 bytes)
18/05/14 20:51:20 INFO Executor: Running task 0.0 in stage 24.0 (TID 823)
18/05/14 20:51:20 INFO AbstractQueryExecutor: [Executor task launch worker for task 823][partitionID:main;queryID:65652512653847] Query will be executed on table: main
18/05/14 20:51:20 INFO ResultCollectorFactory: pool-36-thread-6 Row based dictionary collector is used to scan and collect the data
18/05/14 20:51:20 INFO UnsafeMemoryManager: [Executor task launch worker for task 823][partitionID:main;queryID:65652512653847] Total memory used after task 65652542497559 is 3687 Current tasks running now are : [65647314993698, 65639014191902, 65647481932757, 65644793091292, 65639048545043, 65641096541779]
18/05/14 20:51:20 INFO Executor: Finished task 0.0 in stage 24.0 (TID 823). 1162 bytes result sent to driver
18/05/14 20:51:20 INFO TaskSetManager: Finished task 0.0 in stage 24.0 (TID 823) in 29 ms on localhost (executor driver) (1/1)
18/05/14 20:51:20 INFO TaskSchedulerImpl: Removed TaskSet 24.0, whose tasks have all completed, from pool
18/05/14 20:51:20 INFO DAGScheduler: ResultStage 24 (checkAnswer at SearchModeTestCase.scala:58) finished in 0.030 s
18/05/14 20:51:20 INFO DAGScheduler: Job 18 finished: checkAnswer at SearchModeTestCase.scala:58, took 0.033859 s
18/05/14 20:51:20 INFO CarbonSparkSqlParser: Parsing command: set carbon.search.enabled = false
18/05/14 20:51:20 INFO CarbonLateDecodeRule: ScalaTest-run-running-SearchModeTestCase skip CarbonOptimizer
18/05/14 20:51:20 INFO CarbonLateDecodeRule: ScalaTest-run-running-SearchModeTestCase Skip CarbonOptimizer
18/05/14 20:51:20 AUDIT CacheProvider: [localhost][xubo][Thread-1]The key carbon.search.enabled with value false added in the session param
18/05/14 20:51:20 INFO SparkCarbonStore: ScalaTest-run-running-SearchModeTestCase Shutting down all workers...
18/05/14 20:51:20 INFO SearchRequestHandler: [dispatcher-event-loop-0][partitionID:main;queryID:65643146221485] Shutting down worker...
18/05/14 20:51:20 INFO SearchRequestHandler: [dispatcher-event-loop-0][partitionID:main;queryID:65643146221485] Worker shutted down
18/05/14 20:51:20 INFO SparkCarbonStore: ScalaTest-run-running-SearchModeTestCase All workers are shutted down
18/05/14 20:51:20 INFO SparkCarbonStore: ScalaTest-run-running-SearchModeTestCase Stopping master...
18/05/14 20:51:20 INFO Registry: dispatcher-event-loop-1 Registry Endpoint stopped
18/05/14 20:51:20 INFO SparkCarbonStore: ScalaTest-run-running-SearchModeTestCase Master stopped
18/05/14 20:51:20 INFO SearchModeTestCase: ScalaTest-run-running-SearchModeTestCase

===== FINISHED org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase: 'set search mode' =====

248.566341ms
{code}
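The InvalidClassException above is a Java-serialization pitfall rather than a logic error: a Scala mutable set wrapped with asJava travels as scala.collection.convert.Wrappers$MutableSetWrapper, which serializes fine on the sender but cannot be deserialized on the receiver, because its first non-serializable superclass has no no-arg constructor (behavior as seen here with Scala 2.11 and Spark's JavaSerializer). Below is a self-contained sketch of the failure and one possible workaround; it is not CarbonData code, and the set contents are made up.

{code:scala}
import java.io._

import scala.collection.JavaConverters._
import scala.collection.mutable

object MutableSetWrapperDemo {
  // Serialize and deserialize through in-memory streams, the same mechanism
  // Spark's JavaSerializer uses for RPC messages.
  private def roundTrip(obj: AnyRef): AnyRef = {
    val buf = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(buf)
    out.writeObject(obj) // writing succeeds: the wrapper is Serializable
    out.close()
    new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray)).readObject()
  }

  def main(args: Array[String]): Unit = {
    val segments = mutable.Set("segment_0")

    // asJava yields Wrappers$MutableSetWrapper; readObject throws
    // java.io.InvalidClassException: ... no valid constructor.
    try roundTrip(segments.asJava)
    catch { case e: Exception => println(s"wrapper fails: $e") }

    // Copying into a plain java.util.HashSet before putting the set into an
    // RPC message avoids the wrapper class entirely (one possible workaround,
    // assumed here rather than taken from the eventual fix).
    println("copy works: " + roundTrip(new java.util.HashSet[String](segments.asJava)))
  }
}
{code}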


> Search mode RuntimeException: Error while resolving filter expression
> ---------------------------------------------------------------------
>
>                 Key: CARBONDATA-2480
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-2480
>             Project: CarbonData
>          Issue Type: Bug
>            Reporter: xubo245
>            Assignee: xubo245
>            Priority: Major
>
> There are some errors when running org.apache.carbondata.examples.SearchModeExample or org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase. The first error:
> {code:java}
> java.lang.RuntimeException: Error while resolving filter expression
> at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:924)
> at org.apache.spark.rpc.Master.chooseFGDataMap(Master.scala:261)
> at org.apache.spark.rpc.Master.search(Master.scala:218)
> at org.apache.carbondata.store.SparkCarbonStore.search(SparkCarbonStore.scala:144)
> at org.apache.spark.sql.CarbonSession.runSearch(CarbonSession.scala:225)
> at org.apache.spark.sql.CarbonSession.org$apache$spark$sql$CarbonSession$$trySearchMode(CarbonSession.scala:180)
> at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:100)
> at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:97)
> at org.apache.spark.sql.CarbonSession.withProfiler(CarbonSession.scala:156)
> at org.apache.spark.sql.CarbonSession.sql(CarbonSession.scala:95)
> at org.apache.carbondata.examples.SearchModeExample$$anonfun$3$$anon$1.run(SearchModeExample.scala:168)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
> at java.util.concurrent.FutureTask.run(FutureTask.java)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.NullPointerException
> at org.apache.carbondata.core.scan.filter.resolver.RowLevelFilterResolverImpl.resolve(RowLevelFilterResolverImpl.java:74)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:252)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:251)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolvertree(FilterExpressionProcessor.java:232)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolver(FilterExpressionProcessor.java:84)
> at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:921)
> ... 16 more
> 18/05/15 11:27:00 ERROR CarbonSession: Exception when executing search mode: Error while resolving filter expression, fallback to SparkSQL
> search mode asynchronous query
> java.lang.RuntimeException: Error while resolving filter expression
> at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:924)
> at org.apache.spark.rpc.Master.chooseFGDataMap(Master.scala:261)
> at org.apache.spark.rpc.Master.search(Master.scala:218)
> at org.apache.carbondata.store.SparkCarbonStore.search(SparkCarbonStore.scala:144)
> at org.apache.spark.sql.CarbonSession.runSearch(CarbonSession.scala:225)
> at org.apache.spark.sql.CarbonSession.org$apache$spark$sql$CarbonSession$$trySearchMode(CarbonSession.scala:180)
> at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:100)
> at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:97)
> at org.apache.spark.sql.CarbonSession.withProfiler(CarbonSession.scala:156)
> at org.apache.spark.sql.CarbonSession.sql(CarbonSession.scala:95)
> at org.apache.carbondata.examples.SearchModeExample$$anonfun$3$$anon$1.run(SearchModeExample.scala:168)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
> at java.util.concurrent.FutureTask.run(FutureTask.java)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.NullPointerException
> at org.apache.carbondata.core.scan.filter.resolver.RowLevelFilterResolverImpl.resolve(RowLevelFilterResolverImpl.java:74)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:252)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:251)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolvertree(FilterExpressionProcessor.java:232)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolver(FilterExpressionProcessor.java:84)
> at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:921)
> ... 16 more
> 18/05/15 11:27:03 ERROR CarbonSession: Exception when executing search mode: Error while resolving filter expression, fallback to SparkSQL
> java.lang.RuntimeException: Error while resolving filter expression
> at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:924)
> at org.apache.spark.rpc.Master.chooseFGDataMap(Master.scala:261)
> at org.apache.spark.rpc.Master.search(Master.scala:218)
> at org.apache.carbondata.store.SparkCarbonStore.search(SparkCarbonStore.scala:144)
> at org.apache.spark.sql.CarbonSession.runSearch(CarbonSession.scala:225)
> at org.apache.spark.sql.CarbonSession.org$apache$spark$sql$CarbonSession$$trySearchMode(CarbonSession.scala:180)
> at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:100)
> at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:97)
> at org.apache.spark.sql.CarbonSession.withProfiler(CarbonSession.scala:156)
> at org.apache.spark.sql.CarbonSession.sql(CarbonSession.scala:95)
> at org.apache.carbondata.examples.SearchModeExample$$anonfun$3$$anon$1.run(SearchModeExample.scala:168)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
> at java.util.concurrent.FutureTask.run(FutureTask.java)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.NullPointerException
> at org.apache.carbondata.core.scan.filter.resolver.RowLevelFilterResolverImpl.resolve(RowLevelFilterResolverImpl.java:74)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:252)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:251)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolvertree(FilterExpressionProcessor.java:232)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolver(FilterExpressionProcessor.java:84)
> at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:921)
> ... 16 more
> 18/05/15 11:27:04 ERROR CarbonSession: Exception when executing search mode: Error while resolving filter expression, fallback to SparkSQL
> java.lang.RuntimeException: Error while resolving filter expression
> at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:924)
> at org.apache.spark.rpc.Master.chooseFGDataMap(Master.scala:261)
> at org.apache.spark.rpc.Master.search(Master.scala:218)
> at org.apache.carbondata.store.SparkCarbonStore.search(SparkCarbonStore.scala:144)
> at org.apache.spark.sql.CarbonSession.runSearch(CarbonSession.scala:225)
> at org.apache.spark.sql.CarbonSession.org$apache$spark$sql$CarbonSession$$trySearchMode(CarbonSession.scala:180)
> at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:100)
> at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:97)
> at org.apache.spark.sql.CarbonSession.withProfiler(CarbonSession.scala:156)
> at org.apache.spark.sql.CarbonSession.sql(CarbonSession.scala:95)
> at org.apache.carbondata.examples.SearchModeExample$$anonfun$3$$anon$1.run(SearchModeExample.scala:168)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
> at java.util.concurrent.FutureTask.run(FutureTask.java)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.NullPointerException
> at org.apache.carbondata.core.scan.filter.resolver.RowLevelFilterResolverImpl.resolve(RowLevelFilterResolverImpl.java:74)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:252)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:251)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolvertree(FilterExpressionProcessor.java:232)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolver(FilterExpressionProcessor.java:84)
> at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:921)
> ... 16 more
> 18/05/15 11:27:04 ERROR CarbonSession: Exception when executing search mode: Error while resolving filter expression, fallback to SparkSQL
> java.lang.RuntimeException: Error while resolving filter expression
> at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:924)
> at org.apache.spark.rpc.Master.chooseFGDataMap(Master.scala:261)
> at org.apache.spark.rpc.Master.search(Master.scala:218)
> at org.apache.carbondata.store.SparkCarbonStore.search(SparkCarbonStore.scala:144)
> at org.apache.spark.sql.CarbonSession.runSearch(CarbonSession.scala:225)
> at org.apache.spark.sql.CarbonSession.org$apache$spark$sql$CarbonSession$$trySearchMode(CarbonSession.scala:180)
> at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:100)
> at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:97)
> at org.apache.spark.sql.CarbonSession.withProfiler(CarbonSession.scala:156)
> at org.apache.spark.sql.CarbonSession.sql(CarbonSession.scala:95)
> at org.apache.carbondata.examples.SearchModeExample$$anonfun$3$$anon$1.run(SearchModeExample.scala:168)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
> at java.util.concurrent.FutureTask.run(FutureTask.java)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.NullPointerException
> at org.apache.carbondata.core.scan.filter.resolver.RowLevelFilterResolverImpl.resolve(RowLevelFilterResolverImpl.java:74)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:252)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.traverseAndResolveTree(FilterExpressionProcessor.java:251)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolvertree(FilterExpressionProcessor.java:232)
> at org.apache.carbondata.core.scan.filter.FilterExpressionProcessor.getFilterResolver(FilterExpressionProcessor.java:84)
> at org.apache.carbondata.core.metadata.schema.table.CarbonTable.resolveFilter(CarbonTable.java:921)
> ... 16 more
> {code}
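> A minimal sketch of a driver that can reproduce the first error, assuming a local CarbonData build with search mode available and a table that has already been created and loaded; the store path is illustrative, and the table name and filter query mirror the ones in the test log below:
> {code:scala}
> import org.apache.spark.sql.SparkSession
> import org.apache.spark.sql.CarbonSession._
>
> object SearchModeRepro {
>   def main(args: Array[String]): Unit = {
>     // Illustrative store path; assumes a table 'main' with an 'id'
>     // column was created and loaded beforehand.
>     val spark = SparkSession
>       .builder()
>       .master("local")
>       .appName("SearchModeRepro")
>       .getOrCreateCarbonSession("/tmp/carbon.store")
>
>     // Enable search mode, then run a filter query: resolving the filter
>     // on the search Master raises the NullPointerException shown above.
>     spark.sql("set carbon.search.enabled = true")
>     spark.sql("select id from main where id = '3' limit 10").show()
>
>     spark.stop()
>   }
> }
> {code}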
> The second error, an RPC deserialization failure:
> {code:java}
> 18/05/14 20:51:20 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() on RPC id 6929993627477870468
> java.io.InvalidClassException: scala.collection.convert.Wrappers$MutableSetWrapper; no valid constructor
> at java.io.ObjectStreamClass$ExceptionInfo.newInvalidClassException(ObjectStreamClass.java:157)
> at java.io.ObjectStreamClass.checkDeserialize(ObjectStreamClass.java:862)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2041)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
> at java.util.ArrayList.readObject(ArrayList.java:797)
> at sun.reflect.GeneratedMethodAccessor53.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1158)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2176)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
> at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
> at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
> at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:270)
> at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
> at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:319)
> at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:269)
> at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
> at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:268)
> at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:603)
> at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:654)
> at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:639)
> at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:157)
> at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:105)
> at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
> at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
> at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
> at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
> at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
> at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
> at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
> at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:652)
> at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:575)
> at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:489)
> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:451)
> at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
> at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
> at java.lang.Thread.run(Thread.java:748)
> org.apache.spark.SparkException: Exception thrown in awaitResult:
> at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
> at org.apache.spark.rpc.Master$$anonfun$search$1.apply(Master.scala:236)
> at org.apache.spark.rpc.Master$$anonfun$search$1.apply(Master.scala:231)
> at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
> at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
> at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
> at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
> at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
> at org.apache.spark.rpc.Master.search(Master.scala:231)
> at org.apache.carbondata.store.SparkCarbonStore.search(SparkCarbonStore.scala:144)
> at org.apache.spark.sql.CarbonSession.runSearch(CarbonSession.scala:225)
> at org.apache.spark.sql.CarbonSession.org$apache$spark$sql$CarbonSession$$trySearchMode(CarbonSession.scala:186)
> at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:100)
> at org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:97)
> at org.apache.spark.sql.CarbonSession.withProfiler(CarbonSession.scala:156)
> at org.apache.spark.sql.CarbonSession.sql(CarbonSession.scala:95)
> at org.apache.spark.sql.test.Spark2TestQueryExecutor.sql(Spark2TestQueryExecutor.scala:35)
> at org.apache.spark.sql.test.util.QueryTest.sql(QueryTest.scala:113)
> at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase.org$apache$carbondata$spark$testsuite$detailquery$SearchModeTestCase$$checkSearchAnswer(SearchModeTestCase.scala:58)
> at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase$$anonfun$10$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SearchModeTestCase.scala:108)
> at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase$$anonfun$10$$anonfun$apply$mcV$sp$1.apply(SearchModeTestCase.scala:105)
> at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase$$anonfun$10$$anonfun$apply$mcV$sp$1.apply(SearchModeTestCase.scala:105)
> at org.apache.spark.sql.catalyst.util.package$.benchmark(package.scala:129)
> at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase$$anonfun$10.apply$mcV$sp(SearchModeTestCase.scala:105)
> at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase$$anonfun$10.apply(SearchModeTestCase.scala:103)
> at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase$$anonfun$10.apply(SearchModeTestCase.scala:103)
> at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
> at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> at org.scalatest.Transformer.apply(Transformer.scala:22)
> at org.scalatest.Transformer.apply(Transformer.scala:20)
> at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:166)
> at org.apache.spark.sql.test.util.CarbonFunSuite.withFixture(CarbonFunSuite.scala:41)
> at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:163)
> at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
> at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
> at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
> at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:175)
> at org.scalatest.FunSuite.runTest(FunSuite.scala:1555)
> at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
> at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
> at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
> at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
> at scala.collection.immutable.List.foreach(List.scala:381)
> at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
> at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
> at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
> at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:208)
> at org.scalatest.FunSuite.runTests(FunSuite.scala:1555)
> at org.scalatest.Suite$class.run(Suite.scala:1424)
> at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1555)
> at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
> at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
> at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
> at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:212)
> at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase.org$scalatest$BeforeAndAfterAll$$super$run(SearchModeTestCase.scala:31)
> at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:257)
> at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:256)
> at org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase.run(SearchModeTestCase.scala:31)
> at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
> at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2563)
> at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2557)
> at scala.collection.immutable.List.foreach(List.scala:381)
> at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:2557)
> at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1044)
> at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1043)
> at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:2722)
> at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1043)
> at org.scalatest.tools.Runner$.run(Runner.scala:883)
> at org.scalatest.tools.Runner.run(Runner.scala)
> at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2(ScalaTestRunner.java:131)
> at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:28)
> Caused by: java.lang.RuntimeException: java.io.InvalidClassException: scala.collection.convert.Wrappers$MutableSetWrapper; no valid constructor
> at java.io.ObjectStreamClass$ExceptionInfo.newInvalidClassException(ObjectStreamClass.java:157)
> at java.io.ObjectStreamClass.checkDeserialize(ObjectStreamClass.java:862)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2041)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
> at java.util.ArrayList.readObject(ArrayList.java:797)
> at sun.reflect.GeneratedMethodAccessor53.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1158)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2176)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
> at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
> at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
> at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:270)
> at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
> at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:319)
> at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:269)
> at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
> at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:268)
> at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:603)
> at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:654)
> at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:639)
> at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:157)
> at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:105)
> at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
> at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
> at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
> at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
> at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
> at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
> at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
> at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:652)
> at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:575)
> at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:489)
> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:451)
> at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
> at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
> at java.lang.Thread.run(Thread.java:748)
> at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:207)
> at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:120)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
> at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
> at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
> at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
> at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
> at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
> at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
> at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:652)
> at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:575)
> at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:489)
> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:451)
> at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
> at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
> at java.lang.Thread.run(Thread.java:748)
> 18/05/14 20:51:20 ERROR CarbonSession: Exception when executing search mode: Exception thrown in awaitResult: , fallback to SparkSQL
> 18/05/14 20:51:20 INFO CarbonSparkSqlParser: Parsing command: select id from main where id = '3' limit 10
> 18/05/14 20:51:20 INFO HiveMetaStore: 0: get_table : db=default tbl=main
> 18/05/14 20:51:20 INFO audit: ugi=xubo ip=unknown-ip-addr cmd=get_table : db=default tbl=main
> 18/05/14 20:51:20 INFO CatalystSqlParser: Parsing command: array<string>
> 18/05/14 20:51:20 INFO CarbonLateDecodeRule: ScalaTest-run-running-SearchModeTestCase skip CarbonOptimizer
> 18/05/14 20:51:20 INFO CarbonLateDecodeRule: ScalaTest-run-running-SearchModeTestCase Skip CarbonOptimizer
> 18/05/14 20:51:20 INFO CodeGenerator: Code generated in 4.758886 ms
> 18/05/14 20:51:20 INFO DistributionUtil$: ScalaTest-run-running-SearchModeTestCase Executors configured : 1
> 18/05/14 20:51:20 INFO DistributionUtil$: ScalaTest-run-running-SearchModeTestCase Total Time taken to ensure the required executors : 1
> 18/05/14 20:51:20 INFO DistributionUtil$: ScalaTest-run-running-SearchModeTestCase Time elapsed to allocate the required executors: 0
> 18/05/14 20:51:20 INFO CarbonScanRDD:
>  Identified no.of.blocks: 1,
>  no.of.tasks: 1,
>  no.of.nodes: 1,
>  parallelism: 0
>        
> 18/05/14 20:51:20 INFO SparkContext: Starting job: collect at SearchModeTestCase.scala:54
> 18/05/14 20:51:20 INFO DAGScheduler: Got job 17 (collect at SearchModeTestCase.scala:54) with 1 output partitions
> 18/05/14 20:51:20 INFO DAGScheduler: Final stage: ResultStage 23 (collect at SearchModeTestCase.scala:54)
> 18/05/14 20:51:20 INFO DAGScheduler: Parents of final stage: List()
> 18/05/14 20:51:20 INFO DAGScheduler: Missing parents: List()
> 18/05/14 20:51:20 INFO DAGScheduler: Submitting ResultStage 23 (MapPartitionsRDD[80] at collect at SearchModeTestCase.scala:54), which has no missing parents
> 18/05/14 20:51:20 INFO MemoryStore: Block broadcast_23 stored as values in memory (estimated size 32.1 KB, free 2004.0 MB)
> 18/05/14 20:51:20 INFO MemoryStore: Block broadcast_23_piece0 stored as bytes in memory (estimated size 26.5 KB, free 2004.0 MB)
> 18/05/14 20:51:20 INFO BlockManagerInfo: Added broadcast_23_piece0 in memory on 192.168.44.90:63398 (size: 26.5 KB, free: 2004.3 MB)
> 18/05/14 20:51:20 INFO SparkContext: Created broadcast 23 from broadcast at DAGScheduler.scala:1006
> 18/05/14 20:51:20 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 23 (MapPartitionsRDD[80] at collect at SearchModeTestCase.scala:54) (first 15 tasks are for partitions Vector(0))
> 18/05/14 20:51:20 INFO TaskSchedulerImpl: Adding task set 23.0 with 1 tasks
> 18/05/14 20:51:20 INFO TaskSetManager: Starting task 0.0 in stage 23.0 (TID 822, localhost, executor driver, partition 0, PROCESS_LOCAL, 5955 bytes)
> 18/05/14 20:51:20 INFO Executor: Running task 0.0 in stage 23.0 (TID 822)
> 18/05/14 20:51:20 INFO AbstractQueryExecutor: [Executor task launch worker for task 822][partitionID:main;queryID:65652430146138] Query will be executed on table: main
> 18/05/14 20:51:20 INFO ResultCollectorFactory: pool-36-thread-7 Row based dictionary collector is used to scan and collect the data
> 18/05/14 20:51:20 INFO UnsafeMemoryManager: [Executor task launch worker for task 822][partitionID:main;queryID:65652430146138] Total memory used after task 65652473133085 is 3687 Current tasks running now are : [65647314993698, 65639014191902, 65647481932757, 65644793091292, 65639048545043, 65641096541779]
> 18/05/14 20:51:20 INFO Executor: Finished task 0.0 in stage 23.0 (TID 822). 1162 bytes result sent to driver
> 18/05/14 20:51:20 INFO TaskSetManager: Finished task 0.0 in stage 23.0 (TID 822) in 27 ms on localhost (executor driver) (1/1)
> 18/05/14 20:51:20 INFO TaskSchedulerImpl: Removed TaskSet 23.0, whose tasks have all completed, from pool
> 18/05/14 20:51:20 INFO DAGScheduler: ResultStage 23 (collect at SearchModeTestCase.scala:54) finished in 0.027 s
> 18/05/14 20:51:20 INFO DAGScheduler: Job 17 finished: collect at SearchModeTestCase.scala:54, took 0.034065 s
> 18/05/14 20:51:20 INFO CarbonLateDecodeRule: ScalaTest-run-running-SearchModeTestCase skip CarbonOptimizer
> 18/05/14 20:51:20 INFO CarbonLateDecodeRule: ScalaTest-run-running-SearchModeTestCase Skip CarbonOptimizer
> 18/05/14 20:51:20 INFO DistributionUtil$: ScalaTest-run-running-SearchModeTestCase Executors configured : 1
> 18/05/14 20:51:20 INFO DistributionUtil$: ScalaTest-run-running-SearchModeTestCase Total Time taken to ensure the required executors : 0
> 18/05/14 20:51:20 INFO DistributionUtil$: ScalaTest-run-running-SearchModeTestCase Time elapsed to allocate the required executors: 0
> 18/05/14 20:51:20 INFO CarbonScanRDD:
>  Identified no.of.blocks: 1,
>  no.of.tasks: 1,
>  no.of.nodes: 1,
>  parallelism: 0
>        
> 18/05/14 20:51:20 INFO SparkContext: Starting job: checkAnswer at SearchModeTestCase.scala:58
> 18/05/14 20:51:20 INFO DAGScheduler: Got job 18 (checkAnswer at SearchModeTestCase.scala:58) with 1 output partitions
> 18/05/14 20:51:20 INFO DAGScheduler: Final stage: ResultStage 24 (checkAnswer at SearchModeTestCase.scala:58)
> 18/05/14 20:51:20 INFO DAGScheduler: Parents of final stage: List()
> 18/05/14 20:51:20 INFO DAGScheduler: Missing parents: List()
> 18/05/14 20:51:20 INFO DAGScheduler: Submitting ResultStage 24 (MapPartitionsRDD[83] at checkAnswer at SearchModeTestCase.scala:58), which has no missing parents
> 18/05/14 20:51:20 INFO MemoryStore: Block broadcast_24 stored as values in memory (estimated size 32.1 KB, free 2003.9 MB)
> 18/05/14 20:51:20 INFO MemoryStore: Block broadcast_24_piece0 stored as bytes in memory (estimated size 26.5 KB, free 2003.9 MB)
> 18/05/14 20:51:20 INFO BlockManagerInfo: Added broadcast_24_piece0 in memory on 192.168.44.90:63398 (size: 26.5 KB, free: 2004.3 MB)
> 18/05/14 20:51:20 INFO SparkContext: Created broadcast 24 from broadcast at DAGScheduler.scala:1006
> 18/05/14 20:51:20 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 24 (MapPartitionsRDD[83] at checkAnswer at SearchModeTestCase.scala:58) (first 15 tasks are for partitions Vector(0))
> 18/05/14 20:51:20 INFO TaskSchedulerImpl: Adding task set 24.0 with 1 tasks
> 18/05/14 20:51:20 INFO TaskSetManager: Starting task 0.0 in stage 24.0 (TID 823, localhost, executor driver, partition 0, PROCESS_LOCAL, 5955 bytes)
> 18/05/14 20:51:20 INFO Executor: Running task 0.0 in stage 24.0 (TID 823)
> 18/05/14 20:51:20 INFO AbstractQueryExecutor: [Executor task launch worker for task 823][partitionID:main;queryID:65652512653847] Query will be executed on table: main
> 18/05/14 20:51:20 INFO ResultCollectorFactory: pool-36-thread-6 Row based dictionary collector is used to scan and collect the data
> 18/05/14 20:51:20 INFO UnsafeMemoryManager: [Executor task launch worker for task 823][partitionID:main;queryID:65652512653847] Total memory used after task 65652542497559 is 3687 Current tasks running now are : [65647314993698, 65639014191902, 65647481932757, 65644793091292, 65639048545043, 65641096541779]
> 18/05/14 20:51:20 INFO Executor: Finished task 0.0 in stage 24.0 (TID 823). 1162 bytes result sent to driver
> 18/05/14 20:51:20 INFO TaskSetManager: Finished task 0.0 in stage 24.0 (TID 823) in 29 ms on localhost (executor driver) (1/1)
> 18/05/14 20:51:20 INFO TaskSchedulerImpl: Removed TaskSet 24.0, whose tasks have all completed, from pool
> 18/05/14 20:51:20 INFO DAGScheduler: ResultStage 24 (checkAnswer at SearchModeTestCase.scala:58) finished in 0.030 s
> 18/05/14 20:51:20 INFO DAGScheduler: Job 18 finished: checkAnswer at SearchModeTestCase.scala:58, took 0.033859 s
> 18/05/14 20:51:20 INFO CarbonSparkSqlParser: Parsing command: set carbon.search.enabled = false
> 18/05/14 20:51:20 INFO CarbonLateDecodeRule: ScalaTest-run-running-SearchModeTestCase skip CarbonOptimizer
> 18/05/14 20:51:20 INFO CarbonLateDecodeRule: ScalaTest-run-running-SearchModeTestCase Skip CarbonOptimizer
> 18/05/14 20:51:20 AUDIT CacheProvider: [localhost][xubo][Thread-1]The key carbon.search.enabled with value false added in the session param
> 18/05/14 20:51:20 INFO SparkCarbonStore: ScalaTest-run-running-SearchModeTestCase Shutting down all workers...
> 18/05/14 20:51:20 INFO SearchRequestHandler: [dispatcher-event-loop-0][partitionID:main;queryID:65643146221485] Shutting down worker...
> 18/05/14 20:51:20 INFO SearchRequestHandler: [dispatcher-event-loop-0][partitionID:main;queryID:65643146221485] Worker shutted down
> 18/05/14 20:51:20 INFO SparkCarbonStore: ScalaTest-run-running-SearchModeTestCase All workers are shutted down
> 18/05/14 20:51:20 INFO SparkCarbonStore: ScalaTest-run-running-SearchModeTestCase Stopping master...
> 18/05/14 20:51:20 INFO Registry: dispatcher-event-loop-1 Registry Endpoint stopped
> 18/05/14 20:51:20 INFO SparkCarbonStore: ScalaTest-run-running-SearchModeTestCase Master stopped
> 18/05/14 20:51:20 INFO SearchModeTestCase: ScalaTest-run-running-SearchModeTestCase
> ===== FINISHED org.apache.carbondata.spark.testsuite.detailquery.SearchModeTestCase: 'set search mode' =====
> 248.566341ms
> {code}
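> The InvalidClassException above appears to be the classic Java-serialization pitfall with JavaConverters wrappers: under the Scala 2.11 that Spark 2.x builds against, the java.util.Set view returned by asJava (Wrappers$MutableSetWrapper) is itself serializable, but its nearest non-serializable superclass has no no-arg constructor, which is exactly the condition under which ObjectInputStream reports "no valid constructor". A minimal sketch outside Spark (object and value names are illustrative):
> {code:scala}
> import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}
> import scala.collection.JavaConverters._
>
> object MutableSetWrapperDemo {
>   def main(args: Array[String]): Unit = {
>     // asJava wraps the mutable Scala set in Wrappers$MutableSetWrapper.
>     val wrapped: java.util.Set[String] = scala.collection.mutable.Set("a", "b").asJava
>
>     // Writing succeeds: the wrapper class is serializable.
>     val buffer = new ByteArrayOutputStream()
>     val out = new ObjectOutputStream(buffer)
>     out.writeObject(wrapped)
>     out.close()
>
>     // Reading fails with InvalidClassException ("no valid constructor"):
>     // the wrapper's nearest non-serializable superclass cannot be
>     // instantiated during deserialization.
>     val in = new ObjectInputStream(new ByteArrayInputStream(buffer.toByteArray))
>     in.readObject()
>   }
> }
> {code}
> Since the search-mode Master/Worker RPC goes through Spark's JavaSerializer, any such wrapped set inside a request message fails on the receiving side in the same way; converting to a plainly serializable collection before sending the message would avoid it.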



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)