Block B-tree loading failed


ericzgy
Now when I load data into CarbonData table using spark1.6.2 and carbondata1.0.0,The problem details are as follows:  

WARN  28-02 15:15:33,154 - Lost task 15.0 in stage 5.0 (TID 139, halu062): org.apache.carbondata.core.datastore.exception.IndexBuilderException: Block B-tree loading failed
        at org.apache.carbondata.core.datastore.BlockIndexStore.fillLoadedBlocks(BlockIndexStore.java:264)
        at org.apache.carbondata.core.datastore.BlockIndexStore.getAll(BlockIndexStore.java:189)
        at org.apache.carbondata.core.scan.executor.impl.AbstractQueryExecutor.initQuery(AbstractQueryExecutor.java:130)
        at org.apache.carbondata.core.scan.executor.impl.AbstractQueryExecutor.getBlockExecutionInfos(AbstractQueryExecutor.java:219)
        at org.apache.carbondata.core.scan.executor.impl.DetailQueryExecutor.execute(DetailQueryExecutor.java:39)
        at org.apache.carbondata.hadoop.CarbonRecordReader.initialize(CarbonRecordReader.java:79)
        at org.apache.carbondata.spark.rdd.CarbonScanRDD.compute(CarbonScanRDD.scala:192)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
        at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
        at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:89)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.ExecutionException: java.io.IOException: org.apache.thrift.protocol.TProtocolException: don't know what type: 14
        at java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.util.concurrent.FutureTask.get(FutureTask.java:192)
        at org.apache.carbondata.core.datastore.BlockIndexStore.fillLoadedBlocks(BlockIndexStore.java:254)
        ... 35 more
Caused by: java.io.IOException: org.apache.thrift.protocol.TProtocolException: don't know what type: 14
        at org.apache.carbondata.core.reader.ThriftReader.read(ThriftReader.java:108)
        at org.apache.carbondata.core.reader.CarbonFooterReader.readFooter(CarbonFooterReader.java:54)
        at org.apache.carbondata.core.util.DataFileFooterConverter2.readDataFileFooter(DataFileFooterConverter2.java:47)
        at org.apache.carbondata.core.util.CarbonUtil.readMetadatFile(CarbonUtil.java:848)
        at org.apache.carbondata.core.datastore.AbstractBlockIndexStoreCache.checkAndLoadTableBlocks(AbstractBlockIndexStoreCache.java:98)
        at org.apache.carbondata.core.datastore.BlockIndexStore.loadBlock(BlockIndexStore.java:304)
        at org.apache.carbondata.core.datastore.BlockIndexStore.get(BlockIndexStore.java:109)
        at org.apache.carbondata.core.datastore.BlockIndexStore$BlockLoaderThread.call(BlockIndexStore.java:294)
        at org.apache.carbondata.core.datastore.BlockIndexStore$BlockLoaderThread.call(BlockIndexStore.java:284)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        ... 3 more
Caused by: org.apache.thrift.protocol.TProtocolException: don't know what type: 14
        at org.apache.thrift.protocol.TCompactProtocol.getTType(TCompactProtocol.java:896)
        at org.apache.thrift.protocol.TCompactProtocol.readListBegin(TCompactProtocol.java:596)
        at org.apache.thrift.protocol.TProtocolUtil.skip(TProtocolUtil.java:136)
        at org.apache.thrift.protocol.TProtocolUtil.skip(TProtocolUtil.java:60)
        at org.apache.carbondata.format.FileFooter$FileFooterStandardScheme.read(FileFooter.java:1113)
        at org.apache.carbondata.format.FileFooter$FileFooterStandardScheme.read(FileFooter.java:990)
        at org.apache.carbondata.format.FileFooter.read(FileFooter.java:867)
        at org.apache.carbondata.core.reader.ThriftReader.read(ThriftReader.java:106)
        ... 12 more

INFO  28-02 15:15:33,156 - Starting task 15.1 in stage 5.0 (TID 142, halu062, partition 15,NODE_LOCAL, 16560 bytes)
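
For reference, this is roughly how the table is created, loaded, and queried (a minimal sketch of my flow with CarbonContext on Spark 1.6; the table name, schema, store path, and CSV path here are placeholders, not my exact ones):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.CarbonContext

val sc = new SparkContext(new SparkConf().setAppName("carbon-load"))
// store path: the HDFS location where CarbonData writes its segments and block footers
val cc = new CarbonContext(sc, "hdfs://nn:9000/carbon/store")

cc.sql("CREATE TABLE IF NOT EXISTS demo_table (id INT, name STRING, amount DOUBLE) STORED BY 'carbondata'")
cc.sql("LOAD DATA INPATH 'hdfs://nn:9000/tmp/demo.csv' INTO TABLE demo_table")
// the IndexBuilderException above is thrown while this scan reads the block footers
cc.sql("SELECT count(*) FROM demo_table").show()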

Re: Block B-tree loading failed

ravipesala
Hi,

Did you load the data freshly and then try to execute the query, or are you
querying an old store that you had already loaded before?
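
If it is an old store, a quick way to rule that out is to reload the same data into a fresh table and run the query against it, for example (just a sketch; cc is your CarbonContext, and the table name and CSV path are placeholders):

cc.sql("DROP TABLE IF EXISTS demo_table_fresh")
cc.sql("CREATE TABLE demo_table_fresh (id INT, name STRING, amount DOUBLE) STORED BY 'carbondata'")
cc.sql("LOAD DATA INPATH 'hdfs://nn:9000/tmp/demo.csv' INTO TABLE demo_table_fresh")
// if this succeeds, the footer read error most likely comes from segments written earlier
cc.sql("SELECT count(*) FROM demo_table_fresh").show()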

Regards,
Ravindra.



