[ https://issues.apache.org/jira/browse/CARBONDATA-783?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravindra Pesala reassigned CARBONDATA-783:
------------------------------------------

    Assignee: Ravindra Pesala

Loading data with Single Pass 'true' option is throwing an exception
---------------------------------------------------------------------

                 Key: CARBONDATA-783
                 URL: https://issues.apache.org/jira/browse/CARBONDATA-783
             Project: CarbonData
          Issue Type: Bug
          Components: data-query
    Affects Versions: 1.1.0-incubating
         Environment: spark 2.1
            Reporter: Geetika Gupta
            Assignee: Ravindra Pesala
            Priority: Trivial
         Attachments: 7000_UniqData.csv

I tried to create a table using the following query:

CREATE TABLE uniq_include_dictionary (
  CUST_ID int, CUST_NAME String, ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp,
  BIGINT_COLUMN1 bigint, BIGINT_COLUMN2 bigint, DECIMAL_COLUMN1 decimal(30,10),
  DECIMAL_COLUMN2 decimal(36,10), Double_COLUMN1 double, Double_COLUMN2 double,
  INTEGER_COLUMN1 int)
STORED BY 'org.apache.carbondata.format'
TBLPROPERTIES('DICTIONARY_INCLUDE'='CUST_ID,Double_COLUMN2,DECIMAL_COLUMN2');

Table creation was successful, but when I tried to load data into the table, the load failed with the following error:

ERROR 16-03 13:41:32,354 - nioEventLoopGroup-8-2
java.lang.IndexOutOfBoundsException: readerIndex(64) + length(25) exceeds writerIndex(80): UnpooledUnsafeDirectByteBuf(ridx: 64, widx: 80, cap: 80)
	at io.netty.buffer.AbstractByteBuf.checkReadableBytes0(AbstractByteBuf.java:1161)
	at io.netty.buffer.AbstractByteBuf.checkReadableBytes(AbstractByteBuf.java:1155)
	at io.netty.buffer.AbstractByteBuf.readBytes(AbstractByteBuf.java:694)
	at io.netty.buffer.AbstractByteBuf.readBytes(AbstractByteBuf.java:702)
	at org.apache.carbondata.core.dictionary.generator.key.DictionaryMessage.readData(DictionaryMessage.java:70)
	at org.apache.carbondata.core.dictionary.server.DictionaryServerHandler.channelRead(DictionaryServerHandler.java:59)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:652)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:575)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:489)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:451)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
	at java.lang.Thread.run(Thread.java:745)
ERROR 16-03 13:41:32,355 - nioEventLoopGroup-8-2 exceptionCaught
java.lang.IndexOutOfBoundsException: readerIndex(64) + length(25) exceeds writerIndex(80): UnpooledUnsafeDirectByteBuf(ridx: 64, widx: 80, cap: 80)
WARN 16-03 13:43:17,223 - Lost task 0.0 in stage 0.0 (TID 0, 192.168.2.130, executor 0): org.apache.carbondata.processing.newflow.exception.CarbonDataLoadingException:
	at org.apache.carbondata.processing.newflow.sort.impl.ParallelReadMergeSorterImpl.checkError(ParallelReadMergeSorterImpl.java:164)
	at org.apache.carbondata.processing.newflow.sort.impl.ParallelReadMergeSorterImpl.sort(ParallelReadMergeSorterImpl.java:117)
	at org.apache.carbondata.processing.newflow.steps.SortProcessorStepImpl.execute(SortProcessorStepImpl.java:76)
	at org.apache.carbondata.processing.newflow.steps.DataWriterProcessorStepImpl.execute(DataWriterProcessorStepImpl.java:92)
	at org.apache.carbondata.processing.newflow.DataLoadExecutor.execute(DataLoadExecutor.java:48)
	at org.apache.carbondata.spark.rdd.NewCarbonDataLoadRDD$$anon$1.<init>(NewCarbonDataLoadRDD.scala:166)
	at org.apache.carbondata.spark.rdd.NewCarbonDataLoadRDD.compute(NewCarbonDataLoadRDD.scala:142)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:99)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Request timed out for key : DictionaryKey{ columnName='ACTIVE_EMUI_VERSION', data='ACTIVE_EMUI_VERSION_20014', dictionaryValue=-1, type=DICT_GENERATION}
	at org.apache.carbondata.core.dictionary.client.DictionaryClientHandler.getDictionary(DictionaryClientHandler.java:99)
	at org.apache.carbondata.core.dictionary.client.DictionaryClient.getDictionary(DictionaryClient.java:74)
	at org.apache.carbondata.processing.newflow.dictionary.DictionaryServerClientDictionary.getOrGenerateKey(DictionaryServerClientDictionary.java:57)
	at org.apache.carbondata.processing.newflow.dictionary.DictionaryServerClientDictionary.getOrGenerateKey(DictionaryServerClientDictionary.java:32)
	at org.apache.carbondata.processing.newflow.converter.impl.DictionaryFieldConverterImpl.convert(DictionaryFieldConverterImpl.java:101)
	at org.apache.carbondata.processing.newflow.converter.impl.RowConverterImpl.convert(RowConverterImpl.java:150)
	at org.apache.carbondata.processing.newflow.steps.DataConverterProcessorStepImpl.processRowBatch(DataConverterProcessorStepImpl.java:103)
	at org.apache.carbondata.processing.newflow.steps.DataConverterProcessorStepImpl$1.next(DataConverterProcessorStepImpl.java:89)
	at org.apache.carbondata.processing.newflow.steps.DataConverterProcessorStepImpl$1.next(DataConverterProcessorStepImpl.java:78)
	at org.apache.carbondata.processing.newflow.sort.impl.ParallelReadMergeSorterImpl$SortIteratorThread.call(ParallelReadMergeSorterImpl.java:227)
	at org.apache.carbondata.processing.newflow.sort.impl.ParallelReadMergeSorterImpl$SortIteratorThread.call(ParallelReadMergeSorterImpl.java:201)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	... 3 more
Caused by: java.lang.RuntimeException: Request timed out for key : DictionaryKey{ columnName='ACTIVE_EMUI_VERSION', data='ACTIVE_EMUI_VERSION_20014', dictionaryValue=-1, type=DICT_GENERATION}
	at org.apache.carbondata.core.dictionary.client.DictionaryClientHandler.getDictionary(DictionaryClientHandler.java:94)
	... 14 more
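For context on the server-side error above: an IndexOutOfBoundsException of the form "readerIndex(x) + length(n) exceeds writerIndex(y)" is what a Netty ByteBuf throws when a handler tries to read more bytes than are currently readable, which typically happens when a length-prefixed message arrives split across network reads and the decoder does not wait for the complete frame. The following is only a minimal, hypothetical sketch of that failure mode (the two-byte length prefix, the field content and the partial write are assumptions made purely for illustration; this is not the actual DictionaryMessage.readData code):

import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;
import java.nio.charset.StandardCharsets;

public class LengthPrefixedReadSketch {
    public static void main(String[] args) {
        // 25 bytes of payload, matching the length(25) reported in the log message.
        byte[] payload = "ACTIVE_EMUI_VERSION_20014".getBytes(StandardCharsets.UTF_8);

        ByteBuf buf = Unpooled.buffer();
        buf.writeShort(payload.length);   // declared length: 25
        buf.writeBytes(payload, 0, 16);   // ...but only part of the frame has been received so far

        int declared = buf.readShort();   // 25
        if (buf.readableBytes() < declared) {
            // A defensive decoder would stop here and wait for the remaining bytes.
            System.out.println("incomplete frame: need " + declared
                    + " bytes, have only " + buf.readableBytes());
        }
        // Reading anyway fails with the same kind of error as in the log, e.g.
        // java.lang.IndexOutOfBoundsException: readerIndex(2) + length(25) exceeds writerIndex(18)
        ByteBuf data = buf.readBytes(declared);
        System.out.println(data.toString(StandardCharsets.UTF_8));
    }
}

In a Netty pipeline this situation is normally avoided either by checking readableBytes() before decoding or by placing a frame decoder such as LengthFieldBasedFrameDecoder in front of the handler, so that a partially received message is simply decoded once the rest of its bytes arrive. In the load above, Spark then retries the failing task, and the same pair of errors repeats for each attempt: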
INFO 16-03 13:43:17,242 - Starting task 0.1 in stage 0.0 (TID 1, 192.168.2.130, executor 0, partition 0, NODE_LOCAL, 6989 bytes)
AUDIT 16-03 13:43:17,375 - [knoldus-Vostro-2520][hduser][Thread-134]Connected io.netty.channel.DefaultChannelHandlerContext@6904f1c6
AUDIT 16-03 13:43:18,471 - [knoldus-Vostro-2520][hduser][Thread-135]Connected io.netty.channel.DefaultChannelHandlerContext@38d40b06
ERROR 16-03 13:43:19,577 - nioEventLoopGroup-8-4
java.lang.IndexOutOfBoundsException: readerIndex(64) + length(25) exceeds writerIndex(80): UnpooledUnsafeDirectByteBuf(ridx: 64, widx: 80, cap: 80)
ERROR 16-03 13:43:19,578 - nioEventLoopGroup-8-4 exceptionCaught
java.lang.IndexOutOfBoundsException: readerIndex(64) + length(25) exceeds writerIndex(80): UnpooledUnsafeDirectByteBuf(ridx: 64, widx: 80, cap: 80)
WARN 16-03 13:45:04,012 - Lost task 0.1 in stage 0.0 (TID 1, 192.168.2.130, executor 0): org.apache.carbondata.processing.newflow.exception.CarbonDataLoadingException:
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Request timed out for key : DictionaryKey{ columnName='ACTIVE_EMUI_VERSION', data='ACTIVE_EMUI_VERSION_20014', dictionaryValue=-1, type=DICT_GENERATION}
INFO 16-03 13:45:04,019 - Starting task 0.2 in stage 0.0 (TID 2, 192.168.2.130, executor 0, partition 0, NODE_LOCAL, 6989 bytes)
AUDIT 16-03 13:45:04,072 - [knoldus-Vostro-2520][hduser][Thread-136]Connected io.netty.channel.DefaultChannelHandlerContext@68a91dec
AUDIT 16-03 13:45:05,287 - [knoldus-Vostro-2520][hduser][Thread-137]Connected io.netty.channel.DefaultChannelHandlerContext@16aa3381
ERROR 16-03 13:45:06,776 - nioEventLoopGroup-8-6
java.lang.IndexOutOfBoundsException: readerIndex(64) + length(25) exceeds writerIndex(80): UnpooledUnsafeDirectByteBuf(ridx: 64, widx: 80, cap: 80)
ERROR 16-03 13:45:06,777 - nioEventLoopGroup-8-6 exceptionCaught
java.lang.IndexOutOfBoundsException: readerIndex(64) + length(25) exceeds writerIndex(80): UnpooledUnsafeDirectByteBuf(ridx: 64, widx: 80, cap: 80)
WARN 16-03 13:46:51,208 - Lost task 0.2 in stage 0.0 (TID 2, 192.168.2.130, executor 0): org.apache.carbondata.processing.newflow.exception.CarbonDataLoadingException:
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Request timed out for key : DictionaryKey{ columnName='ACTIVE_EMUI_VERSION', data='ACTIVE_EMUI_VERSION_20014', dictionaryValue=-1, type=DICT_GENERATION}
INFO 16-03 13:46:51,211 - Starting task 0.3 in stage 0.0 (TID 3, 192.168.2.130, executor 0, partition 0, NODE_LOCAL, 6989 bytes)
AUDIT 16-03 13:46:51,284 - [knoldus-Vostro-2520][hduser][Thread-138]Connected io.netty.channel.DefaultChannelHandlerContext@a31f326
AUDIT 16-03 13:46:52,740 - [knoldus-Vostro-2520][hduser][Thread-139]Connected io.netty.channel.DefaultChannelHandlerContext@54d6fe9e
ERROR 16-03 13:46:53,851 - nioEventLoopGroup-8-8
java.lang.IndexOutOfBoundsException: readerIndex(64) + length(25) exceeds writerIndex(80): UnpooledUnsafeDirectByteBuf(ridx: 64, widx: 80, cap: 80)
ERROR 16-03 13:46:53,852 - nioEventLoopGroup-8-8 exceptionCaught
java.lang.IndexOutOfBoundsException: readerIndex(64) + length(25) exceeds writerIndex(80): UnpooledUnsafeDirectByteBuf(ridx: 64, widx: 80, cap: 80)
WARN 16-03 13:48:38,277 - Lost task 0.3 in stage 0.0 (TID 3, 192.168.2.130, executor 0): org.apache.carbondata.processing.newflow.exception.CarbonDataLoadingException:
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Request timed out for key : DictionaryKey{ columnName='ACTIVE_EMUI_VERSION', data='ACTIVE_EMUI_VERSION_20014', dictionaryValue=-1, type=DICT_GENERATION}
ERROR 16-03 13:48:38,352 - Task 0 in stage 0.0 failed 4 times; aborting job
INFO 16-03 13:48:38,551 - Removed TaskSet 0.0, whose tasks have all completed, from pool
INFO 16-03 13:48:38,697 - Cancelling stage 0
INFO 16-03 13:48:38,701 - ResultStage 0 (collect at CarbonDataRDDFactory.scala:653) failed in 443.412 s due to Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, 192.168.2.130, executor 0): org.apache.carbondata.processing.newflow.exception.CarbonDataLoadingException:
INFO 16-03 13:48:38,731 - Job 1 failed: collect at CarbonDataRDDFactory.scala:653, took 443.761474 s
INFO 16-03 13:48:38,732 - pool-24-thread-5 DataLoad failure:
ERROR 16-03 13:48:38,733 - pool-24-thread-5
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, 192.168.2.130, executor 0): org.apache.carbondata.processing.newflow.exception.CarbonDataLoadingException:
Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1435)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1423)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1422)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1422)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:802)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1650)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1605)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1594)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:628)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1918)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1931)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1944)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1958)
	at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:935)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
	at org.apache.spark.rdd.RDD.collect(RDD.scala:934)
	at org.apache.carbondata.spark.rdd.CarbonDataRDDFactory$.loadDataFile$1(CarbonDataRDDFactory.scala:653)
	at org.apache.carbondata.spark.rdd.CarbonDataRDDFactory$.loadCarbonData(CarbonDataRDDFactory.scala:807)
	at org.apache.spark.sql.execution.command.LoadTable.run(carbonTableSchema.scala:531)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:87)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:87)
	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:185)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:592)
	at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:699)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:220)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:163)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:160)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1.run(SparkExecuteStatementOperation.scala:173)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.carbondata.processing.newflow.exception.CarbonDataLoadingException:
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Request timed out for key : DictionaryKey{ columnName='ACTIVE_EMUI_VERSION', data='ACTIVE_EMUI_VERSION_20014', dictionaryValue=-1, type=DICT_GENERATION}
Caused by: java.lang.RuntimeException: Request timed out for key : DictionaryKey{ columnName='ACTIVE_EMUI_VERSION', data='ACTIVE_EMUI_VERSION_20014', dictionaryValue=-1, type=DICT_GENERATION}
	at org.apache.carbondata.core.dictionary.client.DictionaryClientHandler.getDictionary(DictionaryClientHandler.java:94)
	... 14 more
INFO 16-03 13:48:38,734 - pool-24-thread-5 ********starting clean up**********
INFO 16-03 13:48:39,103 - pool-24-thread-5 ********clean up done**********
AUDIT 16-03 13:48:39,104 - [knoldus-Vostro-2520][hduser][Thread-122]Data load is failed for default.uniq_include_dictionary
WARN 16-03 13:48:39,104 - pool-24-thread-5 Cannot write load metadata file as data load failed
ERROR 16-03 13:48:41,311 - pool-24-thread-5
java.lang.Exception: DataLoad failure:
	at org.apache.carbondata.spark.rdd.CarbonDataRDDFactory$.loadCarbonData(CarbonDataRDDFactory.scala:950)
	at org.apache.spark.sql.execution.command.LoadTable.run(carbonTableSchema.scala:531)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:87)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:87)
	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:185)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:592)
	at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:699)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:220)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:163)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:160)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1.run(SparkExecuteStatementOperation.scala:173)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
AUDIT 16-03 13:48:41,312 - [knoldus-Vostro-2520][hduser][Thread-122]Dataload failure for default.uniq_include_dictionary. Please check the logs
INFO 16-03 13:48:41,471 - pool-24-thread-5 Successfully deleted the lock file /tmp/default/uniq_include_dictionary/meta.lock
INFO 16-03 13:48:41,471 - pool-24-thread-5 Table MetaData Unlocked Successfully after data load
ERROR 16-03 13:48:41,471 - Error executing query, currentState RUNNING,
java.lang.Exception: DataLoad failure:
ERROR 16-03 13:48:41,472 - Error running hive query:
org.apache.hive.service.cli.HiveSQLException: java.lang.Exception: DataLoad failure:
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:258)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:163)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:160)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1.run(SparkExecuteStatementOperation.scala:173)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

Here is the load command that I used:

LOAD DATA INPATH 'hdfs://localhost:54311/BabuStore/DATA/uniqdata/7000_UniqData.csv' INTO TABLE uniq_include_dictionary
OPTIONS('DELIMITER'=',', 'QUOTECHAR'='"', 'BAD_RECORDS_LOGGER_ENABLE'='TRUE', 'BAD_RECORDS_ACTION'='FORCE',
'FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1',
'SINGLE_PASS'='true');

Please find the input CSV (7000_UniqData.csv) attached.
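The repeated root cause in the log, "Request timed out for key : DictionaryKey{...}", is the executor-side view of the same problem: with single-pass loading the executors request surrogate keys from the dictionary server over the network, and once the server handler has failed while decoding a request, no reply arrives and the client's bounded wait expires. The sketch below only illustrates that request/reply timeout pattern; the response queue, the 5-second wait and the message format are assumptions for illustration, and this is not the actual DictionaryClientHandler implementation:

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class DictionaryRequestTimeoutSketch {
    // Filled by the network layer when the server replies with a surrogate key.
    private final BlockingQueue<Integer> responses = new LinkedBlockingQueue<>();

    // Called from the channel handler on a successful reply; never called if the server failed.
    public void onResponse(int dictionaryValue) {
        responses.offer(dictionaryValue);
    }

    // Sends a dictionary-generation request (omitted here) and waits a bounded time for the key.
    public int getDictionary(String columnName, String data) throws InterruptedException {
        // sendRequest(columnName, data);  // hypothetical network call
        Integer value = responses.poll(5, TimeUnit.SECONDS);  // 5 s is an illustrative value only
        if (value == null) {
            // Same shape as the error reported in the load log above.
            throw new RuntimeException("Request timed out for key : DictionaryKey{ columnName='"
                    + columnName + "', data='" + data + "'}");
        }
        return value;
    }

    public static void main(String[] args) throws InterruptedException {
        DictionaryRequestTimeoutSketch client = new DictionaryRequestTimeoutSketch();
        // No reply ever arrives in this sketch, so the call ends in the timeout exception.
        client.getDictionary("ACTIVE_EMUI_VERSION", "ACTIVE_EMUI_VERSION_20014");
    }
}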