Posted by xm_zzc on Oct 16, 2018; 8:00am
URL: http://apache-carbondata-dev-mailing-list-archive.168.s1.nabble.com/java-lang-NegativeArraySizeException-occurred-when-compact-tp65023p65053.html
Hi Babu:
Thanks for your reply.
I set enable.unsafe.in.query.processing=false and enable.unsafe.columnpage=false, and the test still failed.
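(For reference, the two properties as set in carbon.properties; setting them through CarbonProperties.getInstance().addProperty should be equivalent:)

```properties
# disable unsafe/off-heap processing on the query path
enable.unsafe.in.query.processing=false
enable.unsafe.columnpage=false
```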
Following the test steps I mentioned above, I also copied the bad segment and read it with the SDK reader; that failed too, with the following error message:
java.lang.RuntimeException: java.lang.IllegalArgumentException
    at org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk.convertToDimColDataChunkWithOutCache(DimensionRawColumnChunk.java:120)
    at org.apache.carbondata.core.scan.result.BlockletScannedResult.fillDataChunks(BlockletScannedResult.java:355)
    at org.apache.carbondata.core.scan.result.BlockletScannedResult.hasNext(BlockletScannedResult.java:559)
    at org.apache.carbondata.core.scan.collector.impl.DictionaryBasedResultCollector.collectResultInRow(DictionaryBasedResultCollector.java:137)
    at org.apache.carbondata.core.scan.processor.DataBlockIterator.next(DataBlockIterator.java:109)
    at org.apache.carbondata.core.scan.result.iterator.DetailQueryResultIterator.getBatchResult(DetailQueryResultIterator.java:49)
    at org.apache.carbondata.core.scan.result.iterator.DetailQueryResultIterator.next(DetailQueryResultIterator.java:41)
    at org.apache.carbondata.core.scan.result.iterator.DetailQueryResultIterator.next(DetailQueryResultIterator.java:1)
    at org.apache.carbondata.core.scan.result.iterator.ChunkRowIterator.hasNext(ChunkRowIterator.java:58)
    at org.apache.carbondata.hadoop.CarbonRecordReader.nextKeyValue(CarbonRecordReader.java:104)
    at org.apache.carbondata.sdk.file.CarbonReader.hasNext(CarbonReader.java:71)
    at cn.xm.zzc.carbonsdktest.CarbonSDKTest.main(CarbonSDKTest.java:68)
Caused by: java.lang.IllegalArgumentException
    at java.nio.Buffer.position(Buffer.java:244)
    at org.apache.carbondata.core.datastore.chunk.store.impl.unsafe.UnsafeVariableLengthDimensionDataChunkStore.putArray(UnsafeVariableLengthDimensionDataChunkStore.java:97)
    at org.apache.carbondata.core.datastore.chunk.impl.VariableLengthDimensionColumnPage.<init>(VariableLengthDimensionColumnPage.java:58)
    at org.apache.carbondata.core.datastore.chunk.reader.dimension.v3.CompressedDimensionChunkFileBasedReaderV3.decodeDimensionLegacy(CompressedDimensionChunkFileBasedReaderV3.java:325)
    at org.apache.carbondata.core.datastore.chunk.reader.dimension.v3.CompressedDimensionChunkFileBasedReaderV3.decodeDimension(CompressedDimensionChunkFileBasedReaderV3.java:266)
    at org.apache.carbondata.core.datastore.chunk.reader.dimension.v3.CompressedDimensionChunkFileBasedReaderV3.decodeColumnPage(CompressedDimensionChunkFileBasedReaderV3.java:224)
    at org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk.convertToDimColDataChunkWithOutCache(DimensionRawColumnChunk.java:118)
    ... 11 more
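The read path in CarbonSDKTest is essentially the standard SDK read loop; a minimal sketch of what it does (the segment path and table name here are placeholders, not the actual test values):

```java
import java.io.IOException;
import org.apache.carbondata.sdk.file.CarbonReader;

public class CarbonSDKTest {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Placeholder path: in the real test this points at the copied bad segment.
        String path = "/tmp/bad_segment";
        CarbonReader reader = CarbonReader.builder(path, "_temp").build();
        try {
            // hasNext() drives the scan; the IllegalArgumentException surfaces here
            // once the scanner decodes the corrupted variable-length column page.
            while (reader.hasNext()) {
                Object[] row = (Object[]) reader.readNextRow();
            }
        } finally {
            reader.close();
        }
    }
}
```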
This happens for many records. When the error occurs, the parameter values inside
UnsafeVariableLengthDimensionDataChunkStore.putArray are as follows:
buffer.limit  buffer.cap  startOffset  lastLength  numberOfRows  this.dataPointersOffsets
288000        288000      300289       24433       32000         288000
448000        448000      464551       24927       32000         448000
384000        384000      -32566       -32568      32000         384000
480000        480000      -20257       -20259      32000         480000
96000         96000       96166        304         32000         96000
515278        515278      -12047       -12049      32000         515278
305953        305953      -8148        -8150       32000         305953
The field 'lastLength' is the last length read from the data before the error occurred.
In every failing row, startOffset is either greater than buffer.limit or negative, so Buffer.position throws the IllegalArgumentException.
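To confirm the mechanism: java.nio.Buffer.position rejects any position that is negative or greater than the limit. A small self-contained demo using values from the table above (the class and method names are mine, not CarbonData's):

```java
import java.nio.ByteBuffer;

public class BufferPositionDemo {
    // Returns true if setting the given position on a buffer of the given
    // capacity throws IllegalArgumentException (position > limit or negative).
    static boolean positionFails(int capacity, int position) {
        ByteBuffer buf = ByteBuffer.allocate(capacity);
        try {
            buf.position(position); // Buffer.position(Buffer.java:244) in the trace
            return false;
        } catch (IllegalArgumentException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        // Values from the failing rows above
        System.out.println(positionFails(288000, 300289)); // startOffset > limit -> true
        System.out.println(positionFails(384000, -32566)); // negative startOffset -> true
        System.out.println(positionFails(96000, 304));     // a sane offset -> false
    }
}
```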
I suspect the data was written incorrectly when this segment was generated, because of the issue you fixed: a MemoryBlock being cleaned by another thread during the write, producing corrupted data. Is that possible?