GitHub user ravipesala opened a pull request:

    https://github.com/apache/carbondata/pull/1810

    [WIP] Store carbondata locations in datamap to make datamap retrieval faster

    Currently carbondata block locations are fetched from the NameNode for each query, which makes queries slower. This PR stores the block locations while loading the datamap and retrieves them from it afterwards.

    Be sure to do all of the following checklist to help us incorporate your contribution quickly and easily:
    - [X] Any interfaces changed?
    - [X] Any backward compatibility impacted?
    - [X] Document update required?
    - [X] Testing done
    - [X] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/ravipesala/incubator-carbondata datamap-perf

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/1810.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #1810

----
commit 7b2ce4c58e688f6115bdbc382a1f95a9a3004f5f
Author: ravipesala <ravi.pesala@...>
Date: 2018-01-16T08:07:50Z

    Store carbondata locations in datamap to make the datamap retrieval faster.

---
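The idea in this PR — resolve block locations once while loading the datamap, then serve queries from that cache instead of asking the NameNode every time — can be sketched as follows. This is a minimal illustrative sketch, not the PR's actual code; `BlockLocationCache` and its methods are hypothetical names, and in CarbonData the locations would come from the file system (e.g. HDFS block location lookups) at load time.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: cache block locations at datamap-load time so the
// query path never needs a NameNode round trip.
public class BlockLocationCache {
  private final Map<String, String[]> locations = new HashMap<>();

  // Called once per block while loading the datamap.
  public void put(String blockPath, String[] hosts) {
    locations.put(blockPath, hosts);
  }

  // Query path: served entirely from the in-memory cache.
  public String[] get(String blockPath) {
    return locations.getOrDefault(blockPath, new String[0]);
  }

  public static void main(String[] args) {
    BlockLocationCache cache = new BlockLocationCache();
    cache.put("/store/part-0.carbondata", new String[] {"host1", "host2"});
    System.out.println(String.join(",", cache.get("/store/part-0.carbondata")));
    // prints "host1,host2"
  }
}
```

The trade-off is that cached locations can go stale if HDFS re-replicates blocks, but locality hints are advisory, so a stale entry only costs a non-local read.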
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1810

    Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2822/

---
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1810

    Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1595/

---
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1810

    Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2830/

---
Github user manishgupta88 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1810#discussion_r161739311

    --- Diff: core/src/main/java/org/apache/carbondata/core/indexstore/blockletindex/BlockletDataMap.java ---
    @@ -604,18 +731,26 @@ private ExtendedBlocklet createBlocklet(DataMapRow row, int blockletId) {
         detailInfo.setBlockletId((short) blockletId);
         detailInfo.setDimLens(columnCardinality);
         detailInfo.setSchemaUpdatedTimeStamp(row.getLong(SCHEMA_UPADATED_TIME_INDEX));
    -    BlockletInfo blockletInfo = new BlockletInfo();
    -    try {
    -      byte[] byteArray = row.getByteArray(BLOCK_INFO_INDEX);
    -      ByteArrayInputStream stream = new ByteArrayInputStream(byteArray);
    -      DataInputStream inputStream = new DataInputStream(stream);
    -      blockletInfo.readFields(inputStream);
    -      inputStream.close();
    -    } catch (IOException e) {
    -      throw new RuntimeException(e);
    +    byte[] byteArray = row.getByteArray(BLOCK_INFO_INDEX);
    +    BlockletInfo blockletInfo = null;
    +    if (byteArray.length > 0) {
    +      try {
    +        blockletInfo = new BlockletInfo();
    +        ByteArrayInputStream stream = new ByteArrayInputStream(byteArray);
    +        DataInputStream inputStream = new DataInputStream(stream);
    +        blockletInfo.readFields(inputStream);
    +        inputStream.close();
    +        blocklet.setLocation(
    --- End diff --

    I think this should be outside the if block, because even if the blocklet info does not exist (in the case of an old store) we still need to set the file location.

---
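The reviewer's point is a control-flow one: the `BlockletInfo` parse must stay guarded by the length check (rows written by an old store have no serialized info), but the location assignment should run unconditionally. A self-contained sketch of that flow, using simplified stand-in types rather than the real CarbonData classes:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

// Sketch of the review suggestion: parse the serialized info only when it
// exists, but set the location for every row, old store or new.
public class LocationOutsideGuard {
  static class Blocklet {
    String[] location;
    void setLocation(String[] loc) { this.location = loc; }
  }

  static String parse(byte[] bytes, Blocklet blocklet, String[] loc) {
    String info = null;
    if (bytes.length > 0) {  // new store: info bytes were serialized
      try (DataInputStream in = new DataInputStream(new ByteArrayInputStream(bytes))) {
        info = in.readUTF(); // stands in for blockletInfo.readFields(in)
      } catch (IOException e) {
        throw new RuntimeException(e);
      }
    }
    // Outside the guard, per the review: runs for old-store rows too.
    blocklet.setLocation(loc);
    return info;
  }

  public static void main(String[] args) {
    Blocklet oldStore = new Blocklet();
    parse(new byte[0], oldStore, new String[] {"host1"});
    System.out.println(oldStore.location[0]); // prints "host1"
  }
}
```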
Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1810

    SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2914/

---
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1810

    Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1605/

---
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1810

    Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2840/

---
Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1810

    SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2919/

---
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1810#discussion_r161820659

    --- Diff: core/src/main/java/org/apache/carbondata/core/indexstore/blockletindex/BlockletDataMap.java ---
    @@ -604,18 +731,26 @@ private ExtendedBlocklet createBlocklet(DataMapRow row, int blockletId) {
         detailInfo.setBlockletId((short) blockletId);
         detailInfo.setDimLens(columnCardinality);
         detailInfo.setSchemaUpdatedTimeStamp(row.getLong(SCHEMA_UPADATED_TIME_INDEX));
    -    BlockletInfo blockletInfo = new BlockletInfo();
    -    try {
    -      byte[] byteArray = row.getByteArray(BLOCK_INFO_INDEX);
    -      ByteArrayInputStream stream = new ByteArrayInputStream(byteArray);
    -      DataInputStream inputStream = new DataInputStream(stream);
    -      blockletInfo.readFields(inputStream);
    -      inputStream.close();
    -    } catch (IOException e) {
    -      throw new RuntimeException(e);
    +    byte[] byteArray = row.getByteArray(BLOCK_INFO_INDEX);
    +    BlockletInfo blockletInfo = null;
    +    if (byteArray.length > 0) {
    +      try {
    +        blockletInfo = new BlockletInfo();
    +        ByteArrayInputStream stream = new ByteArrayInputStream(byteArray);
    +        DataInputStream inputStream = new DataInputStream(stream);
    +        blockletInfo.readFields(inputStream);
    +        inputStream.close();
    +        blocklet.setLocation(
    --- End diff --

    ok

---
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1810

    Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1619/

---
Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1810

    SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2925/

---
Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1810

    SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2926/

---
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1810

    Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2860/

---
Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1810

    SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2942/

---
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1810

    Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1644/

---
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1810

    Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2877/

---
Github user manishgupta88 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1810#discussion_r162253805

    --- Diff: core/src/main/java/org/apache/carbondata/core/indexstore/blockletindex/BlockletDataMap.java ---
    @@ -598,16 +627,16 @@ public boolean isScanRequired(FilterResolverIntf filterExp) {
         if (filterExp == null) {
           int rowCount = unsafeMemoryDMStore.getRowCount();
           for (int i = 0; i < rowCount; i++) {
    -        DataMapRow unsafeRow = unsafeMemoryDMStore.getUnsafeRow(i);
    -        blocklets.add(createBlocklet(unsafeRow, i));
    +        DataMapRow safeRow = unsafeMemoryDMStore.getUnsafeRow(i).convertToSafeRow();
    +        blocklets.add(createBlocklet(safeRow, i));
           }
         } else {
           int startIndex = findStartIndex(convertToRow(searchStartKey), comparator);
           int endIndex = findEndIndex(convertToRow(searchEndKey), comparator);
           FilterExecuter filterExecuter =
               FilterUtil.getFilterExecuterTree(filterExp, segmentProperties, null);
           while (startIndex <= endIndex) {
    -        DataMapRow unsafeRow = unsafeMemoryDMStore.getUnsafeRow(startIndex);
    +        DataMapRow unsafeRow = unsafeMemoryDMStore.getUnsafeRow(startIndex).convertToSafeRow();
    --- End diff --

    Please rename this variable to safeRow as well.

---
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1810#discussion_r162288811

    --- Diff: core/src/main/java/org/apache/carbondata/core/indexstore/blockletindex/BlockletDataMap.java ---
    @@ -598,16 +627,16 @@ public boolean isScanRequired(FilterResolverIntf filterExp) {
         if (filterExp == null) {
           int rowCount = unsafeMemoryDMStore.getRowCount();
           for (int i = 0; i < rowCount; i++) {
    -        DataMapRow unsafeRow = unsafeMemoryDMStore.getUnsafeRow(i);
    -        blocklets.add(createBlocklet(unsafeRow, i));
    +        DataMapRow safeRow = unsafeMemoryDMStore.getUnsafeRow(i).convertToSafeRow();
    +        blocklets.add(createBlocklet(safeRow, i));
           }
         } else {
           int startIndex = findStartIndex(convertToRow(searchStartKey), comparator);
           int endIndex = findEndIndex(convertToRow(searchEndKey), comparator);
           FilterExecuter filterExecuter =
               FilterUtil.getFilterExecuterTree(filterExp, segmentProperties, null);
           while (startIndex <= endIndex) {
    -        DataMapRow unsafeRow = unsafeMemoryDMStore.getUnsafeRow(startIndex);
    +        DataMapRow unsafeRow = unsafeMemoryDMStore.getUnsafeRow(startIndex).convertToSafeRow();
    --- End diff --

    ok

---
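The rename requested above matters for readability: `convertToSafeRow()` copies a row backed by unsafe (off-heap) memory into a plain on-heap object, so after conversion the variable no longer holds an unsafe row. A simplified sketch of the pattern, with stand-in types rather than CarbonData's real `DataMapRow`:

```java
// Illustrative stand-ins: an "unsafe" row reads from a backing store that may
// be off-heap; convertToSafeRow() materializes an on-heap copy whose fields
// stay valid even after the backing memory is freed or reused.
public class SafeRowDemo {
  static class SafeRow {
    private final long[] values;           // plain on-heap copy
    SafeRow(long[] values) { this.values = values; }
    long getLong(int ordinal) { return values[ordinal]; }
  }

  static class UnsafeRow {
    private final long[] backing;          // stands in for off-heap memory
    UnsafeRow(long[] backing) { this.backing = backing; }
    // Copy every field out of the backing store in one pass.
    SafeRow convertToSafeRow() { return new SafeRow(backing.clone()); }
  }

  public static void main(String[] args) {
    // After conversion, naming the variable safeRow matches what it holds.
    SafeRow safeRow = new UnsafeRow(new long[] {42L}).convertToSafeRow();
    System.out.println(safeRow.getLong(0)); // prints 42
  }
}
```

Converting once up front also means later field accesses (as in `createBlocklet`) do not repeatedly touch unsafe memory.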
Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1810

    SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2969/

---