Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2134
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4773/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2134
Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3554/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2134
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4780/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2134
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4282/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2134
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4783/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2134
Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3558/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2134
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4286/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2134
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4812/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2134
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3588/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2134
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4815/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2134
Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3591/
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2134#discussion_r179339971

Diff: core/src/main/java/org/apache/carbondata/core/datamap/dev/DataMap.java
@@ -67,4 +68,14 @@
    */
   void clear();

+  /**
+   * This method will be required for dataMaps that require 2 stage of construction.
+   * Ideal scenario will be first stage contains all the processing logic and second
+   * stage includes updating to database.
+   * Method usage can differ based on scenario and implementation
+   *
+   * @throws MemoryException
+   */
+  void commit() throws MemoryException, IOException;

End diff

I think it does not make sense to have this method at the interface level. Caching should be restricted to DataMapFactory, not to the level of DataMap.
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2134#discussion_r179340305

Diff: core/src/main/java/org/apache/carbondata/core/datamap/dev/DataMapFactory.java
@@ -62,6 +63,12 @@
    */
   void fireEvent(Event event);

+  /**
+   * Add the dataMap to cache
+   * @param dataMap
+   */
+  void addDataMapToCache(DataMap dataMap) throws IOException, MemoryException;

End diff

Better to add these methods in another interface, CacheableDataMap, and have BlockletDataMapFactory implement it directly:

```
CacheableDataMap {
  void cache(List<DataMap> dataMaps)
  List<DataMapDistributable> getAllUncachedDataMaps()
}
```
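The interface split sketched above could be fleshed out roughly as follows. This is only an illustrative sketch: the `DataMap` and `DataMapDistributable` stand-ins and the `ToyBlockletDataMapFactory` class below are simplified placeholders, not the actual CarbonData types or their real signatures.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Simplified stand-ins for the CarbonData types referenced in the review.
interface DataMap { String name(); }
interface DataMapDistributable { String segment(); }

// The caching contract lives in its own interface, so only factories that
// actually support caching need to implement it.
interface CacheableDataMap {
  // Cache the given dataMaps so subsequent lookups avoid rebuilding them.
  void cache(List<DataMap> dataMaps);
  // Report which distributable units have not been loaded into the cache yet.
  List<DataMapDistributable> getAllUncachedDataMaps();
}

// Analogous to BlockletDataMapFactory implementing CacheableDataMap directly.
class ToyBlockletDataMapFactory implements CacheableDataMap {
  private final List<DataMap> cached = new ArrayList<>();
  private final List<DataMapDistributable> pending = new ArrayList<>();

  ToyBlockletDataMapFactory(List<DataMapDistributable> all) {
    pending.addAll(all);
  }

  @Override public void cache(List<DataMap> dataMaps) {
    cached.addAll(dataMaps);
    // Assume one distributable per dataMap, keyed by name, for this toy example.
    for (DataMap dm : dataMaps) {
      pending.removeIf(d -> d.segment().equals(dm.name()));
    }
  }

  @Override public List<DataMapDistributable> getAllUncachedDataMaps() {
    return new ArrayList<>(pending);
  }
}
```

The advantage over `addDataMapToCache` on `DataMapFactory` is that callers can probe with `instanceof CacheableDataMap` and other factory implementations are not forced to provide empty caching methods.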
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2134#discussion_r179341377

Diff: core/src/main/java/org/apache/carbondata/core/indexstore/UnsafeMemoryDMStore.java
@@ -52,11 +62,23 @@
   private final long taskId = ThreadLocalTaskInfo.getCarbonTaskInfo().getTaskId();

-  public UnsafeMemoryDMStore(CarbonRowSchema[] schema) throws MemoryException {
+  private boolean addRowToUnsafe = true;

End diff

The class name itself is `UnsafeMemoryDMStore`, so it does not make sense to have this variable. Please create an abstract class and provide two implementations, safe and unsafe.
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2134#discussion_r179341666

Diff: core/src/main/java/org/apache/carbondata/core/indexstore/blockletindex/BlockletDataMap.java
@@ -186,23 +193,28 @@ public void init(DataMapModel dataMapModel) throws IOException, MemoryException
         }
       }
     }
-    if (unsafeMemoryDMStore != null) {
-      unsafeMemoryDMStore.finishWriting();
-    }
     if (null != unsafeMemorySummaryDMStore) {
       addTaskSummaryRowToUnsafeMemoryStore(
           summaryRow, schemaBinary, filePath, fileName, segmentId);
-      unsafeMemorySummaryDMStore.finishWriting();
     }
     LOGGER.info(
         "Time taken to load blocklet datamap from file : " + dataMapModel.getFilePath() + "is " + (
             System.currentTimeMillis() - startTime));
   }

+  @Override public void commit() throws MemoryException, IOException {

End diff

This is specific to BlockletDataMap, so let's keep it only in this class, not at the interface level. Also, I am not sure why we should separate out a commit method. Is there any way to avoid this method?
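The alternative the reviewer raises can be sketched as follows: the finish/commit step stays an internal detail of the concrete class (invoked from `init()`, as the removed `finishWriting()` calls were), so the `DataMap` interface does not need a `commit()` method at all. This is a hedged toy sketch; the `DataMap` stand-in and `ToyBlockletDataMap` class are hypothetical and much simpler than the real CarbonData classes.

```java
// Simplified stand-in for the DataMap interface, without commit().
interface DataMap {
  void init(String filePath);
  void clear();
}

class ToyBlockletDataMap implements DataMap {
  private boolean writingFinished = false;
  private final StringBuilder store = new StringBuilder();

  @Override public void init(String filePath) {
    store.append("loaded:").append(filePath);
    // The commit step runs inside init() itself, so callers that only know
    // the DataMap interface never need to see it.
    commit();
  }

  // Specific to the blocklet datamap; kept private to this implementation.
  private void commit() { writingFinished = true; }

  @Override public void clear() {
    store.setLength(0);
    writingFinished = false;
  }

  boolean isCommitted() { return writingFinished; }
}
```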
Github user dhatchayani commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2134#discussion_r179416485

Diff: core/src/main/java/org/apache/carbondata/core/indexstore/UnsafeMemoryDMStore.java
@@ -52,11 +62,23 @@
   private final long taskId = ThreadLocalTaskInfo.getCarbonTaskInfo().getTaskId();

-  public UnsafeMemoryDMStore(CarbonRowSchema[] schema) throws MemoryException {
+  private boolean addRowToUnsafe = true;

End diff

This variable decides whether to write rows to unsafe memory one by one, or to collect all rows first and then write them all to unsafe memory.
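The refactor suggested in the earlier comment, an abstract store with a "safe" (buffer on heap, flush at the end) and an "unsafe" (write immediately) implementation instead of a boolean flag, could look roughly like this. All class and method names below are simplified stand-ins invented for illustration, not the actual CarbonData API; a counter stands in for the real off-heap writes.

```java
import java.util.ArrayList;
import java.util.List;

// Common contract for both memory stores.
abstract class AbstractDMStore {
  abstract void addRow(byte[] row);   // buffer or write immediately
  abstract int writtenRowCount();     // rows actually written to the store
  abstract void finishWriting();      // flush anything still buffered
}

// "Safe" variant: collect rows on the Java heap, write them all at the end.
class SafeMemoryDMStore extends AbstractDMStore {
  private final List<byte[]> buffered = new ArrayList<>();
  private int written = 0;

  @Override void addRow(byte[] row) { buffered.add(row); }
  @Override int writtenRowCount() { return written; }
  @Override void finishWriting() {
    written += buffered.size();  // stand-in for a bulk copy to off-heap memory
    buffered.clear();
  }
}

// "Unsafe" variant: write each row as it arrives; nothing left to flush.
class UnsafeMemoryDMStore extends AbstractDMStore {
  private int written = 0;

  @Override void addRow(byte[] row) { written++; }  // stand-in for an off-heap write
  @Override int writtenRowCount() { return written; }
  @Override void finishWriting() { /* nothing buffered */ }
}
```

With this shape, callers pick the behavior by choosing a concrete class, and `UnsafeMemoryDMStore` no longer carries a flag that contradicts its own name.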
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2134
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3613/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2134
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4836/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2134
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/4840/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2134
Build Failed with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/3618/