Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1471
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1152/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1471
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1155/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1471
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1156/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1471
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1733/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1471
retest this please
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1471
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1159/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1471
retest this please
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1471
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1162/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1471
retest sdv please
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1471
SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/1743/
Github user chenliang613 commented on the issue:
https://github.com/apache/carbondata/pull/1471
retest this please
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1471
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/1200/
Github user chenliang613 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1471#discussion_r151595971

--- Diff: core/src/main/java/org/apache/carbondata/core/datamap/DataMapMeta.java ---
@@ -19,15 +19,15 @@
 import java.util.List;

-import org.apache.carbondata.core.indexstore.schema.FilterType;
+import org.apache.carbondata.core.scan.filter.intf.ExpressionType;

 public class DataMapMeta {

   private List<String> indexedColumns;

-  private FilterType optimizedOperation;
+  private List<ExpressionType> optimizedOperation;
--- End diff --

In ExpressionType there is no "like" expression.
Github user chenliang613 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1471#discussion_r151598459

--- Diff: core/src/main/java/org/apache/carbondata/core/datamap/dev/AbstractDataMapWriter.java ---
@@ -0,0 +1,110 @@
+/*
--- End diff --

Can you explain why "DataMapWriter.java" was changed to "AbstractDataMapWriter.java"? Is it to make it easier for users to customize other types of datamap writers?
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1471#discussion_r151615263

--- Diff: processing/src/main/java/org/apache/carbondata/processing/store/writer/AbstractFactDataWriter.java ---
@@ -574,7 +482,9 @@ private CopyThread(String fileName) {
     * @throws Exception if unable to compute a result
     */
    @Override public Void call() throws Exception {
-      copyCarbonDataFileToCarbonStorePath(fileName);
+      CarbonUtil.copyCarbonDataFileToCarbonStorePath(fileName,
--- End diff --

move parameter to next line
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1471#discussion_r151615573

--- Diff: hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonTableInputFormat.java ---
@@ -755,7 +758,8 @@ private CarbonInputSplit convertToCarbonInputSplit(ExtendedBlocklet blocklet)
         org.apache.carbondata.hadoop.CarbonInputSplit.from(blocklet.getSegmentId(),
             new FileSplit(new Path(blocklet.getPath()), 0, blocklet.getLength(),
                 blocklet.getLocations()),
-            ColumnarFormatVersion.valueOf((short) blocklet.getDetailInfo().getVersionNumber()));
+            ColumnarFormatVersion.valueOf((short) blocklet.getDetailInfo().getVersionNumber()),
+        blocklet.getDataMapWriterPath());
--- End diff --

indentation not correct
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1471#discussion_r151615805

--- Diff: core/src/main/java/org/apache/carbondata/core/datamap/dev/AbstractDataMapWriter.java ---
@@ -0,0 +1,110 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.carbondata.core.datamap.dev;
+
+import java.io.IOException;
+
+import org.apache.carbondata.core.datastore.impl.FileFactory;
+import org.apache.carbondata.core.datastore.page.ColumnPage;
+import org.apache.carbondata.core.metadata.AbsoluteTableIdentifier;
+import org.apache.carbondata.core.util.CarbonUtil;
+import org.apache.carbondata.core.util.path.CarbonTablePath;
+
+/**
+ * Data Map writer
+ */
+public abstract class AbstractDataMapWriter {
+
+  protected AbsoluteTableIdentifier identifier;
+
+  protected String segmentId;
+
+  protected String writeDirectoryPath;
+
+  public AbstractDataMapWriter(AbsoluteTableIdentifier identifier, String segmentId,
+      String writeDirectoryPath) {
+    this.identifier = identifier;
+    this.segmentId = segmentId;
+    this.writeDirectoryPath = writeDirectoryPath;
+  }
+
+  /**
+   * Start of new block notification.
+   *
+   * @param blockId file name of the carbondata file
+   */
+  public abstract void onBlockStart(String blockId);
+
+  /**
+   * End of block notification
+   */
+  public abstract void onBlockEnd(String blockId);
+
+  /**
+   * Start of new blocklet notification.
+   *
+   * @param blockletId sequence number of blocklet in the block
+   */
+  public abstract void onBlockletStart(int blockletId);
+
+  /**
+   * End of blocklet notification
+   *
+   * @param blockletId sequence number of blocklet in the block
+   */
+  public abstract void onBlockletEnd(int blockletId);
+
+  /**
+   * Add the column pages row to the datamap, order of pages is same as `indexColumns` in
+   * DataMapMeta returned in DataMapFactory.
+   * Implementation should copy the content of `pages` as needed, because `pages` memory
+   * may be freed after this method returns, if using unsafe column page.
+   */
+  public abstract void onPageAdded(int blockletId, int pageId, ColumnPage[] pages);
+
+  /**
+   * This is called during closing of writer.So after this call no more data will be sent to this
+   * class.
+   */
+  public abstract void finish();
+
+  /**
+   * It copies the file from temp folder to actual folder
+   *
+   * @param dataMapFile
+   * @throws IOException
+   */
+  protected void commitFile(String dataMapFile) throws IOException {
--- End diff --

What if anything fails inside this function? Who will catch the IOException and handle it?
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1471#discussion_r151636955

--- Diff: core/src/main/java/org/apache/carbondata/core/datamap/DataMapMeta.java ---
@@ -19,15 +19,15 @@
 import java.util.List;

-import org.apache.carbondata.core.indexstore.schema.FilterType;
+import org.apache.carbondata.core.scan.filter.intf.ExpressionType;

 public class DataMapMeta {

   private List<String> indexedColumns;

-  private FilterType optimizedOperation;
+  private List<ExpressionType> optimizedOperation;
--- End diff --

Currently, the LIKE expression is converted to greater-than and less-than-or-equal-to filters, so there is no LIKE expression in the types.
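For illustration, the snippet below is a minimal sketch of how a datamap could declare the operations it optimizes under the new List<ExpressionType> signature. The DataMapMeta constructor shape, the example class name, and the enum constants GREATERTHAN and LESSTHAN_EQUALTO are assumptions based on this discussion rather than code from the PR; the point is only that a LIKE predicate reaches the datamap already rewritten as a range, so no LIKE constant is required.

import java.util.Arrays;
import java.util.List;

import org.apache.carbondata.core.datamap.DataMapMeta;
import org.apache.carbondata.core.scan.filter.intf.ExpressionType;

public class ExampleDataMapMetaBuilder {

  public static DataMapMeta buildMeta() {
    // Assumed constant names: col LIKE 'abc%' is rewritten by the optimizer
    // into a greater-than / less-than-or-equal-to range on col, so the datamap
    // only needs to advertise the range operators.
    List<ExpressionType> optimizedOperations = Arrays.asList(
        ExpressionType.GREATERTHAN,
        ExpressionType.LESSTHAN_EQUALTO);
    // Assumed constructor (indexedColumns, optimizedOperation), mirroring the
    // two fields shown in the diff above.
    return new DataMapMeta(Arrays.asList("name"), optimizedOperations);
  }
}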
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1471#discussion_r151637482

--- Diff: core/src/main/java/org/apache/carbondata/core/datamap/dev/AbstractDataMapWriter.java ---
@@ -0,0 +1,110 @@
+/*
--- End diff --

It was changed to an abstract class to force the user to pass the needed parameters through the constructor. The concrete method `commitFile` is also added to this class.
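To make the design point concrete, here is a rough sketch of a custom writer built against the abstract methods quoted earlier in this thread. The class and field names are hypothetical and this is not code from the PR; it only shows that the constructor forces the identifier, segment id and output directory to be supplied, and that an implementation has to deal with the IOException from the inherited commitFile itself, since finish() does not declare it (the concern raised above).

import java.io.IOException;

import org.apache.carbondata.core.datamap.dev.AbstractDataMapWriter;
import org.apache.carbondata.core.datastore.page.ColumnPage;
import org.apache.carbondata.core.metadata.AbsoluteTableIdentifier;

public class MinMaxDataMapWriter extends AbstractDataMapWriter {

  // Hypothetical staging location inside the writer's temp directory.
  private final String indexFilePath;

  public MinMaxDataMapWriter(AbsoluteTableIdentifier identifier, String segmentId,
      String writeDirectoryPath) {
    // The abstract class makes these three parameters mandatory.
    super(identifier, segmentId, writeDirectoryPath);
    this.indexFilePath = writeDirectoryPath + "/minmax.datamap";
  }

  @Override public void onBlockStart(String blockId) {
    // start collecting index data for a new carbondata file
  }

  @Override public void onBlockEnd(String blockId) {
    // flush index data collected for the finished block to indexFilePath
  }

  @Override public void onBlockletStart(int blockletId) {
  }

  @Override public void onBlockletEnd(int blockletId) {
  }

  @Override public void onPageAdded(int blockletId, int pageId, ColumnPage[] pages) {
    // Copy whatever is needed out of `pages` here; their memory may be freed
    // after this call when unsafe column pages are used.
  }

  @Override public void finish() {
    try {
      // Publish the staged index file from the temp folder to the store path.
      commitFile(indexFilePath);
    } catch (IOException e) {
      // finish() does not declare IOException, so the implementation must
      // handle or wrap the failure itself.
      throw new RuntimeException("Failed to commit datamap file: " + indexFilePath, e);
    }
  }
}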
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1471#discussion_r151637852

--- Diff: processing/src/main/java/org/apache/carbondata/processing/store/writer/AbstractFactDataWriter.java ---
@@ -574,7 +482,9 @@ private CopyThread(String fileName) {
     * @throws Exception if unable to compute a result
     */
    @Override public Void call() throws Exception {
-      copyCarbonDataFileToCarbonStorePath(fileName);
+      CarbonUtil.copyCarbonDataFileToCarbonStorePath(fileName,
--- End diff --

ok