Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1398

Build Success with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/232/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1398

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/356/

---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1398

SDV Build Fail, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/984/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1398

Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/357/

---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1398

Build Success with Spark 1.6, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/233/

---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1398

SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/985/

---
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1398#discussion_r143612778

--- Diff: core/src/main/java/org/apache/carbondata/core/metadata/blocklet/BlockletInfo.java ---

@@ -214,6 +218,42 @@ public void setNumberOfPages(int numberOfPages) {
     for (int i = 0; i < mSize; i++) {
       output.writeInt(measureChunksLength.get(i));
     }
+    // Serialize datachunks as well for older versions like V1 and V2

--- End diff --

Can you wrap this logic into a function and mention that it is for V1 and V2 serialization only? I think it would be more readable.

---
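As an aside, a minimal sketch of the kind of extraction being suggested here. The class, field, and method names below (BlockletInfoSketch, serializedDataChunks, writeChunkInfoForOlderVersions) are hypothetical stand-ins, not the actual BlockletInfo members:

import java.io.DataOutput;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Illustrative stand-in for BlockletInfo; the real class has many more fields.
public class BlockletInfoSketch {

  // Hypothetical field: legacy DataChunk metadata, already serialized to bytes,
  // that only V1 and V2 format files need.
  private List<byte[]> serializedDataChunks = new ArrayList<>();

  public void writeExternal(DataOutput output) throws IOException {
    // ... the existing offset/length serialization from the diff would stay here ...
    writeChunkInfoForOlderVersions(output);  // extracted call, V1/V2 only
  }

  /** Serializes the legacy DataChunk metadata; only needed for V1 and V2 format files. */
  private void writeChunkInfoForOlderVersions(DataOutput output) throws IOException {
    output.writeShort(serializedDataChunks.size());
    for (byte[] chunk : serializedDataChunks) {
      output.writeInt(chunk.length);
      output.write(chunk);
    }
  }
}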
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1398#discussion_r143612831

--- Diff: core/src/main/java/org/apache/carbondata/core/metadata/blocklet/BlockletInfo.java ---

@@ -238,6 +278,20 @@ public void setNumberOfPages(int numberOfPages) {
     for (int i = 0; i < measureChunkOffsetsSize; i++) {
       measureChunksLength.add(input.readInt());
     }
-
+    // Deserialize datachunks as well for older versions like V1 and V2

--- End diff --

Can you wrap this logic into a function and mention that it is for V1 and V2 deserialization only? I think it would be more readable.

---
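The matching read-side helper, continuing the same hypothetical BlockletInfoSketch class from the earlier sketch (java.io.DataInput would be added to its imports); again, only the shape of the logic mirrors the diff above:

  /** Reads back the legacy DataChunk metadata written above; only V1 and V2 format files carry it. */
  private void readChunkInfoForOlderVersions(DataInput input) throws IOException {
    short count = input.readShort();
    serializedDataChunks = new ArrayList<>(count);
    for (int i = 0; i < count; i++) {
      byte[] chunk = new byte[input.readInt()];
      input.readFully(chunk);              // fixed-length read of one serialized DataChunk
      serializedDataChunks.add(chunk);
    }
  }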
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1398#discussion_r143612901

--- Diff: core/src/main/java/org/apache/carbondata/core/metadata/datatype/DataType.java ---

@@ -34,11 +34,13 @@
   STRUCT(10, "STRUCT", -1),
   MAP(11, "MAP", -1),
   BYTE(12, "BYTE", 1),
-  // internal use only, for variable length data type
   BYTE_ARRAY(13, "BYTE_ARRAY", -1),
+

--- End diff --

remove empty line

---
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1398#discussion_r143612947

--- Diff: core/src/main/java/org/apache/carbondata/core/metadata/datatype/DataType.java ---

@@ -34,11 +34,13 @@
   STRUCT(10, "STRUCT", -1),
   MAP(11, "MAP", -1),
   BYTE(12, "BYTE", 1),
-  // internal use only, for variable length data type
   BYTE_ARRAY(13, "BYTE_ARRAY", -1),
+
   // internal use only, for value compression from integer/long to 3 bytes value
-  SHORT_INT(14, "SHORT_INT", 3);
+  SHORT_INT(14, "SHORT_INT", 3),
+  // Only for internal use for backward compatability

--- End diff --

Please mention that it is for the V1 and V2 formats only.

---
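For illustration only, the documented entries might read as follows once the comment is expanded. The new value's name and id are assumptions (the diff is cut off before it), and the real enum has more values than shown here:

public enum DataTypeSketch {
  // internal use only, for value compression from integer/long to 3 bytes value
  SHORT_INT(14, "SHORT_INT", 3),
  // internal use only, kept for backward compatibility with the V1 and V2 file formats
  LEGACY_LONG(15, "LEGACY_LONG", 8);  // hypothetical name/id; the PR's actual value is not visible in this diff

  private final int id;
  private final String name;
  private final int sizeInBytes;

  DataTypeSketch(int id, String name, int sizeInBytes) {
    this.id = id;
    this.name = name;
    this.sizeInBytes = sizeInBytes;
  }
}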
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1398#discussion_r143613282

--- Diff: core/src/main/java/org/apache/carbondata/core/util/CarbonUtil.java ---

@@ -2070,6 +1994,20 @@ public static void dropDatabaseDirectory(String dbName, String storePath)
     }
   }
 
+  public static DataType getDataType(char type) {

--- End diff --

I suggest moving all of these conversions into the DataType enum, including this one and `ColumnPageEncoderMeta.convertType`. Also move `CarbonCommonConstants.BIG_INT_MEASURE` and the related constants into the DataType enum.

---
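A rough sketch of the refactor being suggested: a static factory on the enum replacing CarbonUtil.getDataType(char). The char codes below are illustrative assumptions, standing in for the CarbonCommonConstants measure constants the comment proposes to move:

public enum DataTypeConversionSketch {
  LONG, DOUBLE, DECIMAL;

  // Hypothetical char codes standing in for CarbonCommonConstants.BIG_INT_MEASURE and friends.
  public static DataTypeConversionSketch fromChar(char type) {
    switch (type) {
      case 'l': return LONG;     // assumed code for big-int measures
      case 'n': return DOUBLE;   // assumed code for double measures
      case 'b': return DECIMAL;  // assumed code for big-decimal measures
      default:
        throw new IllegalArgumentException("Unknown data type code: " + type);
    }
  }
}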
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1398#discussion_r143613491

--- Diff: core/src/main/java/org/apache/carbondata/core/util/DataFileFooterConverter.java ---

@@ -123,6 +123,25 @@ private BlockletInfo getBlockletInfo(
   }
 
   @Override public List<ColumnSchema> getSchema(TableBlockInfo tableBlockInfo) throws IOException {
-    return null;
+    FileHolder fileReader = null;

--- End diff --

Is this change related to this PR? Can you add a comment describing the intention of this function in its abstract interface?

---
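As a generic illustration of the pattern this getSchema implementation appears to follow (open the data file, read the footer bytes at the recorded offset, convert them into ColumnSchema objects, and always release the reader), here is a self-contained sketch. It uses plain java.io rather than CarbonData's FileHolder and footer-converter classes, and all names are hypothetical:

import java.io.IOException;
import java.io.RandomAccessFile;

public final class FooterSchemaReaderSketch {

  /** Reads the raw footer bytes of a data file; deserializing them into ColumnSchema objects would follow. */
  public static byte[] readFooterBytes(String filePath, long footerOffset, int footerLength)
      throws IOException {
    RandomAccessFile file = null;
    try {
      file = new RandomAccessFile(filePath, "r");
      file.seek(footerOffset);             // the footer sits at a known offset near the end of the file
      byte[] footer = new byte[footerLength];
      file.readFully(footer);
      return footer;
    } finally {
      if (file != null) {
        file.close();                      // always release the file handle, as the PR's fileReader presumably is
      }
    }
  }
}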
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1398#discussion_r143613673

--- Diff: integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/CarbonV1toV3CompatabilityTestCase.scala ---

@@ -0,0 +1,87 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.cluster.sdv.generated
+
+import org.apache.spark.sql.common.util.QueryTest
+import org.apache.spark.sql.hive.CarbonSessionState
+import org.apache.spark.sql.test.TestQueryExecutor
+import org.apache.spark.sql.{CarbonEnv, CarbonSession, Row, SparkSession}
+import org.scalatest.BeforeAndAfterAll
+
+import org.apache.carbondata.core.constants.CarbonCommonConstants
+import org.apache.carbondata.core.util.CarbonProperties
+
+/**
+ * V1 to V3 compatability test. This test has to be at last

--- End diff --

Will you add a V2 to V3 compatibility test in a future PR?

---
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1398#discussion_r143613824

--- Diff: integration/spark2/src/main/scala/org/apache/spark/sql/CarbonSession.scala ---

@@ -43,7 +43,7 @@ class CarbonSession(@transient val sc: SparkContext,
   }
 
   @transient
-  override private[sql] lazy val sessionState: SessionState = new CarbonSessionState(this)
+  override lazy val sessionState: SessionState = new CarbonSessionState(this)

--- End diff --

Is this change required? Will anything outside the sql package read this variable?

---
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1398#discussion_r143715237

--- Diff: core/src/main/java/org/apache/carbondata/core/metadata/blocklet/BlockletInfo.java ---

ok

---
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1398#discussion_r143715262

--- Diff: core/src/main/java/org/apache/carbondata/core/metadata/blocklet/BlockletInfo.java ---

ok

---
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1398#discussion_r143715741

--- Diff: core/src/main/java/org/apache/carbondata/core/metadata/datatype/DataType.java ---

ok

---
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1398#discussion_r143715769

--- Diff: core/src/main/java/org/apache/carbondata/core/metadata/datatype/DataType.java ---

ok

---
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1398#discussion_r143815676

--- Diff: core/src/main/java/org/apache/carbondata/core/util/CarbonUtil.java ---

ok

---
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1398#discussion_r143815996

--- Diff: core/src/main/java/org/apache/carbondata/core/util/DataFileFooterConverter.java ---

Yes, it is part of this PR. We need to get the ColumnSchema.

---