GitHub user sounakr opened a pull request:
https://github.com/apache/carbondata/pull/1686

[CARBONDATA-1914] [WIP] Cache Access

Be sure to do all of the following checklist to help us incorporate your contribution quickly and easily:

- [ ] Any interfaces changed?
- [ ] Any backward compatibility impacted?
- [ ] Document update required?
- [ ] Testing done
      Please provide details on
      - Whether new unit test cases have been added or why no new tests are required?
      - How it is tested? Please attach test report.
      - Is it a performance related change? Please attach the performance test report.
      - Any additional information to help reviewers in testing this change.
- [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/sounakr/incubator-carbondata dictionary_cache

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/1686.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #1686

----

commit 8d1fd018a7429d4dd52bc1da8e1b4b60b3f59256
Author: sounakr <sounakr@...>
Date:   2017-12-20T03:46:22Z

    Cache Access

----
--- |
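For readers following the patch, the heart of the change is exercising the dictionary LRU cache whenever a dictionary column is accessed. The sketch below is distilled from the DictionaryLRUCacheTestCase quoted later in this thread; the wrapper object and method name are illustrative only, the CarbonTable import path is assumed from the core module, and the placeholder comment marks where the actual cache lookup and access-count assertion would go.

    import scala.collection.JavaConverters._

    import org.apache.carbondata.core.cache.dictionary.DictionaryColumnUniqueIdentifier
    import org.apache.carbondata.core.metadata.schema.table.CarbonTable
    import org.apache.carbondata.core.util.CarbonUtil

    object DictionaryCacheProbe {
      // Walk the table's dimensions and build the identifier that serves as the key
      // into the dictionary LRU cache for each column.
      def probe(carbonTable: CarbonTable): Unit = {
        val absoluteTableIdentifier = carbonTable.getAbsoluteTableIdentifier
        carbonTable.getAllDimensions.asScala.foreach { dim =>
          val columnIdentifier = dim.getColumnIdentifier
          val identifier = new DictionaryColumnUniqueIdentifier(
            absoluteTableIdentifier,
            columnIdentifier,
            columnIdentifier.getDataType)
          // Only columns that actually have a dictionary file on disk can be cached.
          if (CarbonUtil.isFileExistsForGivenColumn(identifier)) {
            // ...look up `identifier` in the dictionary LRU cache here and assert on
            // the expected access count (see DictionaryLRUCacheTestCase below)...
          }
        }
      }
    }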
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1686 SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2442/ --- |
Github user sounakr commented on the issue:
https://github.com/apache/carbondata/pull/1686 Retest this please. --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1686 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2221/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1686 Build Success with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/999/ --- |
Github user gvramana commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1686#discussion_r158283837

--- Diff: examples/spark2/src/main/scala/org/apache/carbondata/examples/ExampleUtils.scala ---
@@ -43,6 +43,8 @@ object ExampleUtils {
       .addProperty(CarbonCommonConstants.CARBON_DATE_FORMAT, "yyyy/MM/dd")
       .addProperty(CarbonCommonConstants.ENABLE_UNSAFE_COLUMN_PAGE_LOADING, "true")
       .addProperty(CarbonCommonConstants.CARBON_BADRECORDS_LOC, "")
+      .addProperty(CarbonCommonConstants.CARBON_MAX_DRIVER_LRU_CACHE_SIZE, "1")
+      .addProperty(CarbonCommonConstants.CARBON_MAX_EXECUTOR_LRU_CACHE_SIZE, "1")
--- End diff --

Use only for test
--- |
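The concern is that capping the LRU cache at "1" in the shared ExampleUtils would throttle every example run, not just this test. A minimal sketch of the direction the review points at, assuming the limits are instead set in the test suite's own setup; the class name here is invented, while Spark2QueryTest and BeforeAndAfterAll are the ones used in the test file quoted in the next comments.

    import org.scalatest.BeforeAndAfterAll

    import org.apache.spark.sql.common.util.Spark2QueryTest

    import org.apache.carbondata.core.constants.CarbonCommonConstants
    import org.apache.carbondata.core.util.CarbonProperties

    class DictionaryLRUCacheSuiteSetup extends Spark2QueryTest with BeforeAndAfterAll {

      override def beforeAll(): Unit = {
        super.beforeAll()
        // Tiny LRU limits so eviction can be exercised with small test tables;
        // scoped to this suite rather than to the shared example configuration.
        CarbonProperties.getInstance()
          .addProperty(CarbonCommonConstants.CARBON_MAX_DRIVER_LRU_CACHE_SIZE, "1")
          .addProperty(CarbonCommonConstants.CARBON_MAX_EXECUTOR_LRU_CACHE_SIZE, "1")
      }

      // ...test cases exercising dictionary cache access go here...
    }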
Github user gvramana commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1686#discussion_r158285314

--- Diff: integration/spark2/src/test/scala/org/apache/carbondata/spark/util/DictionaryLRUCacheTestCase.scala ---
@@ -0,0 +1,424 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.carbondata.spark.util
+
+import java.io.File
+
+import scala.collection.JavaConverters._
+
+import org.apache.spark.sql.common.util.Spark2QueryTest
+import org.apache.spark.sql.hive.CarbonRelation
+import org.apache.spark.sql.test.Spark2TestQueryExecutor
+import org.apache.spark.sql.{CarbonEnv, SparkSession}
+import org.scalatest.BeforeAndAfterAll
+
+import org.apache.carbondata.core.cache.{Cache, CacheProvider, CacheType}
+import org.apache.carbondata.core.cache.dictionary.{Dictionary, DictionaryColumnUniqueIdentifier}
+import org.apache.carbondata.core.constants.CarbonCommonConstants
+import org.apache.carbondata.core.util.{CarbonProperties, CarbonUtil}
+
+/**
+ * Test Case for org.apache.carbondata.integration.spark.util.GlobalDictionaryUtil
+ */
+class DictionaryLRUCacheTestCase extends Spark2QueryTest with BeforeAndAfterAll {
+  var spark : SparkSession = null
+  var path : String = null
+
+  def createCarbonSession(appName: String): SparkSession = {
+    val rootPath = new File(this.getClass.getResource("/").getPath
+                            + "../../../..").getCanonicalPath
+    val storeLocation = s"$rootPath/examples/spark2/target/store"
+    val warehouse = s"$rootPath/examples/spark2/target/warehouse"
+
+    CarbonProperties.getInstance()
+      .addProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT, "yyyy/MM/dd HH:mm:ss")
+      .addProperty(CarbonCommonConstants.CARBON_DATE_FORMAT, "yyyy/MM/dd")
+      .addProperty(CarbonCommonConstants.ENABLE_UNSAFE_COLUMN_PAGE_LOADING, "true")
+      .addProperty(CarbonCommonConstants.CARBON_BADRECORDS_LOC, "")
--- End diff --

These values should be reset after failure/execution
--- |
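One way to honour this comment is to wrap the property overrides in try/finally so they are restored even when a load or query fails. A sketch under two assumptions: the wrapper name is invented here, and "-1" is assumed to restore the unlimited default for the two LRU size properties.

    import org.apache.carbondata.core.constants.CarbonCommonConstants
    import org.apache.carbondata.core.util.CarbonProperties

    object WithSmallDictionaryCache {
      // Run `body` with tiny LRU limits and always restore the values afterwards,
      // even if the body throws.
      def apply[T](body: => T): T = {
        val props = CarbonProperties.getInstance()
        props.addProperty(CarbonCommonConstants.CARBON_MAX_DRIVER_LRU_CACHE_SIZE, "1")
        props.addProperty(CarbonCommonConstants.CARBON_MAX_EXECUTOR_LRU_CACHE_SIZE, "1")
        try {
          body
        } finally {
          // Assumption: "-1" means "no explicit limit" for these two properties.
          props.addProperty(CarbonCommonConstants.CARBON_MAX_DRIVER_LRU_CACHE_SIZE, "-1")
          props.addProperty(CarbonCommonConstants.CARBON_MAX_EXECUTOR_LRU_CACHE_SIZE, "-1")
        }
      }
    }

A test could then run its loads and queries inside WithSmallDictionaryCache { ... }, or the same restore logic can live in the suite's afterAll.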
Github user gvramana commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1686#discussion_r158285554

--- Diff: integration/spark2/src/test/scala/org/apache/carbondata/spark/util/DictionaryLRUCacheTestCase.scala ---
@@ -0,0 +1,424 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.carbondata.spark.util
+
+import java.io.File
+
+import scala.collection.JavaConverters._
+
+import org.apache.spark.sql.common.util.Spark2QueryTest
+import org.apache.spark.sql.hive.CarbonRelation
+import org.apache.spark.sql.test.Spark2TestQueryExecutor
+import org.apache.spark.sql.{CarbonEnv, SparkSession}
+import org.scalatest.BeforeAndAfterAll
+
+import org.apache.carbondata.core.cache.{Cache, CacheProvider, CacheType}
+import org.apache.carbondata.core.cache.dictionary.{Dictionary, DictionaryColumnUniqueIdentifier}
+import org.apache.carbondata.core.constants.CarbonCommonConstants
+import org.apache.carbondata.core.util.{CarbonProperties, CarbonUtil}
+
+/**
+ * Test Case for org.apache.carbondata.integration.spark.util.GlobalDictionaryUtil
+ */
+class DictionaryLRUCacheTestCase extends Spark2QueryTest with BeforeAndAfterAll {
+  var spark : SparkSession = null
+  var path : String = null
+
+  def createCarbonSession(appName: String): SparkSession = {
+    val rootPath = new File(this.getClass.getResource("/").getPath
+                            + "../../../..").getCanonicalPath
+    val storeLocation = s"$rootPath/examples/spark2/target/store"
+    val warehouse = s"$rootPath/examples/spark2/target/warehouse"
+
+    CarbonProperties.getInstance()
+      .addProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT, "yyyy/MM/dd HH:mm:ss")
+      .addProperty(CarbonCommonConstants.CARBON_DATE_FORMAT, "yyyy/MM/dd")
+      .addProperty(CarbonCommonConstants.ENABLE_UNSAFE_COLUMN_PAGE_LOADING, "true")
+      .addProperty(CarbonCommonConstants.CARBON_BADRECORDS_LOC, "")
+      .addProperty(CarbonCommonConstants.CARBON_MAX_DRIVER_LRU_CACHE_SIZE, "1")
+      .addProperty(CarbonCommonConstants.CARBON_MAX_EXECUTOR_LRU_CACHE_SIZE, "1")
+
+    import org.apache.spark.sql.CarbonSession._
+    spark = SparkSession
--- End diff --

Don't create a new session in the test case; it will cause Derby DB access problems.
--- |
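A sketch of the suggested shape: resolve the CarbonTable through the session the test framework already owns (Spark2TestQueryExecutor.spark, which the quoted checkDictionaryAccessCount already uses), instead of building a second CarbonSession and therefore a second Derby-backed metastore on the same path. The wrapper object and method name are illustrative; the lookup calls mirror the test code above.

    import org.apache.spark.sql.{CarbonEnv, SparkSession}
    import org.apache.spark.sql.hive.CarbonRelation
    import org.apache.spark.sql.test.Spark2TestQueryExecutor

    object SharedSessionLookup {
      // Reuse the harness-managed session; it already points at the test store and
      // metastore, so no new CarbonSession (and no extra Derby instance) is needed.
      def lookupCarbonTable(databaseName: String, tableName: String) = {
        val session: SparkSession = Spark2TestQueryExecutor.spark
        CarbonEnv.getInstance(session).carbonMetastore
          .lookupRelation(Option(databaseName), tableName)(session)
          .asInstanceOf[CarbonRelation].carbonTable
      }
    }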
Github user gvramana commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1686#discussion_r158286004

--- Diff: integration/spark2/src/test/scala/org/apache/carbondata/spark/util/DictionaryLRUCacheTestCase.scala ---
@@ -0,0 +1,424 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.carbondata.spark.util
+
+import java.io.File
+
+import scala.collection.JavaConverters._
+
+import org.apache.spark.sql.common.util.Spark2QueryTest
+import org.apache.spark.sql.hive.CarbonRelation
+import org.apache.spark.sql.test.Spark2TestQueryExecutor
+import org.apache.spark.sql.{CarbonEnv, SparkSession}
+import org.scalatest.BeforeAndAfterAll
+
+import org.apache.carbondata.core.cache.{Cache, CacheProvider, CacheType}
+import org.apache.carbondata.core.cache.dictionary.{Dictionary, DictionaryColumnUniqueIdentifier}
+import org.apache.carbondata.core.constants.CarbonCommonConstants
+import org.apache.carbondata.core.util.{CarbonProperties, CarbonUtil}
+
+/**
+ * Test Case for org.apache.carbondata.integration.spark.util.GlobalDictionaryUtil
+ */
+class DictionaryLRUCacheTestCase extends Spark2QueryTest with BeforeAndAfterAll {
+  var spark : SparkSession = null
+  var path : String = null
+
+  def createCarbonSession(appName: String): SparkSession = {
+    val rootPath = new File(this.getClass.getResource("/").getPath
+                            + "../../../..").getCanonicalPath
+    val storeLocation = s"$rootPath/examples/spark2/target/store"
+    val warehouse = s"$rootPath/examples/spark2/target/warehouse"
+
+    CarbonProperties.getInstance()
+      .addProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT, "yyyy/MM/dd HH:mm:ss")
+      .addProperty(CarbonCommonConstants.CARBON_DATE_FORMAT, "yyyy/MM/dd")
+      .addProperty(CarbonCommonConstants.ENABLE_UNSAFE_COLUMN_PAGE_LOADING, "true")
+      .addProperty(CarbonCommonConstants.CARBON_BADRECORDS_LOC, "")
+      .addProperty(CarbonCommonConstants.CARBON_MAX_DRIVER_LRU_CACHE_SIZE, "1")
+      .addProperty(CarbonCommonConstants.CARBON_MAX_EXECUTOR_LRU_CACHE_SIZE, "1")
+
+    import org.apache.spark.sql.CarbonSession._
+    spark = SparkSession
+      .builder()
+      .master("local")
+      .appName("CarbonSessionExample")
+      .config("spark.sql.warehouse.dir", warehouse)
+      .config("spark.driver.host", "localhost")
+      .getOrCreateCarbonSession(storeLocation)
+    spark.sparkContext.setLogLevel("WARN")
+    spark
+  }
+
+  def checkDictionaryAccessCount(databaseName: String, tableName: String): Unit = {
+    val carbonTable = CarbonEnv.getInstance(Spark2TestQueryExecutor.spark).carbonMetastore
+      .lookupRelation(Option(databaseName), tableName)(Spark2TestQueryExecutor.spark)
+      .asInstanceOf[CarbonRelation].carbonTable
+    val absoluteTableIdentifier = carbonTable.getAbsoluteTableIdentifier
+
+    val dimensions = carbonTable.getAllDimensions.asScala.toList
+    dimensions.foreach { dim =>
+      val columnIdentifier = dim.getColumnIdentifier
+      // Check the dictionary cache access.
+      val identifier: DictionaryColumnUniqueIdentifier = new DictionaryColumnUniqueIdentifier(
+        absoluteTableIdentifier,
+        columnIdentifier,
+        columnIdentifier.getDataType)
+
+      val isDictExists: Boolean = CarbonUtil.isFileExistsForGivenColumn(identifier)
+      var dictionary: Dictionary = null
+      if (isDictExists) {
--- End diff --

If the dictionary does not exist, assert.
--- |
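The suggestion, sketched as a small helper: fail the test when a column that is expected to have a dictionary has no dictionary file, rather than silently skipping it. The helper object, method name, and message are illustrative; the existence check is the same CarbonUtil call used in the test above.

    import org.apache.carbondata.core.cache.dictionary.DictionaryColumnUniqueIdentifier
    import org.apache.carbondata.core.util.CarbonUtil

    object DictionaryExistenceCheck {
      // Fail loudly when no dictionary file is found for a column that should have one.
      def assertDictionaryExists(columnName: String,
          identifier: DictionaryColumnUniqueIdentifier): Unit = {
        assert(CarbonUtil.isFileExistsForGivenColumn(identifier),
          s"No dictionary file found for column $columnName")
      }
    }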
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1686 Build Success with Spark 2.2.0, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/1003/ --- |
Github user sounakr commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1686#discussion_r158312303

--- Diff: examples/spark2/src/main/scala/org/apache/carbondata/examples/ExampleUtils.scala ---
@@ -43,6 +43,8 @@ object ExampleUtils {
       .addProperty(CarbonCommonConstants.CARBON_DATE_FORMAT, "yyyy/MM/dd")
       .addProperty(CarbonCommonConstants.ENABLE_UNSAFE_COLUMN_PAGE_LOADING, "true")
       .addProperty(CarbonCommonConstants.CARBON_BADRECORDS_LOC, "")
+      .addProperty(CarbonCommonConstants.CARBON_MAX_DRIVER_LRU_CACHE_SIZE, "1")
+      .addProperty(CarbonCommonConstants.CARBON_MAX_EXECUTOR_LRU_CACHE_SIZE, "1")
--- End diff --

Done
--- |
Github user sounakr commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1686#discussion_r158312327

--- Diff: integration/spark2/src/test/scala/org/apache/carbondata/spark/util/DictionaryLRUCacheTestCase.scala ---
@@ -0,0 +1,424 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.carbondata.spark.util
+
+import java.io.File
+
+import scala.collection.JavaConverters._
+
+import org.apache.spark.sql.common.util.Spark2QueryTest
+import org.apache.spark.sql.hive.CarbonRelation
+import org.apache.spark.sql.test.Spark2TestQueryExecutor
+import org.apache.spark.sql.{CarbonEnv, SparkSession}
+import org.scalatest.BeforeAndAfterAll
+
+import org.apache.carbondata.core.cache.{Cache, CacheProvider, CacheType}
+import org.apache.carbondata.core.cache.dictionary.{Dictionary, DictionaryColumnUniqueIdentifier}
+import org.apache.carbondata.core.constants.CarbonCommonConstants
+import org.apache.carbondata.core.util.{CarbonProperties, CarbonUtil}
+
+/**
+ * Test Case for org.apache.carbondata.integration.spark.util.GlobalDictionaryUtil
+ */
+class DictionaryLRUCacheTestCase extends Spark2QueryTest with BeforeAndAfterAll {
+  var spark : SparkSession = null
+  var path : String = null
+
+  def createCarbonSession(appName: String): SparkSession = {
+    val rootPath = new File(this.getClass.getResource("/").getPath
+                            + "../../../..").getCanonicalPath
+    val storeLocation = s"$rootPath/examples/spark2/target/store"
+    val warehouse = s"$rootPath/examples/spark2/target/warehouse"
+
+    CarbonProperties.getInstance()
+      .addProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT, "yyyy/MM/dd HH:mm:ss")
+      .addProperty(CarbonCommonConstants.CARBON_DATE_FORMAT, "yyyy/MM/dd")
+      .addProperty(CarbonCommonConstants.ENABLE_UNSAFE_COLUMN_PAGE_LOADING, "true")
+      .addProperty(CarbonCommonConstants.CARBON_BADRECORDS_LOC, "")
--- End diff --

Done
--- |
Github user sounakr commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1686#discussion_r158312372

--- Diff: integration/spark2/src/test/scala/org/apache/carbondata/spark/util/DictionaryLRUCacheTestCase.scala ---
@@ -0,0 +1,424 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.carbondata.spark.util
+
+import java.io.File
+
+import scala.collection.JavaConverters._
+
+import org.apache.spark.sql.common.util.Spark2QueryTest
+import org.apache.spark.sql.hive.CarbonRelation
+import org.apache.spark.sql.test.Spark2TestQueryExecutor
+import org.apache.spark.sql.{CarbonEnv, SparkSession}
+import org.scalatest.BeforeAndAfterAll
+
+import org.apache.carbondata.core.cache.{Cache, CacheProvider, CacheType}
+import org.apache.carbondata.core.cache.dictionary.{Dictionary, DictionaryColumnUniqueIdentifier}
+import org.apache.carbondata.core.constants.CarbonCommonConstants
+import org.apache.carbondata.core.util.{CarbonProperties, CarbonUtil}
+
+/**
+ * Test Case for org.apache.carbondata.integration.spark.util.GlobalDictionaryUtil
+ */
+class DictionaryLRUCacheTestCase extends Spark2QueryTest with BeforeAndAfterAll {
+  var spark : SparkSession = null
+  var path : String = null
+
+  def createCarbonSession(appName: String): SparkSession = {
+    val rootPath = new File(this.getClass.getResource("/").getPath
+                            + "../../../..").getCanonicalPath
+    val storeLocation = s"$rootPath/examples/spark2/target/store"
+    val warehouse = s"$rootPath/examples/spark2/target/warehouse"
+
+    CarbonProperties.getInstance()
+      .addProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT, "yyyy/MM/dd HH:mm:ss")
+      .addProperty(CarbonCommonConstants.CARBON_DATE_FORMAT, "yyyy/MM/dd")
+      .addProperty(CarbonCommonConstants.ENABLE_UNSAFE_COLUMN_PAGE_LOADING, "true")
+      .addProperty(CarbonCommonConstants.CARBON_BADRECORDS_LOC, "")
+      .addProperty(CarbonCommonConstants.CARBON_MAX_DRIVER_LRU_CACHE_SIZE, "1")
+      .addProperty(CarbonCommonConstants.CARBON_MAX_EXECUTOR_LRU_CACHE_SIZE, "1")
+
+    import org.apache.spark.sql.CarbonSession._
+    spark = SparkSession
--- End diff --

Done
--- |
Github user sounakr commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1686#discussion_r158312477

--- Diff: integration/spark2/src/test/scala/org/apache/carbondata/spark/util/DictionaryLRUCacheTestCase.scala ---
@@ -0,0 +1,424 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.carbondata.spark.util
+
+import java.io.File
+
+import scala.collection.JavaConverters._
+
+import org.apache.spark.sql.common.util.Spark2QueryTest
+import org.apache.spark.sql.hive.CarbonRelation
+import org.apache.spark.sql.test.Spark2TestQueryExecutor
+import org.apache.spark.sql.{CarbonEnv, SparkSession}
+import org.scalatest.BeforeAndAfterAll
+
+import org.apache.carbondata.core.cache.{Cache, CacheProvider, CacheType}
+import org.apache.carbondata.core.cache.dictionary.{Dictionary, DictionaryColumnUniqueIdentifier}
+import org.apache.carbondata.core.constants.CarbonCommonConstants
+import org.apache.carbondata.core.util.{CarbonProperties, CarbonUtil}
+
+/**
+ * Test Case for org.apache.carbondata.integration.spark.util.GlobalDictionaryUtil
+ */
+class DictionaryLRUCacheTestCase extends Spark2QueryTest with BeforeAndAfterAll {
+  var spark : SparkSession = null
+  var path : String = null
+
+  def createCarbonSession(appName: String): SparkSession = {
+    val rootPath = new File(this.getClass.getResource("/").getPath
+                            + "../../../..").getCanonicalPath
+    val storeLocation = s"$rootPath/examples/spark2/target/store"
+    val warehouse = s"$rootPath/examples/spark2/target/warehouse"
+
+    CarbonProperties.getInstance()
+      .addProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT, "yyyy/MM/dd HH:mm:ss")
+      .addProperty(CarbonCommonConstants.CARBON_DATE_FORMAT, "yyyy/MM/dd")
+      .addProperty(CarbonCommonConstants.ENABLE_UNSAFE_COLUMN_PAGE_LOADING, "true")
+      .addProperty(CarbonCommonConstants.CARBON_BADRECORDS_LOC, "")
+      .addProperty(CarbonCommonConstants.CARBON_MAX_DRIVER_LRU_CACHE_SIZE, "1")
+      .addProperty(CarbonCommonConstants.CARBON_MAX_EXECUTOR_LRU_CACHE_SIZE, "1")
+
+    import org.apache.spark.sql.CarbonSession._
+    spark = SparkSession
+      .builder()
+      .master("local")
+      .appName("CarbonSessionExample")
+      .config("spark.sql.warehouse.dir", warehouse)
+      .config("spark.driver.host", "localhost")
+      .getOrCreateCarbonSession(storeLocation)
+    spark.sparkContext.setLogLevel("WARN")
+    spark
+  }
+
+  def checkDictionaryAccessCount(databaseName: String, tableName: String): Unit = {
+    val carbonTable = CarbonEnv.getInstance(Spark2TestQueryExecutor.spark).carbonMetastore
+      .lookupRelation(Option(databaseName), tableName)(Spark2TestQueryExecutor.spark)
+      .asInstanceOf[CarbonRelation].carbonTable
+    val absoluteTableIdentifier = carbonTable.getAbsoluteTableIdentifier
+
+    val dimensions = carbonTable.getAllDimensions.asScala.toList
+    dimensions.foreach { dim =>
+      val columnIdentifier = dim.getColumnIdentifier
+      // Check the dictionary cache access.
+      val identifier: DictionaryColumnUniqueIdentifier = new DictionaryColumnUniqueIdentifier(
+        absoluteTableIdentifier,
+        columnIdentifier,
+        columnIdentifier.getDataType)
+
+      val isDictExists: Boolean = CarbonUtil.isFileExistsForGivenColumn(identifier)
+      var dictionary: Dictionary = null
+      if (isDictExists) {
--- End diff --

Not all of the columns are dictionary columns.
--- |
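The reply's point is that only some dimensions are dictionary-encoded, so an unconditional assert would fail on the rest. A sketch of filtering first, assuming the hasEncoding/Encoding API of the core column metadata (the test in this PR, per the thread, instead simply skips columns whose dictionary file does not exist):

    import scala.collection.JavaConverters._

    import org.apache.carbondata.core.metadata.encoder.Encoding
    import org.apache.carbondata.core.metadata.schema.table.CarbonTable

    object DictionaryColumns {
      // Keep only dimensions that carry a global dictionary; direct-dictionary columns
      // (e.g. date/timestamp) have no dictionary file and are excluded.
      def dictionaryDimensions(carbonTable: CarbonTable) =
        carbonTable.getAllDimensions.asScala.filter { dim =>
          dim.hasEncoding(Encoding.DICTIONARY) && !dim.hasEncoding(Encoding.DIRECT_DICTIONARY)
        }
    }

With such a filter in place, the existence assert suggested in the earlier comment could be applied to every remaining column without false failures.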
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1686 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2226/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1686 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2230/ --- |
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1686 SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/2485/ --- |
Github user sounakr commented on the issue:
https://github.com/apache/carbondata/pull/1686 Retest this please --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1686 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/2231/ --- |
Github user gvramana commented on the issue:
https://github.com/apache/carbondata/pull/1686 LGTM --- |