[GitHub] carbondata pull request #2300: [CARBONDATA-2459][DataMap] Add cache for bloo...

30 messages

[GitHub] carbondata pull request #2300: [CARBONDATA-2459][DataMap] Add cache for bloo...

qiuchenjian-2
GitHub user xuchuanyin opened a pull request:

    https://github.com/apache/carbondata/pull/2300

    [CARBONDATA-2459][DataMap] Add cache for bloom filter datamap

    Loading bloom filters from the bloomindex file is slow, so caching this procedure should noticeably improve query performance. In this commit, the cache is implemented with Guava's cache instead of the cache implemented by Carbon, because using the Carbon cache in carbon-bloom would introduce a cyclic dependency between the carbon-core and carbon-bloom modules.
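The core idea of the change — load on a cache miss, then serve subsequent lookups from memory — can be sketched with stdlib types alone. This is a hypothetical, minimal illustration of the load-through pattern that Guava's `LoadingCache` provides (without Guava's eviction, expiry, or statistics); the class and method names are illustrative, not CarbonData's.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Minimal load-through cache sketch: a miss invokes the loader once for
// that key; later lookups for the same key return the cached value.
public class LoadThroughCacheSketch {
  private final Map<String, String> cache = new ConcurrentHashMap<>();
  private final Function<String, String> loader;

  public LoadThroughCacheSketch(Function<String, String> loader) {
    this.loader = loader;
  }

  public String get(String key) {
    // computeIfAbsent only runs the loader when the key is absent
    return cache.computeIfAbsent(key, loader);
  }
}
```

Guava's `LoadingCache` adds size-based eviction, time-based expiry, removal listeners, and hit/miss statistics on top of this pattern, which is what the PR relies on.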
   
    Be sure to do all of the following checklist to help us incorporate
    your contribution quickly and easily:
   
     - [x] Any interfaces changed?
     `NO`
     - [x] Any backward compatibility impacted?
     `NO`
     - [x] Document update required?
    `NO`
     - [x] Testing done
            Please provide details on
            - Whether new unit test cases have been added or why no new tests are required?
    `No`
            - How it is tested? Please attach test report.
    `Tested in 3-node cluster.`
            - Is it a performance related change? Please attach the performance test report.
    `Query performance is enhanced`
            - Any additional information to help reviewers in testing this change.
           `NO`
     - [x] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.
    `NA`


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/xuchuanyin/carbondata 0511_cache_bloom_datamap

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/2300.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #2300
   
----
commit 0ecdcf909ac8d7fa2c59042130d554913124b9d1
Author: xuchuanyin <xuchuanyin@...>
Date:   2018-05-11T13:49:43Z

    Add cache for bloom filter datamap
   
    Loading bloom filters from the bloomindex file is slow, so caching this procedure should noticeably improve query performance. In this commit, the cache is implemented with Guava's cache instead of the cache implemented by Carbon, because using the Carbon cache in carbon-bloom would introduce a cyclic dependency between the carbon-core and carbon-bloom modules.

----


---

[GitHub] carbondata issue #2300: [CARBONDATA-2459][DataMap] Add cache for bloom filte...

qiuchenjian-2
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/2300
 
    Build Success with Spark 2.2.1, Please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4689/



---

[GitHub] carbondata issue #2300: [CARBONDATA-2459][DataMap] Add cache for bloom filte...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/2300
 
    Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5845/



---

[GitHub] carbondata issue #2300: [CARBONDATA-2459][DataMap] Add cache for bloom filte...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/2300
 
    SDV Build Success, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4891/



---

[GitHub] carbondata pull request #2300: [CARBONDATA-2459][DataMap] Add cache for bloo...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user jackylk commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2300#discussion_r187762575
 
    --- Diff: datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomDataMapCache.java ---
    @@ -0,0 +1,197 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.datamap.bloom;
    +
    +import java.io.DataInputStream;
    +import java.io.EOFException;
    +import java.io.File;
    +import java.io.IOException;
    +import java.io.ObjectInputStream;
    +import java.io.Serializable;
    +import java.util.ArrayList;
    +import java.util.List;
    +import java.util.Objects;
    +import java.util.concurrent.TimeUnit;
    +
    +import org.apache.carbondata.common.logging.LogService;
    +import org.apache.carbondata.common.logging.LogServiceFactory;
    +import org.apache.carbondata.core.datastore.impl.FileFactory;
    +import org.apache.carbondata.core.util.CarbonUtil;
    +
    +import com.google.common.cache.CacheBuilder;
    +import com.google.common.cache.CacheLoader;
    +import com.google.common.cache.CacheStats;
    +import com.google.common.cache.LoadingCache;
    +import com.google.common.cache.RemovalListener;
    +import com.google.common.cache.RemovalNotification;
    +
    +public class BloomDataMapCache implements Serializable {
    --- End diff --
   
    I think public is not required. And please add @InterfaceAudience
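The requested audience marker can be sketched as follows. This is a hedged, self-contained illustration of a Hadoop-style audience annotation applied to a package-private class, as the reviewer suggests; the annotation and class names below are stand-ins, not the actual CarbonData `@InterfaceAudience` type from the common module.

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

// Illustrative audience-marker annotation (stand-in for CarbonData's
// @InterfaceAudience): RUNTIME retention so tools can inspect it.
public class AudienceAnnotationDemo {
  @Retention(RetentionPolicy.RUNTIME)
  @interface InterfaceAudienceInternal { }

  // Package-private class carrying the marker, per the review comment:
  // no `public` modifier, intended audience declared explicitly.
  @InterfaceAudienceInternal
  static class BloomDataMapCacheStub { }
}
```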


---

[GitHub] carbondata pull request #2300: [CARBONDATA-2459][DataMap] Add cache for bloo...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user jackylk commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2300#discussion_r187762649
 
    --- Diff: datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomDataMapCache.java ---
    @@ -0,0 +1,197 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.datamap.bloom;
    +
    +import java.io.DataInputStream;
    +import java.io.EOFException;
    +import java.io.File;
    +import java.io.IOException;
    +import java.io.ObjectInputStream;
    +import java.io.Serializable;
    +import java.util.ArrayList;
    +import java.util.List;
    +import java.util.Objects;
    +import java.util.concurrent.TimeUnit;
    +
    +import org.apache.carbondata.common.logging.LogService;
    +import org.apache.carbondata.common.logging.LogServiceFactory;
    +import org.apache.carbondata.core.datastore.impl.FileFactory;
    +import org.apache.carbondata.core.util.CarbonUtil;
    +
    +import com.google.common.cache.CacheBuilder;
    +import com.google.common.cache.CacheLoader;
    +import com.google.common.cache.CacheStats;
    +import com.google.common.cache.LoadingCache;
    +import com.google.common.cache.RemovalListener;
    +import com.google.common.cache.RemovalNotification;
    +
    +public class BloomDataMapCache implements Serializable {
    +  private static final LogService LOGGER = LogServiceFactory.getLogService(
    +      BloomDataMapCache.class.getName());
    +  private static final long serialVersionUID = 20160822L;
    +  private static final long DEFAULT_CACHE_SIZE = 512 * 1024 * 1024;
    +  private static final int DEFAULT_CACHE_EXPIRED_HOURS = 2;
    +  private LoadingCache<CacheKey, List<BloomDMModel>> bloomDMCache = null;
    +
    +  private BloomDataMapCache() {
    +    RemovalListener<CacheKey, List<BloomDMModel>> listener =
    +        new RemovalListener<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public void onRemoval(RemovalNotification<CacheKey, List<BloomDMModel>> notification) {
    +        LOGGER.error(
    +            String.format("Remove bloom datamap entry %s from cache due to %s",
    +                notification.getKey(), notification.getCause()));
    +      }
    +    };
    +    CacheLoader<CacheKey, List<BloomDMModel>> cacheLoader =
    +        new CacheLoader<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public List<BloomDMModel> load(CacheKey key) throws Exception {
    +        LOGGER.error(String.format("Load bloom datamap entry %s to cache", key));
    +        return loadBloomDataMapModel(key);
    +      }
    +    };
    +
    +    this.bloomDMCache = CacheBuilder.newBuilder()
    +        .recordStats()
    +        .maximumSize(DEFAULT_CACHE_SIZE)
    +        .expireAfterAccess(DEFAULT_CACHE_EXPIRED_HOURS, TimeUnit.HOURS)
    +        .removalListener(listener)
    +        .build(cacheLoader);
    +  }
    +
    +  private static class SingletonHolder {
    +    public static final BloomDataMapCache INSTANCE = new BloomDataMapCache();
    +  }
    +
    +  /**
    +   * get instance
    +   */
    +  public static BloomDataMapCache getInstance() {
    +    return SingletonHolder.INSTANCE;
    +  }
    +
    +  /**
    +   * for resolve from serialized
    +   */
    +  protected Object readResolve() {
    +    return getInstance();
    +  }
    +
    +  /**
    +   * load datamap from bloomindex file
    +   */
    +  private List<BloomDMModel> loadBloomDataMapModel(CacheKey cacheKey) {
    +    DataInputStream dataInStream = null;
    +    ObjectInputStream objectInStream = null;
    +    List<BloomDMModel> bloomDMModels = new ArrayList<BloomDMModel>();
    +    try {
    +      String indexFile = getIndexFileFromCacheKey(cacheKey);
    +      dataInStream = FileFactory.getDataInputStream(indexFile, FileFactory.getFileType(indexFile));
    +      objectInStream = new ObjectInputStream(dataInStream);
    +      try {
    +        BloomDMModel model = null;
    +        while ((model = (BloomDMModel) objectInStream.readObject()) != null) {
    +          bloomDMModels.add(model);
    +        }
    +      } catch (EOFException e) {
    +        LOGGER.info(String.format("Read %d bloom indices from %s",
    +            bloomDMModels.size(), indexFile));
    +      }
    +      this.bloomDMCache.put(cacheKey, bloomDMModels);
    +      return bloomDMModels;
    +    } catch (ClassNotFoundException | IOException e) {
    +      LOGGER.error(e, "Error occurs while reading bloom index");
    +      throw new RuntimeException("Error occurs while reading bloom index", e);
    +    } finally {
    +      clear();
    +      CarbonUtil.closeStreams(objectInStream, dataInStream);
    +    }
    +  }
    +
    +  /**
    +   * get bloom index file name from cachekey
    +   */
    +  private String getIndexFileFromCacheKey(CacheKey cacheKey) {
    +    return cacheKey.shardPath.concat(File.separator).concat(cacheKey.indexColumn)
    +        .concat(BloomCoarseGrainDataMap.BLOOM_INDEX_SUFFIX);
    +  }
    +
    +  /**
    +   * get bloom datamap from cache
    +   */
    +  public List<BloomDMModel> getBloomDMModelByKey(CacheKey cacheKey) {
    +    return this.bloomDMCache.getUnchecked(cacheKey);
    +  }
    +
    +  /**
    +   * get cache status
    +   */
    +  private String getCacheStatus() {
    +    StringBuilder sb = new StringBuilder();
    +    CacheStats stats = this.bloomDMCache.stats();
    +    sb.append("hitCount: ").append(stats.hitCount()).append(System.lineSeparator())
    +        .append("hitRate: ").append(stats.hitCount()).append(System.lineSeparator())
    +        .append("loadCount: ").append(stats.loadCount()).append(System.lineSeparator())
    +        .append("averageLoadPenalty: ").append(stats.averageLoadPenalty())
    +        .append(System.lineSeparator())
    +        .append("evictionCount: ").append(stats.evictionCount());
    +    return sb.toString();
    +  }
    +
    +  /**
    +   * clear this cache
    +   */
    +  private void clear() {
    +    LOGGER.error(String.format("Current meta cache statistic: %s", getCacheStatus()));
    +    LOGGER.error("Trigger invalid all the cache for bloom datamap");
    +    this.bloomDMCache.invalidateAll();
    +  }
    +
    +  public static class CacheKey {
    --- End diff --
   
    public is not required


---

[GitHub] carbondata pull request #2300: [CARBONDATA-2459][DataMap] Add cache for bloo...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user jackylk commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2300#discussion_r187762650
 
    --- Diff: datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomDataMapCache.java ---
    @@ -0,0 +1,197 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.datamap.bloom;
    +
    +import java.io.DataInputStream;
    +import java.io.EOFException;
    +import java.io.File;
    +import java.io.IOException;
    +import java.io.ObjectInputStream;
    +import java.io.Serializable;
    +import java.util.ArrayList;
    +import java.util.List;
    +import java.util.Objects;
    +import java.util.concurrent.TimeUnit;
    +
    +import org.apache.carbondata.common.logging.LogService;
    +import org.apache.carbondata.common.logging.LogServiceFactory;
    +import org.apache.carbondata.core.datastore.impl.FileFactory;
    +import org.apache.carbondata.core.util.CarbonUtil;
    +
    +import com.google.common.cache.CacheBuilder;
    +import com.google.common.cache.CacheLoader;
    +import com.google.common.cache.CacheStats;
    +import com.google.common.cache.LoadingCache;
    +import com.google.common.cache.RemovalListener;
    +import com.google.common.cache.RemovalNotification;
    +
    +public class BloomDataMapCache implements Serializable {
    +  private static final LogService LOGGER = LogServiceFactory.getLogService(
    +      BloomDataMapCache.class.getName());
    +  private static final long serialVersionUID = 20160822L;
    +  private static final long DEFAULT_CACHE_SIZE = 512 * 1024 * 1024;
    +  private static final int DEFAULT_CACHE_EXPIRED_HOURS = 2;
    +  private LoadingCache<CacheKey, List<BloomDMModel>> bloomDMCache = null;
    +
    +  private BloomDataMapCache() {
    +    RemovalListener<CacheKey, List<BloomDMModel>> listener =
    +        new RemovalListener<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public void onRemoval(RemovalNotification<CacheKey, List<BloomDMModel>> notification) {
    +        LOGGER.error(
    +            String.format("Remove bloom datamap entry %s from cache due to %s",
    +                notification.getKey(), notification.getCause()));
    +      }
    +    };
    +    CacheLoader<CacheKey, List<BloomDMModel>> cacheLoader =
    +        new CacheLoader<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public List<BloomDMModel> load(CacheKey key) throws Exception {
    +        LOGGER.error(String.format("Load bloom datamap entry %s to cache", key));
    +        return loadBloomDataMapModel(key);
    +      }
    +    };
    +
    +    this.bloomDMCache = CacheBuilder.newBuilder()
    +        .recordStats()
    +        .maximumSize(DEFAULT_CACHE_SIZE)
    +        .expireAfterAccess(DEFAULT_CACHE_EXPIRED_HOURS, TimeUnit.HOURS)
    +        .removalListener(listener)
    +        .build(cacheLoader);
    +  }
    +
    +  private static class SingletonHolder {
    +    public static final BloomDataMapCache INSTANCE = new BloomDataMapCache();
    +  }
    +
    +  /**
    +   * get instance
    +   */
    +  public static BloomDataMapCache getInstance() {
    +    return SingletonHolder.INSTANCE;
    +  }
    +
    +  /**
    +   * for resolve from serialized
    +   */
    +  protected Object readResolve() {
    +    return getInstance();
    +  }
    +
    +  /**
    +   * load datamap from bloomindex file
    +   */
    +  private List<BloomDMModel> loadBloomDataMapModel(CacheKey cacheKey) {
    +    DataInputStream dataInStream = null;
    +    ObjectInputStream objectInStream = null;
    +    List<BloomDMModel> bloomDMModels = new ArrayList<BloomDMModel>();
    +    try {
    +      String indexFile = getIndexFileFromCacheKey(cacheKey);
    +      dataInStream = FileFactory.getDataInputStream(indexFile, FileFactory.getFileType(indexFile));
    +      objectInStream = new ObjectInputStream(dataInStream);
    +      try {
    +        BloomDMModel model = null;
    +        while ((model = (BloomDMModel) objectInStream.readObject()) != null) {
    +          bloomDMModels.add(model);
    +        }
    +      } catch (EOFException e) {
    +        LOGGER.info(String.format("Read %d bloom indices from %s",
    +            bloomDMModels.size(), indexFile));
    +      }
    +      this.bloomDMCache.put(cacheKey, bloomDMModels);
    +      return bloomDMModels;
    +    } catch (ClassNotFoundException | IOException e) {
    +      LOGGER.error(e, "Error occurs while reading bloom index");
    +      throw new RuntimeException("Error occurs while reading bloom index", e);
    +    } finally {
    +      clear();
    +      CarbonUtil.closeStreams(objectInStream, dataInStream);
    +    }
    +  }
    +
    +  /**
    +   * get bloom index file name from cachekey
    +   */
    +  private String getIndexFileFromCacheKey(CacheKey cacheKey) {
    +    return cacheKey.shardPath.concat(File.separator).concat(cacheKey.indexColumn)
    +        .concat(BloomCoarseGrainDataMap.BLOOM_INDEX_SUFFIX);
    +  }
    +
    +  /**
    +   * get bloom datamap from cache
    +   */
    +  public List<BloomDMModel> getBloomDMModelByKey(CacheKey cacheKey) {
    +    return this.bloomDMCache.getUnchecked(cacheKey);
    +  }
    +
    +  /**
    +   * get cache status
    +   */
    +  private String getCacheStatus() {
    +    StringBuilder sb = new StringBuilder();
    +    CacheStats stats = this.bloomDMCache.stats();
    +    sb.append("hitCount: ").append(stats.hitCount()).append(System.lineSeparator())
    +        .append("hitRate: ").append(stats.hitCount()).append(System.lineSeparator())
    +        .append("loadCount: ").append(stats.loadCount()).append(System.lineSeparator())
    +        .append("averageLoadPenalty: ").append(stats.averageLoadPenalty())
    +        .append(System.lineSeparator())
    +        .append("evictionCount: ").append(stats.evictionCount());
    +    return sb.toString();
    +  }
    +
    +  /**
    +   * clear this cache
    +   */
    +  private void clear() {
    +    LOGGER.error(String.format("Current meta cache statistic: %s", getCacheStatus()));
    +    LOGGER.error("Trigger invalid all the cache for bloom datamap");
    +    this.bloomDMCache.invalidateAll();
    +  }
    +
    +  public static class CacheKey {
    +    private String shardPath;
    +    private String indexColumn;
    +
    +    public CacheKey(String shardPath, String indexColumn) {
    --- End diff --
   
    public is not required


---

[GitHub] carbondata pull request #2300: [CARBONDATA-2459][DataMap] Add cache for bloo...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user jackylk commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2300#discussion_r187762679
 
    --- Diff: datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomDataMapCache.java ---
    @@ -0,0 +1,197 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.datamap.bloom;
    +
    +import java.io.DataInputStream;
    +import java.io.EOFException;
    +import java.io.File;
    +import java.io.IOException;
    +import java.io.ObjectInputStream;
    +import java.io.Serializable;
    +import java.util.ArrayList;
    +import java.util.List;
    +import java.util.Objects;
    +import java.util.concurrent.TimeUnit;
    +
    +import org.apache.carbondata.common.logging.LogService;
    +import org.apache.carbondata.common.logging.LogServiceFactory;
    +import org.apache.carbondata.core.datastore.impl.FileFactory;
    +import org.apache.carbondata.core.util.CarbonUtil;
    +
    +import com.google.common.cache.CacheBuilder;
    +import com.google.common.cache.CacheLoader;
    +import com.google.common.cache.CacheStats;
    +import com.google.common.cache.LoadingCache;
    +import com.google.common.cache.RemovalListener;
    +import com.google.common.cache.RemovalNotification;
    +
    +public class BloomDataMapCache implements Serializable {
    +  private static final LogService LOGGER = LogServiceFactory.getLogService(
    +      BloomDataMapCache.class.getName());
    +  private static final long serialVersionUID = 20160822L;
    +  private static final long DEFAULT_CACHE_SIZE = 512 * 1024 * 1024;
    +  private static final int DEFAULT_CACHE_EXPIRED_HOURS = 2;
    +  private LoadingCache<CacheKey, List<BloomDMModel>> bloomDMCache = null;
    +
    +  private BloomDataMapCache() {
    +    RemovalListener<CacheKey, List<BloomDMModel>> listener =
    +        new RemovalListener<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public void onRemoval(RemovalNotification<CacheKey, List<BloomDMModel>> notification) {
    +        LOGGER.error(
    +            String.format("Remove bloom datamap entry %s from cache due to %s",
    +                notification.getKey(), notification.getCause()));
    +      }
    +    };
    +    CacheLoader<CacheKey, List<BloomDMModel>> cacheLoader =
    +        new CacheLoader<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public List<BloomDMModel> load(CacheKey key) throws Exception {
    +        LOGGER.error(String.format("Load bloom datamap entry %s to cache", key));
    +        return loadBloomDataMapModel(key);
    +      }
    +    };
    +
    +    this.bloomDMCache = CacheBuilder.newBuilder()
    +        .recordStats()
    +        .maximumSize(DEFAULT_CACHE_SIZE)
    +        .expireAfterAccess(DEFAULT_CACHE_EXPIRED_HOURS, TimeUnit.HOURS)
    +        .removalListener(listener)
    +        .build(cacheLoader);
    +  }
    +
    +  private static class SingletonHolder {
    +    public static final BloomDataMapCache INSTANCE = new BloomDataMapCache();
    +  }
    +
    +  /**
    +   * get instance
    +   */
    +  public static BloomDataMapCache getInstance() {
    --- End diff --
   
    I think `public` is not required on any method in this class


---

[GitHub] carbondata pull request #2300: [CARBONDATA-2459][DataMap] Add cache for bloo...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user jackylk commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2300#discussion_r187762718
 
    --- Diff: datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomDataMapCache.java ---
    @@ -0,0 +1,197 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.datamap.bloom;
    +
    +import java.io.DataInputStream;
    +import java.io.EOFException;
    +import java.io.File;
    +import java.io.IOException;
    +import java.io.ObjectInputStream;
    +import java.io.Serializable;
    +import java.util.ArrayList;
    +import java.util.List;
    +import java.util.Objects;
    +import java.util.concurrent.TimeUnit;
    +
    +import org.apache.carbondata.common.logging.LogService;
    +import org.apache.carbondata.common.logging.LogServiceFactory;
    +import org.apache.carbondata.core.datastore.impl.FileFactory;
    +import org.apache.carbondata.core.util.CarbonUtil;
    +
    +import com.google.common.cache.CacheBuilder;
    +import com.google.common.cache.CacheLoader;
    +import com.google.common.cache.CacheStats;
    +import com.google.common.cache.LoadingCache;
    +import com.google.common.cache.RemovalListener;
    +import com.google.common.cache.RemovalNotification;
    +
    +public class BloomDataMapCache implements Serializable {
    +  private static final LogService LOGGER = LogServiceFactory.getLogService(
    +      BloomDataMapCache.class.getName());
    +  private static final long serialVersionUID = 20160822L;
    +  private static final long DEFAULT_CACHE_SIZE = 512 * 1024 * 1024;
    +  private static final int DEFAULT_CACHE_EXPIRED_HOURS = 2;
    +  private LoadingCache<CacheKey, List<BloomDMModel>> bloomDMCache = null;
    +
    +  private BloomDataMapCache() {
    +    RemovalListener<CacheKey, List<BloomDMModel>> listener =
    +        new RemovalListener<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public void onRemoval(RemovalNotification<CacheKey, List<BloomDMModel>> notification) {
    +        LOGGER.error(
    --- End diff --
   
    Is it suitable to do error logging? Why not use info?
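The reviewer's point — routine cache-removal notifications are expected events, not failures — can be sketched with `java.util.logging`. This is an illustrative stand-in, not CarbonData's `LogService` API; the class and method names are hypothetical.

```java
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

// Illustrative sketch: log an expected cache-removal event at INFO.
// Reserving ERROR/SEVERE for genuine failures keeps error logs actionable.
public class CacheLogLevelDemo {
  static final Logger LOGGER = Logger.getLogger(CacheLogLevelDemo.class.getName());

  static LogRecord logRemoval(String key, String cause) {
    LogRecord record = new LogRecord(Level.INFO,
        String.format("Remove bloom datamap entry %s from cache due to %s", key, cause));
    LOGGER.log(record);
    return record;
  }
}
```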


---

[GitHub] carbondata pull request #2300: [CARBONDATA-2459][DataMap] Add cache for bloo...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user jackylk commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2300#discussion_r187762731
 
    --- Diff: datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomDataMapCache.java ---
    @@ -0,0 +1,197 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.datamap.bloom;
    +
    +import java.io.DataInputStream;
    +import java.io.EOFException;
    +import java.io.File;
    +import java.io.IOException;
    +import java.io.ObjectInputStream;
    +import java.io.Serializable;
    +import java.util.ArrayList;
    +import java.util.List;
    +import java.util.Objects;
    +import java.util.concurrent.TimeUnit;
    +
    +import org.apache.carbondata.common.logging.LogService;
    +import org.apache.carbondata.common.logging.LogServiceFactory;
    +import org.apache.carbondata.core.datastore.impl.FileFactory;
    +import org.apache.carbondata.core.util.CarbonUtil;
    +
    +import com.google.common.cache.CacheBuilder;
    +import com.google.common.cache.CacheLoader;
    +import com.google.common.cache.CacheStats;
    +import com.google.common.cache.LoadingCache;
    +import com.google.common.cache.RemovalListener;
    +import com.google.common.cache.RemovalNotification;
    +
    +public class BloomDataMapCache implements Serializable {
    +  private static final LogService LOGGER = LogServiceFactory.getLogService(
    +      BloomDataMapCache.class.getName());
    +  private static final long serialVersionUID = 20160822L;
    +  private static final long DEFAULT_CACHE_SIZE = 512 * 1024 * 1024;
    +  private static final int DEFAULT_CACHE_EXPIRED_HOURS = 2;
    +  private LoadingCache<CacheKey, List<BloomDMModel>> bloomDMCache = null;
    +
    +  private BloomDataMapCache() {
    +    RemovalListener<CacheKey, List<BloomDMModel>> listener =
    +        new RemovalListener<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public void onRemoval(RemovalNotification<CacheKey, List<BloomDMModel>> notification) {
    +        LOGGER.error(
    +            String.format("Remove bloom datamap entry %s from cache due to %s",
    +                notification.getKey(), notification.getCause()));
    +      }
    +    };
    +    CacheLoader<CacheKey, List<BloomDMModel>> cacheLoader =
    +        new CacheLoader<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public List<BloomDMModel> load(CacheKey key) throws Exception {
    +        LOGGER.error(String.format("Load bloom datamap entry %s to cache", key));
    +        return loadBloomDataMapModel(key);
    +      }
    +    };
    +
    +    this.bloomDMCache = CacheBuilder.newBuilder()
    +        .recordStats()
    +        .maximumSize(DEFAULT_CACHE_SIZE)
    +        .expireAfterAccess(DEFAULT_CACHE_EXPIRED_HOURS, TimeUnit.HOURS)
    +        .removalListener(listener)
    +        .build(cacheLoader);
    +  }
    +
    +  private static class SingletonHolder {
    --- End diff --
   
    Why is this static class required for the singleton?


---

[GitHub] carbondata pull request #2300: [CARBONDATA-2459][DataMap] Add cache for bloo...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user jackylk commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2300#discussion_r187762828
 
    --- Diff: datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomDataMapCache.java ---
    @@ -0,0 +1,197 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.datamap.bloom;
    +
    +import java.io.DataInputStream;
    +import java.io.EOFException;
    +import java.io.File;
    +import java.io.IOException;
    +import java.io.ObjectInputStream;
    +import java.io.Serializable;
    +import java.util.ArrayList;
    +import java.util.List;
    +import java.util.Objects;
    +import java.util.concurrent.TimeUnit;
    +
    +import org.apache.carbondata.common.logging.LogService;
    +import org.apache.carbondata.common.logging.LogServiceFactory;
    +import org.apache.carbondata.core.datastore.impl.FileFactory;
    +import org.apache.carbondata.core.util.CarbonUtil;
    +
    +import com.google.common.cache.CacheBuilder;
    +import com.google.common.cache.CacheLoader;
    +import com.google.common.cache.CacheStats;
    +import com.google.common.cache.LoadingCache;
    +import com.google.common.cache.RemovalListener;
    +import com.google.common.cache.RemovalNotification;
    +
    +public class BloomDataMapCache implements Serializable {
    +  private static final LogService LOGGER = LogServiceFactory.getLogService(
    +      BloomDataMapCache.class.getName());
    +  private static final long serialVersionUID = 20160822L;
    +  private static final long DEFAULT_CACHE_SIZE = 512 * 1024 * 1024;
    +  private static final int DEFAULT_CACHE_EXPIRED_HOURS = 2;
    +  private LoadingCache<CacheKey, List<BloomDMModel>> bloomDMCache = null;
    +
    +  private BloomDataMapCache() {
    +    RemovalListener<CacheKey, List<BloomDMModel>> listener =
    +        new RemovalListener<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public void onRemoval(RemovalNotification<CacheKey, List<BloomDMModel>> notification) {
    +        LOGGER.error(
    +            String.format("Remove bloom datamap entry %s from cache due to %s",
    +                notification.getKey(), notification.getCause()));
    +      }
    +    };
    +    CacheLoader<CacheKey, List<BloomDMModel>> cacheLoader =
    +        new CacheLoader<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public List<BloomDMModel> load(CacheKey key) throws Exception {
    +        LOGGER.error(String.format("Load bloom datamap entry %s to cache", key));
    +        return loadBloomDataMapModel(key);
    +      }
    +    };
    +
    +    this.bloomDMCache = CacheBuilder.newBuilder()
    +        .recordStats()
    +        .maximumSize(DEFAULT_CACHE_SIZE)
    +        .expireAfterAccess(DEFAULT_CACHE_EXPIRED_HOURS, TimeUnit.HOURS)
    +        .removalListener(listener)
    +        .build(cacheLoader);
    +  }
    +
    +  private static class SingletonHolder {
    +    public static final BloomDataMapCache INSTANCE = new BloomDataMapCache();
    +  }
    +
    +  /**
    +   * get instance
    +   */
    +  public static BloomDataMapCache getInstance() {
    +    return SingletonHolder.INSTANCE;
    +  }
    +
    +  /**
    +   * for resolve from serialized
    +   */
    +  protected Object readResolve() {
    +    return getInstance();
    +  }
    +
    +  /**
    +   * load datamap from bloomindex file
    +   */
    +  private List<BloomDMModel> loadBloomDataMapModel(CacheKey cacheKey) {
    +    DataInputStream dataInStream = null;
    +    ObjectInputStream objectInStream = null;
    +    List<BloomDMModel> bloomDMModels = new ArrayList<BloomDMModel>();
    +    try {
    +      String indexFile = getIndexFileFromCacheKey(cacheKey);
    +      dataInStream = FileFactory.getDataInputStream(indexFile, FileFactory.getFileType(indexFile));
    +      objectInStream = new ObjectInputStream(dataInStream);
    +      try {
    +        BloomDMModel model = null;
    +        while ((model = (BloomDMModel) objectInStream.readObject()) != null) {
    +          bloomDMModels.add(model);
    +        }
    +      } catch (EOFException e) {
    +        LOGGER.info(String.format("Read %d bloom indices from %s",
    +            bloomDMModels.size(), indexFile));
    +      }
    +      this.bloomDMCache.put(cacheKey, bloomDMModels);
    +      return bloomDMModels;
    +    } catch (ClassNotFoundException | IOException e) {
    +      LOGGER.error(e, "Error occurs while reading bloom index");
    +      throw new RuntimeException("Error occurs while reading bloom index", e);
    +    } finally {
    +      clear();
    +      CarbonUtil.closeStreams(objectInStream, dataInStream);
    +    }
    +  }
    +
    +  /**
    +   * get bloom index file name from cachekey
    +   */
    +  private String getIndexFileFromCacheKey(CacheKey cacheKey) {
    +    return cacheKey.shardPath.concat(File.separator).concat(cacheKey.indexColumn)
    --- End diff --
   
    Can you make a utility function to get the file path? I think this utility should be used both here and in BloomDataMapWriter.initDataMapFile.
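
    The suggested utility could look like the sketch below. The class name `BloomIndexFileUtil` and the suffix value are assumptions for illustration; the real suffix constant lives in `BloomCoarseGrainDataMap.BLOOM_INDEX_SUFFIX` in the carbon-bloom module.

    ```java
    import java.io.File;

    public final class BloomIndexFileUtil {
        // Assumed value; the PR reads this from BloomCoarseGrainDataMap.BLOOM_INDEX_SUFFIX.
        static final String BLOOM_INDEX_SUFFIX = ".bloomindex";

        private BloomIndexFileUtil() { }

        // Builds "<shardPath>/<indexColumn>.bloomindex" so that the reader
        // (getIndexFileFromCacheKey) and the writer (initDataMapFile) agree
        // on a single path convention.
        public static String getBloomIndexFile(String shardPath, String indexColumn) {
            return shardPath + File.separator + indexColumn + BLOOM_INDEX_SUFFIX;
        }
    }
    ```

    Centralizing the path construction this way means a future change to the naming scheme only touches one place.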


---

[GitHub] carbondata pull request #2300: [CARBONDATA-2459][DataMap] Add cache for bloo...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user xuchuanyin commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2300#discussion_r187765672
 
    --- Diff: datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomDataMapCache.java ---
    @@ -0,0 +1,197 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.datamap.bloom;
    +
    +import java.io.DataInputStream;
    +import java.io.EOFException;
    +import java.io.File;
    +import java.io.IOException;
    +import java.io.ObjectInputStream;
    +import java.io.Serializable;
    +import java.util.ArrayList;
    +import java.util.List;
    +import java.util.Objects;
    +import java.util.concurrent.TimeUnit;
    +
    +import org.apache.carbondata.common.logging.LogService;
    +import org.apache.carbondata.common.logging.LogServiceFactory;
    +import org.apache.carbondata.core.datastore.impl.FileFactory;
    +import org.apache.carbondata.core.util.CarbonUtil;
    +
    +import com.google.common.cache.CacheBuilder;
    +import com.google.common.cache.CacheLoader;
    +import com.google.common.cache.CacheStats;
    +import com.google.common.cache.LoadingCache;
    +import com.google.common.cache.RemovalListener;
    +import com.google.common.cache.RemovalNotification;
    +
    +public class BloomDataMapCache implements Serializable {
    --- End diff --
   
    OK, will fix it


---

[GitHub] carbondata pull request #2300: [CARBONDATA-2459][DataMap] Add cache for bloo...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user xuchuanyin commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2300#discussion_r187765674
 
    --- Diff: datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomDataMapCache.java ---
    @@ -0,0 +1,197 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.datamap.bloom;
    +
    +import java.io.DataInputStream;
    +import java.io.EOFException;
    +import java.io.File;
    +import java.io.IOException;
    +import java.io.ObjectInputStream;
    +import java.io.Serializable;
    +import java.util.ArrayList;
    +import java.util.List;
    +import java.util.Objects;
    +import java.util.concurrent.TimeUnit;
    +
    +import org.apache.carbondata.common.logging.LogService;
    +import org.apache.carbondata.common.logging.LogServiceFactory;
    +import org.apache.carbondata.core.datastore.impl.FileFactory;
    +import org.apache.carbondata.core.util.CarbonUtil;
    +
    +import com.google.common.cache.CacheBuilder;
    +import com.google.common.cache.CacheLoader;
    +import com.google.common.cache.CacheStats;
    +import com.google.common.cache.LoadingCache;
    +import com.google.common.cache.RemovalListener;
    +import com.google.common.cache.RemovalNotification;
    +
    +public class BloomDataMapCache implements Serializable {
    +  private static final LogService LOGGER = LogServiceFactory.getLogService(
    +      BloomDataMapCache.class.getName());
    +  private static final long serialVersionUID = 20160822L;
    +  private static final long DEFAULT_CACHE_SIZE = 512 * 1024 * 1024;
    +  private static final int DEFAULT_CACHE_EXPIRED_HOURS = 2;
    +  private LoadingCache<CacheKey, List<BloomDMModel>> bloomDMCache = null;
    +
    +  private BloomDataMapCache() {
    +    RemovalListener<CacheKey, List<BloomDMModel>> listener =
    +        new RemovalListener<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public void onRemoval(RemovalNotification<CacheKey, List<BloomDMModel>> notification) {
    +        LOGGER.error(
    +            String.format("Remove bloom datamap entry %s from cache due to %s",
    +                notification.getKey(), notification.getCause()));
    +      }
    +    };
    +    CacheLoader<CacheKey, List<BloomDMModel>> cacheLoader =
    +        new CacheLoader<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public List<BloomDMModel> load(CacheKey key) throws Exception {
    +        LOGGER.error(String.format("Load bloom datamap entry %s to cache", key));
    +        return loadBloomDataMapModel(key);
    +      }
    +    };
    +
    +    this.bloomDMCache = CacheBuilder.newBuilder()
    +        .recordStats()
    +        .maximumSize(DEFAULT_CACHE_SIZE)
    +        .expireAfterAccess(DEFAULT_CACHE_EXPIRED_HOURS, TimeUnit.HOURS)
    +        .removalListener(listener)
    +        .build(cacheLoader);
    +  }
    +
    +  private static class SingletonHolder {
    +    public static final BloomDataMapCache INSTANCE = new BloomDataMapCache();
    +  }
    +
    +  /**
    +   * get instance
    +   */
    +  public static BloomDataMapCache getInstance() {
    --- End diff --
   
    yes, will fix it


---

[GitHub] carbondata pull request #2300: [CARBONDATA-2459][DataMap] Add cache for bloo...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user xuchuanyin commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2300#discussion_r187765690
 
    --- Diff: datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomDataMapCache.java ---
    @@ -0,0 +1,197 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.datamap.bloom;
    +
    +import java.io.DataInputStream;
    +import java.io.EOFException;
    +import java.io.File;
    +import java.io.IOException;
    +import java.io.ObjectInputStream;
    +import java.io.Serializable;
    +import java.util.ArrayList;
    +import java.util.List;
    +import java.util.Objects;
    +import java.util.concurrent.TimeUnit;
    +
    +import org.apache.carbondata.common.logging.LogService;
    +import org.apache.carbondata.common.logging.LogServiceFactory;
    +import org.apache.carbondata.core.datastore.impl.FileFactory;
    +import org.apache.carbondata.core.util.CarbonUtil;
    +
    +import com.google.common.cache.CacheBuilder;
    +import com.google.common.cache.CacheLoader;
    +import com.google.common.cache.CacheStats;
    +import com.google.common.cache.LoadingCache;
    +import com.google.common.cache.RemovalListener;
    +import com.google.common.cache.RemovalNotification;
    +
    +public class BloomDataMapCache implements Serializable {
    +  private static final LogService LOGGER = LogServiceFactory.getLogService(
    +      BloomDataMapCache.class.getName());
    +  private static final long serialVersionUID = 20160822L;
    +  private static final long DEFAULT_CACHE_SIZE = 512 * 1024 * 1024;
    +  private static final int DEFAULT_CACHE_EXPIRED_HOURS = 2;
    +  private LoadingCache<CacheKey, List<BloomDMModel>> bloomDMCache = null;
    +
    +  private BloomDataMapCache() {
    +    RemovalListener<CacheKey, List<BloomDMModel>> listener =
    +        new RemovalListener<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public void onRemoval(RemovalNotification<CacheKey, List<BloomDMModel>> notification) {
    +        LOGGER.error(
    --- End diff --
   
    Oh, that was just for debugging; will fix it.


---

[GitHub] carbondata pull request #2300: [CARBONDATA-2459][DataMap] Add cache for bloo...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user xuchuanyin commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2300#discussion_r187765775
 
    --- Diff: datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomDataMapCache.java ---
    @@ -0,0 +1,197 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.datamap.bloom;
    +
    +import java.io.DataInputStream;
    +import java.io.EOFException;
    +import java.io.File;
    +import java.io.IOException;
    +import java.io.ObjectInputStream;
    +import java.io.Serializable;
    +import java.util.ArrayList;
    +import java.util.List;
    +import java.util.Objects;
    +import java.util.concurrent.TimeUnit;
    +
    +import org.apache.carbondata.common.logging.LogService;
    +import org.apache.carbondata.common.logging.LogServiceFactory;
    +import org.apache.carbondata.core.datastore.impl.FileFactory;
    +import org.apache.carbondata.core.util.CarbonUtil;
    +
    +import com.google.common.cache.CacheBuilder;
    +import com.google.common.cache.CacheLoader;
    +import com.google.common.cache.CacheStats;
    +import com.google.common.cache.LoadingCache;
    +import com.google.common.cache.RemovalListener;
    +import com.google.common.cache.RemovalNotification;
    +
    +public class BloomDataMapCache implements Serializable {
    +  private static final LogService LOGGER = LogServiceFactory.getLogService(
    +      BloomDataMapCache.class.getName());
    +  private static final long serialVersionUID = 20160822L;
    +  private static final long DEFAULT_CACHE_SIZE = 512 * 1024 * 1024;
    +  private static final int DEFAULT_CACHE_EXPIRED_HOURS = 2;
    +  private LoadingCache<CacheKey, List<BloomDMModel>> bloomDMCache = null;
    +
    +  private BloomDataMapCache() {
    +    RemovalListener<CacheKey, List<BloomDMModel>> listener =
    +        new RemovalListener<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public void onRemoval(RemovalNotification<CacheKey, List<BloomDMModel>> notification) {
    +        LOGGER.error(
    +            String.format("Remove bloom datamap entry %s from cache due to %s",
    +                notification.getKey(), notification.getCause()));
    +      }
    +    };
    +    CacheLoader<CacheKey, List<BloomDMModel>> cacheLoader =
    +        new CacheLoader<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public List<BloomDMModel> load(CacheKey key) throws Exception {
    +        LOGGER.error(String.format("Load bloom datamap entry %s to cache", key));
    +        return loadBloomDataMapModel(key);
    +      }
    +    };
    +
    +    this.bloomDMCache = CacheBuilder.newBuilder()
    +        .recordStats()
    +        .maximumSize(DEFAULT_CACHE_SIZE)
    +        .expireAfterAccess(DEFAULT_CACHE_EXPIRED_HOURS, TimeUnit.HOURS)
    +        .removalListener(listener)
    +        .build(cacheLoader);
    +  }
    +
    +  private static class SingletonHolder {
    --- End diff --
   
    It uses the static-inner-class idiom to implement a singleton that is both lazily initialized and thread-safe.
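
    For reference, a minimal standalone sketch of the initialization-on-demand holder idiom described above (class names here are illustrative, not the PR's):

    ```java
    import java.io.Serializable;

    public class HolderSingleton implements Serializable {
        private static final long serialVersionUID = 1L;

        private HolderSingleton() { }

        // The JVM initializes SingletonHolder only on first access, and class
        // initialization is guaranteed to happen exactly once, so INSTANCE is
        // created lazily and thread-safely without explicit synchronization.
        private static class SingletonHolder {
            static final HolderSingleton INSTANCE = new HolderSingleton();
        }

        public static HolderSingleton getInstance() {
            return SingletonHolder.INSTANCE;
        }

        // Without this, deserialization would produce a second instance;
        // readResolve redirects it back to the singleton, which is why the
        // PR's BloomDataMapCache also defines it.
        protected Object readResolve() {
            return getInstance();
        }
    }
    ```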


---

[GitHub] carbondata pull request #2300: [CARBONDATA-2459][DataMap] Add cache for bloo...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user xuchuanyin commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2300#discussion_r187765856
 
    --- Diff: datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomDataMapCache.java ---
    @@ -0,0 +1,197 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.datamap.bloom;
    +
    +import java.io.DataInputStream;
    +import java.io.EOFException;
    +import java.io.File;
    +import java.io.IOException;
    +import java.io.ObjectInputStream;
    +import java.io.Serializable;
    +import java.util.ArrayList;
    +import java.util.List;
    +import java.util.Objects;
    +import java.util.concurrent.TimeUnit;
    +
    +import org.apache.carbondata.common.logging.LogService;
    +import org.apache.carbondata.common.logging.LogServiceFactory;
    +import org.apache.carbondata.core.datastore.impl.FileFactory;
    +import org.apache.carbondata.core.util.CarbonUtil;
    +
    +import com.google.common.cache.CacheBuilder;
    +import com.google.common.cache.CacheLoader;
    +import com.google.common.cache.CacheStats;
    +import com.google.common.cache.LoadingCache;
    +import com.google.common.cache.RemovalListener;
    +import com.google.common.cache.RemovalNotification;
    +
    +public class BloomDataMapCache implements Serializable {
    +  private static final LogService LOGGER = LogServiceFactory.getLogService(
    +      BloomDataMapCache.class.getName());
    +  private static final long serialVersionUID = 20160822L;
    +  private static final long DEFAULT_CACHE_SIZE = 512 * 1024 * 1024;
    +  private static final int DEFAULT_CACHE_EXPIRED_HOURS = 2;
    +  private LoadingCache<CacheKey, List<BloomDMModel>> bloomDMCache = null;
    +
    +  private BloomDataMapCache() {
    +    RemovalListener<CacheKey, List<BloomDMModel>> listener =
    +        new RemovalListener<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public void onRemoval(RemovalNotification<CacheKey, List<BloomDMModel>> notification) {
    +        LOGGER.error(
    +            String.format("Remove bloom datamap entry %s from cache due to %s",
    +                notification.getKey(), notification.getCause()));
    +      }
    +    };
    +    CacheLoader<CacheKey, List<BloomDMModel>> cacheLoader =
    +        new CacheLoader<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public List<BloomDMModel> load(CacheKey key) throws Exception {
    +        LOGGER.error(String.format("Load bloom datamap entry %s to cache", key));
    +        return loadBloomDataMapModel(key);
    +      }
    +    };
    +
    +    this.bloomDMCache = CacheBuilder.newBuilder()
    +        .recordStats()
    +        .maximumSize(DEFAULT_CACHE_SIZE)
    +        .expireAfterAccess(DEFAULT_CACHE_EXPIRED_HOURS, TimeUnit.HOURS)
    +        .removalListener(listener)
    +        .build(cacheLoader);
    +  }
    +
    +  private static class SingletonHolder {
    +    public static final BloomDataMapCache INSTANCE = new BloomDataMapCache();
    +  }
    +
    +  /**
    +   * get instance
    +   */
    +  public static BloomDataMapCache getInstance() {
    +    return SingletonHolder.INSTANCE;
    +  }
    +
    +  /**
    +   * for resolve from serialized
    +   */
    +  protected Object readResolve() {
    +    return getInstance();
    +  }
    +
    +  /**
    +   * load datamap from bloomindex file
    +   */
    +  private List<BloomDMModel> loadBloomDataMapModel(CacheKey cacheKey) {
    +    DataInputStream dataInStream = null;
    +    ObjectInputStream objectInStream = null;
    +    List<BloomDMModel> bloomDMModels = new ArrayList<BloomDMModel>();
    +    try {
    +      String indexFile = getIndexFileFromCacheKey(cacheKey);
    +      dataInStream = FileFactory.getDataInputStream(indexFile, FileFactory.getFileType(indexFile));
    +      objectInStream = new ObjectInputStream(dataInStream);
    +      try {
    +        BloomDMModel model = null;
    +        while ((model = (BloomDMModel) objectInStream.readObject()) != null) {
    +          bloomDMModels.add(model);
    +        }
    +      } catch (EOFException e) {
    +        LOGGER.info(String.format("Read %d bloom indices from %s",
    +            bloomDMModels.size(), indexFile));
    +      }
    +      this.bloomDMCache.put(cacheKey, bloomDMModels);
    +      return bloomDMModels;
    +    } catch (ClassNotFoundException | IOException e) {
    +      LOGGER.error(e, "Error occurs while reading bloom index");
    +      throw new RuntimeException("Error occurs while reading bloom index", e);
    +    } finally {
    +      clear();
    +      CarbonUtil.closeStreams(objectInStream, dataInStream);
    +    }
    +  }
    +
    +  /**
    +   * get bloom index file name from cachekey
    +   */
    +  private String getIndexFileFromCacheKey(CacheKey cacheKey) {
    +    return cacheKey.shardPath.concat(File.separator).concat(cacheKey.indexColumn)
    +        .concat(BloomCoarseGrainDataMap.BLOOM_INDEX_SUFFIX);
    +  }
    +
    +  /**
    +   * get bloom datamap from cache
    +   */
    +  public List<BloomDMModel> getBloomDMModelByKey(CacheKey cacheKey) {
    +    return this.bloomDMCache.getUnchecked(cacheKey);
    +  }
    +
    +  /**
    +   * get cache status
    +   */
    +  private String getCacheStatus() {
    +    StringBuilder sb = new StringBuilder();
    +    CacheStats stats = this.bloomDMCache.stats();
    +    sb.append("hitCount: ").append(stats.hitCount()).append(System.lineSeparator())
    +        .append("hitRate: ").append(stats.hitCount()).append(System.lineSeparator())
    +        .append("loadCount: ").append(stats.loadCount()).append(System.lineSeparator())
    +        .append("averageLoadPenalty: ").append(stats.averageLoadPenalty())
    +        .append(System.lineSeparator())
    +        .append("evictionCount: ").append(stats.evictionCount());
    +    return sb.toString();
    +  }
    +
    +  /**
    +   * clear this cache
    +   */
    +  private void clear() {
    +    LOGGER.error(String.format("Current meta cache statistic: %s", getCacheStatus()));
    +    LOGGER.error("Trigger invalid all the cache for bloom datamap");
    +    this.bloomDMCache.invalidateAll();
    +  }
    +
    +  public static class CacheKey {
    +    private String shardPath;
    +    private String indexColumn;
    +
    +    public CacheKey(String shardPath, String indexColumn) {
    --- End diff --
   
    OK, will fix it and the other similar ones.
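
    Since CacheKey is used as a Guava cache key, it needs consistent equals/hashCode implementations for lookups to hit. A hedged sketch of what the fixed class might look like (the PR's actual implementation may differ):

    ```java
    import java.util.Objects;

    public class CacheKey {
        private final String shardPath;
        private final String indexColumn;

        public CacheKey(String shardPath, String indexColumn) {
            // Fail fast on null components rather than during a cache lookup.
            this.shardPath = Objects.requireNonNull(shardPath);
            this.indexColumn = Objects.requireNonNull(indexColumn);
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) {
                return true;
            }
            if (!(o instanceof CacheKey)) {
                return false;
            }
            CacheKey other = (CacheKey) o;
            return shardPath.equals(other.shardPath)
                && indexColumn.equals(other.indexColumn);
        }

        @Override
        public int hashCode() {
            return Objects.hash(shardPath, indexColumn);
        }
    }
    ```

    Two keys built from the same shard path and index column then map to the same cache entry, which is what the LoadingCache relies on.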


---

[GitHub] carbondata pull request #2300: [CARBONDATA-2459][DataMap] Add cache for bloo...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user xuchuanyin commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2300#discussion_r187766033
 
    --- Diff: datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomDataMapCache.java ---
    @@ -0,0 +1,197 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.datamap.bloom;
    +
    +import java.io.DataInputStream;
    +import java.io.EOFException;
    +import java.io.File;
    +import java.io.IOException;
    +import java.io.ObjectInputStream;
    +import java.io.Serializable;
    +import java.util.ArrayList;
    +import java.util.List;
    +import java.util.Objects;
    +import java.util.concurrent.TimeUnit;
    +
    +import org.apache.carbondata.common.logging.LogService;
    +import org.apache.carbondata.common.logging.LogServiceFactory;
    +import org.apache.carbondata.core.datastore.impl.FileFactory;
    +import org.apache.carbondata.core.util.CarbonUtil;
    +
    +import com.google.common.cache.CacheBuilder;
    +import com.google.common.cache.CacheLoader;
    +import com.google.common.cache.CacheStats;
    +import com.google.common.cache.LoadingCache;
    +import com.google.common.cache.RemovalListener;
    +import com.google.common.cache.RemovalNotification;
    +
    +public class BloomDataMapCache implements Serializable {
    +  private static final LogService LOGGER = LogServiceFactory.getLogService(
    +      BloomDataMapCache.class.getName());
    +  private static final long serialVersionUID = 20160822L;
    +  private static final long DEFAULT_CACHE_SIZE = 512 * 1024 * 1024;
    +  private static final int DEFAULT_CACHE_EXPIRED_HOURS = 2;
    +  private LoadingCache<CacheKey, List<BloomDMModel>> bloomDMCache = null;
    +
    +  private BloomDataMapCache() {
    +    RemovalListener<CacheKey, List<BloomDMModel>> listener =
    +        new RemovalListener<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public void onRemoval(RemovalNotification<CacheKey, List<BloomDMModel>> notification) {
    +        LOGGER.error(
    +            String.format("Remove bloom datamap entry %s from cache due to %s",
    +                notification.getKey(), notification.getCause()));
    +      }
    +    };
    +    CacheLoader<CacheKey, List<BloomDMModel>> cacheLoader =
    +        new CacheLoader<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public List<BloomDMModel> load(CacheKey key) throws Exception {
    +        LOGGER.error(String.format("Load bloom datamap entry %s to cache", key));
    +        return loadBloomDataMapModel(key);
    +      }
    +    };
    +
    +    this.bloomDMCache = CacheBuilder.newBuilder()
    +        .recordStats()
    +        .maximumSize(DEFAULT_CACHE_SIZE)
    +        .expireAfterAccess(DEFAULT_CACHE_EXPIRED_HOURS, TimeUnit.HOURS)
    +        .removalListener(listener)
    +        .build(cacheLoader);
    +  }
    +
    +  private static class SingletonHolder {
    +    public static final BloomDataMapCache INSTANCE = new BloomDataMapCache();
    +  }
    +
    +  /**
    +   * get instance
    +   */
    +  public static BloomDataMapCache getInstance() {
    +    return SingletonHolder.INSTANCE;
    +  }
    +
    +  /**
    +   * for resolve from serialized
    +   */
    +  protected Object readResolve() {
    +    return getInstance();
    +  }
    +
    +  /**
    +   * load datamap from bloomindex file
    +   */
    +  private List<BloomDMModel> loadBloomDataMapModel(CacheKey cacheKey) {
    +    DataInputStream dataInStream = null;
    +    ObjectInputStream objectInStream = null;
    +    List<BloomDMModel> bloomDMModels = new ArrayList<BloomDMModel>();
    +    try {
    +      String indexFile = getIndexFileFromCacheKey(cacheKey);
    +      dataInStream = FileFactory.getDataInputStream(indexFile, FileFactory.getFileType(indexFile));
    +      objectInStream = new ObjectInputStream(dataInStream);
    +      try {
    +        BloomDMModel model = null;
    +        while ((model = (BloomDMModel) objectInStream.readObject()) != null) {
    +          bloomDMModels.add(model);
    +        }
    +      } catch (EOFException e) {
    +        LOGGER.info(String.format("Read %d bloom indices from %s",
    +            bloomDMModels.size(), indexFile));
    +      }
    +      this.bloomDMCache.put(cacheKey, bloomDMModels);
    +      return bloomDMModels;
    +    } catch (ClassNotFoundException | IOException e) {
    +      LOGGER.error(e, "Error occurs while reading bloom index");
    +      throw new RuntimeException("Error occurs while reading bloom index", e);
    +    } finally {
    +      clear();
    +      CarbonUtil.closeStreams(objectInStream, dataInStream);
    +    }
    +  }
    +
    +  /**
    +   * get bloom index file name from cachekey
    +   */
    +  private String getIndexFileFromCacheKey(CacheKey cacheKey) {
    +    return cacheKey.shardPath.concat(File.separator).concat(cacheKey.indexColumn)
    +        .concat(BloomCoarseGrainDataMap.BLOOM_INDEX_SUFFIX);
    +  }
    +
    +  /**
    +   * get bloom datamap from cache
    +   */
    +  public List<BloomDMModel> getBloomDMModelByKey(CacheKey cacheKey) {
    +    return this.bloomDMCache.getUnchecked(cacheKey);
    +  }
    +
    +  /**
    +   * get cache status
    +   */
    +  private String getCacheStatus() {
    +    StringBuilder sb = new StringBuilder();
    +    CacheStats stats = this.bloomDMCache.stats();
    +    sb.append("hitCount: ").append(stats.hitCount()).append(System.lineSeparator())
    +        .append("hitRate: ").append(stats.hitCount()).append(System.lineSeparator())
    +        .append("loadCount: ").append(stats.loadCount()).append(System.lineSeparator())
    +        .append("averageLoadPenalty: ").append(stats.averageLoadPenalty())
    +        .append(System.lineSeparator())
    +        .append("evictionCount: ").append(stats.evictionCount());
    +    return sb.toString();
    +  }
    +
    +  /**
    +   * clear this cache
    +   */
    +  private void clear() {
    +    LOGGER.error(String.format("Current meta cache statistic: %s", getCacheStatus()));
    +    LOGGER.error("Trigger invalid all the cache for bloom datamap");
    +    this.bloomDMCache.invalidateAll();
    +  }
    +
    +  public static class CacheKey {
    --- End diff --
   
    No, in the BloomDataMap class we need to create a CacheKey instance to get the value from the cache.
    However, I found another place similar to this and fixed it.
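
    The diff above is truncated right at the `CacheKey` declaration. Since Guava's `LoadingCache` looks entries up by key equality, such a key class must override `equals()` and `hashCode()`. A minimal sketch of what it might look like follows; the field names are taken from `getIndexFileFromCacheKey()` in the diff, everything else is a guess, and it is written as a top-level class here only so it compiles standalone:

    ```java
    import java.util.Objects;

    // Hypothetical completion of the truncated CacheKey from the diff above.
    // equals()/hashCode() are mandatory: Guava's LoadingCache uses them to
    // decide whether a bloom index file has already been loaded.
    class CacheKey {
      private final String shardPath;
      private final String indexColumn;

      CacheKey(String shardPath, String indexColumn) {
        this.shardPath = shardPath;
        this.indexColumn = indexColumn;
      }

      @Override
      public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof CacheKey)) return false;
        CacheKey that = (CacheKey) o;
        return Objects.equals(shardPath, that.shardPath)
            && Objects.equals(indexColumn, that.indexColumn);
      }

      @Override
      public int hashCode() {
        return Objects.hash(shardPath, indexColumn);
      }

      @Override
      public String toString() {
        return "CacheKey{shardPath='" + shardPath
            + "', indexColumn='" + indexColumn + "'}";
      }
    }
    ```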


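    A side note on `loadBloomDataMapModel()` in the diff: `ObjectInputStream` has no end-of-stream test, so the loop terminates via `EOFException` rather than via the `!= null` check. A self-contained sketch of that read-until-EOF pattern (Strings stand in for `BloomDMModel`, and the helper names are illustrative, not from the PR):

    ```java
    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.EOFException;
    import java.io.IOException;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.util.ArrayList;
    import java.util.List;

    // Demonstrates the termination pattern used in loadBloomDataMapModel():
    // keep calling readObject() and treat EOFException as normal end of data.
    class ReadUntilEof {
      static List<Object> readAll(byte[] bytes) {
        List<Object> models = new ArrayList<>();
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
          while (true) {
            models.add(in.readObject());
          }
        } catch (EOFException e) {
          // Normal termination: the whole stream has been consumed.
        } catch (IOException | ClassNotFoundException e) {
          throw new RuntimeException("Error occurs while reading objects", e);
        }
        return models;
      }

      static byte[] writeAll(Object... objs) {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
          for (Object o : objs) {
            out.writeObject(o);
          }
        } catch (IOException e) {
          throw new RuntimeException("Error occurs while writing objects", e);
        }
        return bos.toByteArray();
      }
    }
    ```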
---

[GitHub] carbondata pull request #2300: [CARBONDATA-2459][DataMap] Add cache for bloo...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user xuchuanyin commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2300#discussion_r187766217
 
    --- Diff: datamap/bloom/src/main/java/org/apache/carbondata/datamap/bloom/BloomDataMapCache.java ---
    @@ -0,0 +1,197 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.datamap.bloom;
    +
    +import java.io.DataInputStream;
    +import java.io.EOFException;
    +import java.io.File;
    +import java.io.IOException;
    +import java.io.ObjectInputStream;
    +import java.io.Serializable;
    +import java.util.ArrayList;
    +import java.util.List;
    +import java.util.Objects;
    +import java.util.concurrent.TimeUnit;
    +
    +import org.apache.carbondata.common.logging.LogService;
    +import org.apache.carbondata.common.logging.LogServiceFactory;
    +import org.apache.carbondata.core.datastore.impl.FileFactory;
    +import org.apache.carbondata.core.util.CarbonUtil;
    +
    +import com.google.common.cache.CacheBuilder;
    +import com.google.common.cache.CacheLoader;
    +import com.google.common.cache.CacheStats;
    +import com.google.common.cache.LoadingCache;
    +import com.google.common.cache.RemovalListener;
    +import com.google.common.cache.RemovalNotification;
    +
    +public class BloomDataMapCache implements Serializable {
    +  private static final LogService LOGGER = LogServiceFactory.getLogService(
    +      BloomDataMapCache.class.getName());
    +  private static final long serialVersionUID = 20160822L;
    +  private static final long DEFAULT_CACHE_SIZE = 512 * 1024 * 1024;
    +  private static final int DEFAULT_CACHE_EXPIRED_HOURS = 2;
    +  private LoadingCache<CacheKey, List<BloomDMModel>> bloomDMCache = null;
    +
    +  private BloomDataMapCache() {
    +    RemovalListener<CacheKey, List<BloomDMModel>> listener =
    +        new RemovalListener<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public void onRemoval(RemovalNotification<CacheKey, List<BloomDMModel>> notification) {
    +        LOGGER.error(
    +            String.format("Remove bloom datamap entry %s from cache due to %s",
    +                notification.getKey(), notification.getCause()));
    +      }
    +    };
    +    CacheLoader<CacheKey, List<BloomDMModel>> cacheLoader =
    +        new CacheLoader<CacheKey, List<BloomDMModel>>() {
    +      @Override
    +      public List<BloomDMModel> load(CacheKey key) throws Exception {
    +        LOGGER.error(String.format("Load bloom datamap entry %s to cache", key));
    +        return loadBloomDataMapModel(key);
    +      }
    +    };
    +
    +    this.bloomDMCache = CacheBuilder.newBuilder()
    +        .recordStats()
    +        .maximumSize(DEFAULT_CACHE_SIZE)
    +        .expireAfterAccess(DEFAULT_CACHE_EXPIRED_HOURS, TimeUnit.HOURS)
    +        .removalListener(listener)
    +        .build(cacheLoader);
    +  }
    +
    +  private static class SingletonHolder {
    +    public static final BloomDataMapCache INSTANCE = new BloomDataMapCache();
    +  }
    +
    +  /**
    +   * get instance
    +   */
    +  public static BloomDataMapCache getInstance() {
    +    return SingletonHolder.INSTANCE;
    +  }
    +
    +  /**
    +   * for resolve from serialized
    +   */
    +  protected Object readResolve() {
    +    return getInstance();
    +  }
    +
    +  /**
    +   * load datamap from bloomindex file
    +   */
    +  private List<BloomDMModel> loadBloomDataMapModel(CacheKey cacheKey) {
    +    DataInputStream dataInStream = null;
    +    ObjectInputStream objectInStream = null;
    +    List<BloomDMModel> bloomDMModels = new ArrayList<BloomDMModel>();
    +    try {
    +      String indexFile = getIndexFileFromCacheKey(cacheKey);
    +      dataInStream = FileFactory.getDataInputStream(indexFile, FileFactory.getFileType(indexFile));
    +      objectInStream = new ObjectInputStream(dataInStream);
    +      try {
    +        BloomDMModel model = null;
    +        while ((model = (BloomDMModel) objectInStream.readObject()) != null) {
    +          bloomDMModels.add(model);
    +        }
    +      } catch (EOFException e) {
    +        LOGGER.info(String.format("Read %d bloom indices from %s",
    +            bloomDMModels.size(), indexFile));
    +      }
    +      this.bloomDMCache.put(cacheKey, bloomDMModels);
    +      return bloomDMModels;
    +    } catch (ClassNotFoundException | IOException e) {
    +      LOGGER.error(e, "Error occurs while reading bloom index");
    +      throw new RuntimeException("Error occurs while reading bloom index", e);
    +    } finally {
    +      clear();
    +      CarbonUtil.closeStreams(objectInStream, dataInStream);
    +    }
    +  }
    +
    +  /**
    +   * get bloom index file name from cachekey
    +   */
    +  private String getIndexFileFromCacheKey(CacheKey cacheKey) {
    +    return cacheKey.shardPath.concat(File.separator).concat(cacheKey.indexColumn)
    --- End diff --
   
    OK, since they both depend on the BloomDataMap class for the bloom index suffix information, I extracted the method into the BloomDataMap class.
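
    The extraction described above might look roughly like the following; the class and method names here are illustrative, not the actual ones from the PR, and the `.bloomindex` suffix value is an assumption:

    ```java
    import java.io.File;

    // Hypothetical static helper for the shared path-building logic:
    // shardPath + separator + indexColumn + suffix, as in
    // getIndexFileFromCacheKey() in the diff above.
    class BloomIndexPaths {
      // Assumed value of BloomCoarseGrainDataMap.BLOOM_INDEX_SUFFIX.
      static final String BLOOM_INDEX_SUFFIX = ".bloomindex";

      static String getBloomIndexFile(String shardPath, String indexColumn) {
        return shardPath + File.separator + indexColumn + BLOOM_INDEX_SUFFIX;
      }
    }
    ```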


---

[GitHub] carbondata issue #2300: [CARBONDATA-2459][DataMap] Add cache for bloom filte...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/2300
 
    SDV Build Success , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4892/



---

[GitHub] carbondata issue #2300: [CARBONDATA-2459][DataMap] Add cache for bloom filte...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/2300
 
    Build Failed  with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5846/



---