[jira] [Commented] (CARBONDATA-302) 7. Add DataWriterProcessorStep which reads the data from sort temp files and creates carbondata files.


Akash R Nilugal (Jira)

    [ https://issues.apache.org/jira/browse/CARBONDATA-302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15608849#comment-15608849 ]

ASF GitHub Bot commented on CARBONDATA-302:
-------------------------------------------

Github user jackylk commented on a diff in the pull request:

    https://github.com/apache/incubator-carbondata/pull/251#discussion_r85157225
 
    --- Diff: processing/src/main/java/org/apache/carbondata/processing/newflow/steps/writer/DataWriterProcessorStepImpl.java ---
    @@ -0,0 +1,360 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one
    + * or more contributor license agreements.  See the NOTICE file
    + * distributed with this work for additional information
    + * regarding copyright ownership.  The ASF licenses this file
    + * to you under the Apache License, Version 2.0 (the
    + * "License"); you may not use this file except in compliance
    + * with the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing,
    + * software distributed under the License is distributed on an
    + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
    + * KIND, either express or implied.  See the License for the
    + * specific language governing permissions and limitations
    + * under the License.
    + */
    +package org.apache.carbondata.processing.newflow.steps.writer;
    +
    +import java.io.File;
    +import java.util.ArrayList;
    +import java.util.HashMap;
    +import java.util.Iterator;
    +import java.util.List;
    +import java.util.Map;
    +
    +import org.apache.carbondata.common.logging.LogService;
    +import org.apache.carbondata.common.logging.LogServiceFactory;
    +import org.apache.carbondata.core.carbon.CarbonTableIdentifier;
    +import org.apache.carbondata.core.carbon.datastore.block.SegmentProperties;
    +import org.apache.carbondata.core.carbon.metadata.CarbonMetadata;
    +import org.apache.carbondata.core.carbon.metadata.schema.table.CarbonTable;
    +import org.apache.carbondata.core.carbon.metadata.schema.table.column.ColumnSchema;
    +import org.apache.carbondata.core.carbon.path.CarbonStorePath;
    +import org.apache.carbondata.core.carbon.path.CarbonTablePath;
    +import org.apache.carbondata.core.constants.CarbonCommonConstants;
    +import org.apache.carbondata.core.constants.IgnoreDictionary;
    +import org.apache.carbondata.core.keygenerator.KeyGenerator;
    +import org.apache.carbondata.core.util.CarbonProperties;
    +import org.apache.carbondata.core.util.CarbonTimeStatisticsFactory;
    +import org.apache.carbondata.core.util.CarbonUtil;
    +import org.apache.carbondata.processing.datatypes.GenericDataType;
    +import org.apache.carbondata.processing.newflow.AbstractDataLoadProcessorStep;
    +import org.apache.carbondata.processing.newflow.CarbonDataLoadConfiguration;
    +import org.apache.carbondata.processing.newflow.DataField;
    +import org.apache.carbondata.processing.newflow.constants.DataLoadProcessorConstants;
    +import org.apache.carbondata.processing.newflow.exception.CarbonDataLoadingException;
    +import org.apache.carbondata.processing.newflow.row.CarbonRow;
    +import org.apache.carbondata.processing.newflow.row.CarbonRowBatch;
    +import org.apache.carbondata.processing.store.CarbonDataFileAttributes;
    +import org.apache.carbondata.processing.store.CarbonFactDataHandlerModel;
    +import org.apache.carbondata.processing.store.CarbonFactHandler;
    +import org.apache.carbondata.processing.store.CarbonFactHandlerFactory;
    +import org.apache.carbondata.processing.store.writer.exception.CarbonDataWriterException;
    +import org.apache.carbondata.processing.util.CarbonDataProcessorUtil;
    +
    +/**
    + * Reads data from the sorted files generated in the previous sort step and
    + * writes them to carbondata files. It also generates the MDK key while writing.
    + */
    +public class DataWriterProcessorStepImpl extends AbstractDataLoadProcessorStep {
    +
    +  private static final LogService LOGGER =
    +      LogServiceFactory.getLogService(DataWriterProcessorStepImpl.class.getName());
    +
    +  private String storeLocation;
    +
    +  private boolean[] isUseInvertedIndex;
    +
    +  private int[] dimLens;
    +
    +  private int dimensionCount;
    +
    +  private List<ColumnSchema> wrapperColumnSchema;
    +
    +  private int[] colCardinality;
    +
    +  private SegmentProperties segmentProperties;
    +
    +  private KeyGenerator keyGenerator;
    +
    +  private CarbonFactHandler dataHandler;
    +
    +  private Map<Integer, GenericDataType> complexIndexMap;
    +
    +  private int noDictionaryCount;
    +
    +  private int complexDimensionCount;
    +
    +  private int measureCount;
    +
    +  private long readCounter;
    +
    +  private long writeCounter;
    +
    +  private int measureIndex = IgnoreDictionary.MEASURES_INDEX_IN_ROW.getIndex();
    +
    +  private int noDimByteArrayIndex = IgnoreDictionary.BYTE_ARRAY_INDEX_IN_ROW.getIndex();
    +
    +  private int dimsArrayIndex = IgnoreDictionary.DIMENSION_INDEX_IN_ROW.getIndex();
    +
    +  public DataWriterProcessorStepImpl(CarbonDataLoadConfiguration configuration,
    +      AbstractDataLoadProcessorStep child) {
    +    super(configuration, child);
    +  }
    +
    +  @Override public DataField[] getOutput() {
    +    return child.getOutput();
    +  }
    +
    +  @Override public void initialize() throws CarbonDataLoadingException {
    +    CarbonTableIdentifier tableIdentifier =
    +        configuration.getTableIdentifier().getCarbonTableIdentifier();
    +
    +    storeLocation = CarbonDataProcessorUtil
    +        .getLocalDataFolderLocation(tableIdentifier.getDatabaseName(),
    +            tableIdentifier.getTableName(), String.valueOf(configuration.getTaskNo()),
    +            configuration.getPartitionId(), configuration.getSegmentId() + "", false);
    +    isUseInvertedIndex =
    +        CarbonDataProcessorUtil.getIsUseInvertedIndex(configuration.getDataFields());
    +
    +    if (!(new File(storeLocation).exists())) {
    +      LOGGER.error("Local data load folder location does not exist: " + storeLocation);
    +      return;
    +    }
    +
    +    int[] dimLensWithComplex =
    +        (int[]) configuration.getDataLoadProperty(DataLoadProcessorConstants.DIMENSION_LENGTHS);
    +    List<Integer> dimsLenList = new ArrayList<Integer>();
    +    for (int eachDimLen : dimLensWithComplex) {
    +      if (eachDimLen != 0) dimsLenList.add(eachDimLen);
    +    }
    +    dimLens = new int[dimsLenList.size()];
    +    for (int i = 0; i < dimsLenList.size(); i++) {
    +      dimLens[i] = dimsLenList.get(i);
    +    }
    +
    +    this.dimensionCount = configuration.getDimensionCount();
    +    this.noDictionaryCount = configuration.getNoDictionaryCount();
    +    this.complexDimensionCount = configuration.getComplexDimensionCount();
    +    this.measureCount = configuration.getMeasureCount();
    +
    +    int simpleDimsCount = this.dimensionCount - complexDimensionCount;
    +    int[] simpleDimsLen = new int[simpleDimsCount];
    +    for (int i = 0; i < simpleDimsCount; i++) {
    +      simpleDimsLen[i] = dimLens[i];
    +    }
    +
    +    CarbonTable carbonTable = CarbonMetadata.getInstance().getCarbonTable(
    +        tableIdentifier.getDatabaseName() + CarbonCommonConstants.UNDERSCORE + tableIdentifier
    +            .getTableName());
    +    wrapperColumnSchema = CarbonUtil
    +        .getColumnSchemaList(carbonTable.getDimensionByTableName(tableIdentifier.getTableName()),
    +            carbonTable.getMeasureByTableName(tableIdentifier.getTableName()));
    +    colCardinality = CarbonUtil.getFormattedCardinality(dimLensWithComplex, wrapperColumnSchema);
    +    segmentProperties = new SegmentProperties(wrapperColumnSchema, colCardinality);
    +    // Actual primitive dimension used to generate start & end key
    +
    +    keyGenerator = segmentProperties.getDimensionKeyGenerator();
    +
    +    //To Set MDKey Index of each primitive type in complex type
    +    int surrIndex = simpleDimsCount;
    +    Iterator<Map.Entry<String, GenericDataType>> complexMap =
    +        CarbonDataProcessorUtil.getComplexTypesMap(configuration.getDataFields()).entrySet()
    +            .iterator();
    +    complexIndexMap = new HashMap<Integer, GenericDataType>(complexDimensionCount);
    +    while (complexMap.hasNext()) {
    +      Map.Entry<String, GenericDataType> complexDataType = complexMap.next();
    +      complexDataType.getValue().setOutputArrayIndex(0);
    +      complexIndexMap.put(simpleDimsCount, complexDataType.getValue());
    +      simpleDimsCount++;
    +      List<GenericDataType> primitiveTypes = new ArrayList<GenericDataType>();
    +      complexDataType.getValue().getAllPrimitiveChildren(primitiveTypes);
    +      for (GenericDataType eachPrimitive : primitiveTypes) {
    +        eachPrimitive.setSurrogateIndex(surrIndex++);
    +      }
    +    }
    +  }
    +
    +  private void initDataHandler() {
    --- End diff --
   
    can we move all this initialization logic into CarbonFactHandlerFactory? Since the handlerModel is created from the configuration, passing the configuration to the factory would let it create both the model and the handler.
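A possible shape for the refactor suggested above, sketched with hypothetical, heavily simplified stand-ins (the real CarbonFactDataHandlerModel and CarbonFactHandler carry far more state than this): the factory derives the model from the load configuration and returns a ready-to-use handler, so processor steps never build the model themselves.

```java
// Hypothetical sketch only; class names and fields are simplified stand-ins
// for CarbonDataLoadConfiguration, CarbonFactDataHandlerModel and
// CarbonFactHandler, not the real CarbonData API.
final class LoadConfig {
  final String databaseName;
  final String tableName;
  final int measureCount;

  LoadConfig(String databaseName, String tableName, int measureCount) {
    this.databaseName = databaseName;
    this.tableName = tableName;
    this.measureCount = measureCount;
  }
}

final class HandlerModel {
  final String tableId;
  final int measureCount;

  HandlerModel(String tableId, int measureCount) {
    this.tableId = tableId;
    this.measureCount = measureCount;
  }
}

interface FactHandler {
  String describe();
}

final class FactHandlerFactory {
  // Build the model from the configuration, then the handler from the
  // model, in one place instead of inside each processor step.
  static FactHandler createHandler(LoadConfig config) {
    HandlerModel model = new HandlerModel(
        config.databaseName + "_" + config.tableName, config.measureCount);
    return () -> "handler for " + model.tableId
        + " with " + model.measureCount + " measures";
  }
}
```

With this shape, DataWriterProcessorStepImpl's initDataHandler() would collapse to a single factory call.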


> 7. Add DataWriterProcessorStep which reads the data from sort temp files and creates carbondata files.
> ------------------------------------------------------------------------------------------------------
>
>                 Key: CARBONDATA-302
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-302
>             Project: CarbonData
>          Issue Type: Sub-task
>            Reporter: Ravindra Pesala
>            Assignee: Ravindra Pesala
>             Fix For: 0.3.0-incubating
>
>
> Add DataWriterProcessorStep which reads the data from the sort temp files, merge-sorts it, applies the MDK generator on the key, and creates carbondata files.
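The merge step described in this issue is a k-way merge over individually sorted runs. A minimal sketch, with in-memory lists standing in for sort temp files (the real step streams CarbonRow batches and then applies the MDK generator; all names here are hypothetical):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.PriorityQueue;

final class KWayMerge {
  // Merge several individually sorted runs (standing in for sort temp
  // files) into one globally sorted stream, as the writer step must do
  // before generating keys and writing carbondata files.
  static List<Integer> merge(List<List<Integer>> sortedRuns) {
    List<Iterator<Integer>> iterators = new ArrayList<>();
    // Heap entries are {value, runIndex}, ordered by value, so the
    // smallest head among all runs is always polled next.
    PriorityQueue<int[]> heap =
        new PriorityQueue<>((a, b) -> Integer.compare(a[0], b[0]));
    for (int i = 0; i < sortedRuns.size(); i++) {
      Iterator<Integer> it = sortedRuns.get(i).iterator();
      iterators.add(it);
      if (it.hasNext()) {
        heap.add(new int[] { it.next(), i });
      }
    }
    List<Integer> merged = new ArrayList<>();
    while (!heap.isEmpty()) {
      int[] top = heap.poll();
      merged.add(top[0]);
      // Refill the heap from the run the emitted value came from.
      Iterator<Integer> it = iterators.get(top[1]);
      if (it.hasNext()) {
        heap.add(new int[] { it.next(), top[1] });
      }
    }
    return merged;
  }
}
```

Each run is read sequentially exactly once, so memory use stays proportional to the number of runs rather than the total row count.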



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)