[GitHub] [carbondata] Karan-c980 opened a new pull request #3834: Sdk iud


[GitHub] [carbondata] CarbonDataQA1 commented on pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox

CarbonDataQA1 commented on pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#issuecomment-669860500


   Build Success with Spark 2.3.4, Please check CI http://121.244.95.60:12545/job/ApacheCarbonPRBuilder2.3/3630/
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[hidden email]



[GitHub] [carbondata] kunal642 commented on a change in pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

kunal642 commented on a change in pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#discussion_r475244586



##########
File path: sdk/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonIUD.java
##########
@@ -0,0 +1,275 @@
+package org.apache.carbondata.sdk.file;
+
+import org.apache.carbondata.common.exceptions.sql.InvalidLoadOptionException;
+import org.apache.carbondata.core.constants.CarbonCommonConstants;
+import org.apache.carbondata.core.datastore.impl.FileFactory;
+import org.apache.carbondata.core.metadata.datatype.DataType;
+import org.apache.carbondata.core.metadata.datatype.Field;
+import org.apache.carbondata.core.scan.expression.ColumnExpression;
+import org.apache.carbondata.core.scan.expression.Expression;
+import org.apache.carbondata.core.scan.expression.LiteralExpression;
+import org.apache.carbondata.core.scan.expression.conditional.EqualToExpression;
+import org.apache.carbondata.core.scan.expression.logical.AndExpression;
+import org.apache.carbondata.hadoop.api.CarbonTableOutputFormat;
+import org.apache.carbondata.hadoop.internal.ObjectArrayWritable;
+
+import java.io.File;
+import java.io.FilenameFilter;
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import org.apache.hadoop.io.NullWritable;
+import org.apache.hadoop.mapred.JobConf;
+import org.apache.hadoop.mapreduce.Job;
+import org.apache.hadoop.mapreduce.RecordWriter;
+import org.apache.hadoop.mapreduce.TaskAttemptID;
+import org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl;
+
+public class CarbonIUD {
+
+  private final Map<String, Map<String, String>> filterColumnToValueMapping;
+  private final Map<String, Map<String, String>> updateColumnToValueMapping;
+
+  private CarbonIUD() {
+    filterColumnToValueMapping = new HashMap<>();
+    updateColumnToValueMapping = new HashMap<>();
+  }
+
+  /**
+   * @return CarbonIUD object
+   */
+  public static CarbonIUD getInstance() {
+    return new CarbonIUD();
+  }
+
+  /**
+   * @param path   is the segment path on which delete is performed
+   * @param column is the column name on which records have to be deleted
+   * @param value  is the value of the column for which records have to be deleted
+   * @return CarbonIUD object
+   * <p>
+   * e.g.: DELETE WHERE column = value
+   */
+  public CarbonIUD delete(String path, String column, String value) {
+    prepareDelete(path, column, value);
+    return this;
+  }
+
+  /**
+   * @param path             is the segment path on which delete is performed
+   * @param filterExpression is the expression to delete the records
+   * @throws IOException
+   * @throws InterruptedException
+   */
+  public void delete(String path, Expression filterExpression)
+      throws IOException, InterruptedException {
+    CarbonReader reader = CarbonReader.builder(path, "_temp")
+        .projection(new String[] { CarbonCommonConstants.CARBON_IMPLICIT_COLUMN_TUPLEID })
+        .filter(filterExpression).build();
+
+    RecordWriter<NullWritable, ObjectArrayWritable> deleteDeltaWriter =
+        CarbonTableOutputFormat.getDeleteDeltaRecordWriter(path);
+    ObjectArrayWritable writable = new ObjectArrayWritable();
+    while (reader.hasNext()) {
+      Object[] row = (Object[]) reader.readNextRow();
+      writable.set(row);
+      deleteDeltaWriter.write(NullWritable.get(), writable);
+    }
+    deleteDeltaWriter.close(null);
+    reader.close();
+  }
+
+  /**
+   * Calling this method will start the execution of the delete process
+   * @throws IOException
+   * @throws InterruptedException
+   */
+  public void closeDelete() throws IOException, InterruptedException {
+    for (Map.Entry<String, Map<String, String>> path : this.filterColumnToValueMapping.entrySet()) {
+      deleteExecution(path.getKey());
+    }
+    this.filterColumnToValueMapping.clear();
+  }
+
+  /**
+   * @param path      is the segment path on which update is performed
+   * @param column    is the column name on which records have to be updated
+   * @param value     is the value of the column for which records have to be updated
+   * @param updColumn is the name of the column to be updated
+   * @param updValue  is the new value for the updated column
+   * @return CarbonIUD object
+   * <p>
+   * e.g.: UPDATE updColumn = updValue WHERE column = value
+   */
+  public CarbonIUD update(String path, String column, String value, String updColumn,
+      String updValue) {
+    prepareUpdate(path, column, value, updColumn, updValue);
+    return this;
+  }
+
+  /**
+   * @param path                        is the segment path on which update is performed.
+   * @param filterExpression            is the expression object to update the records
+   * @param updatedColumnToValueMapping contains the mapping of updatedColumns to updatedValues
+   * @throws IOException
+   * @throws InterruptedException
+   * @throws InvalidLoadOptionException
+   */
+  public void update(String path, Expression filterExpression,
+      Map<String, String> updatedColumnToValueMapping)
+      throws IOException, InterruptedException, InvalidLoadOptionException {
+    File[] indexFiles = getCarbonIndexFile(path);
+    Schema schema = CarbonSchemaReader.readSchema(indexFiles[0].getAbsolutePath()).asOriginOrder();
+    Field[] fields = schema.getFields();
+    String[] projectionColumns = new String[fields.length + 1];
+    for (int i = 0; i < fields.length; i++) {
+      projectionColumns[i] = (fields[i].getFieldName());
+    }
+    projectionColumns[projectionColumns.length - 1] =
+        CarbonCommonConstants.CARBON_IMPLICIT_COLUMN_TUPLEID;
+
+    CarbonWriter writer =
+        CarbonWriter.builder().outputPath(path).withCsvInput(schema).writtenBy("CarbonIUD").build();
+
+    CarbonReader reader =
+        CarbonReader.builder(path, "_temp").projection(projectionColumns).filter(filterExpression)
+            .build();
+
+    RecordWriter<NullWritable, ObjectArrayWritable> deleteDeltaWriter =
+        CarbonTableOutputFormat.getDeleteDeltaRecordWriter(path);
+    ObjectArrayWritable writable = new ObjectArrayWritable();
+
+    while (reader.hasNext()) {
+      Object[] row = (Object[]) reader.readNextRow();
+      writable.set(Arrays.copyOfRange(row, row.length - 1, row.length));
+      for (Map.Entry<String, String> column : updatedColumnToValueMapping.entrySet()) {
+        row[getColumnIndex(fields, column.getKey())] = column.getValue();
+      }
+      writer.write(Arrays.copyOfRange(row, 0, row.length - 1));
+      deleteDeltaWriter.write(NullWritable.get(), writable);
+    }
+    deleteDeltaWriter.close(null);
+    writer.close();
+    reader.close();
+  }
+
+  /**
+   * Calling this method will start the execution of the update process
+   * @throws IOException
+   * @throws InterruptedException
+   * @throws InvalidLoadOptionException
+   */
+  public void closeUpdate() throws IOException, InterruptedException, InvalidLoadOptionException {
+    for (Map.Entry<String, Map<String, String>> path : this.filterColumnToValueMapping.entrySet()) {
+      if (this.updateColumnToValueMapping.containsKey(path.getKey())) {
+        updateExecution(path.getKey());
+      }
+    }
+    this.filterColumnToValueMapping.clear();
+    this.updateColumnToValueMapping.clear();
+  }
+
+  private void updateExecution(String path)
+      throws IOException, InterruptedException, InvalidLoadOptionException {
+    Expression filterExpression = getExpression(path);
+    update(path, filterExpression, this.updateColumnToValueMapping.get(path));
+  }
+
+  private void deleteExecution(String path) throws IOException, InterruptedException {
+    Expression filterExpression = getExpression(path);
+    delete(path, filterExpression);
+  }
+
+  private void prepareUpdate(String path, String column, String value, String updColumn,
+      String updValue) {
+    prepareDelete(path, column, value);
+    updColumn = updColumn.toLowerCase().trim();
+    if (this.updateColumnToValueMapping.containsKey(path)) {
+      this.updateColumnToValueMapping.get(path).put(updColumn, updValue);
+    } else {
+      Map<String, String> columnToValue = new HashMap<>();
+      columnToValue.put(updColumn, updValue);
+      this.updateColumnToValueMapping.put(path, columnToValue);
+    }
+  }
+
+  private void prepareDelete(String path, String column, String value) {
+    column = column.toLowerCase().trim();
+    if (this.filterColumnToValueMapping.containsKey(path)) {
+      this.filterColumnToValueMapping.get(path).put(column, value);
+    } else {
+      Map<String, String> columnToValueMapping = new HashMap<>();
+      columnToValueMapping.put(column, value);
+      this.filterColumnToValueMapping.put(path, columnToValueMapping);
+    }
+  }
+
+  private Expression getExpression(String path) throws IOException {
+    File[] indexFiles = getCarbonIndexFile(path);
+    Schema schema = CarbonSchemaReader.readSchema(indexFiles[0].getAbsolutePath()).asOriginOrder();
+    Field[] fields = schema.getFields();
+    List<EqualToExpression> listOfExpressions = new ArrayList<>();
+    for (Map.Entry<String, String> column : this.filterColumnToValueMapping.get(path).entrySet()) {
+      DataType dataType = getColumnDataType(fields, column.getKey());
+      listOfExpressions.add(new EqualToExpression(new ColumnExpression(column.getKey(), dataType),
+          new LiteralExpression(column.getValue(), dataType)));
+    }
+    Expression filterExpression = null;
+    if (listOfExpressions.size() > 0) {
+      filterExpression = listOfExpressions.get(0);
+    }
+    for (int i = 1; i < listOfExpressions.size(); i++) {
+      filterExpression = new AndExpression(filterExpression, listOfExpressions.get(i));
+    }
+    return filterExpression;
+  }
+
+  private int getColumnIndex(Field[] fields, String column) {
+    int index = -1;
+    for (Field field : fields) {
+      if (field.getFieldName().equals(column)) {
+        index = field.getSchemaOrdinal();
+        break;
+      }
+    }
+    if (index == -1) {

Review comment:
       @Karan980 Please check this and add test cases for all the missing types as well
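
For context, a minimal usage sketch of the API under review, assuming the fluent methods shown in the diff above (the segment path, column names, and values are placeholders):

    import org.apache.carbondata.sdk.file.CarbonIUD;

    public class CarbonIUDUsage {
      public static void main(String[] args) throws Exception {
        // DELETE WHERE name = 'karan' on the given segment path;
        // closeDelete() triggers the actual delete execution.
        CarbonIUD.getInstance()
            .delete("/path/to/segment", "name", "karan")
            .closeDelete();

        // UPDATE age = '30' WHERE name = 'karan';
        // closeUpdate() triggers the actual update execution.
        CarbonIUD.getInstance()
            .update("/path/to/segment", "name", "karan", "age", "30")
            .closeUpdate();
      }
    }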




----------------------------------------------------------------



[GitHub] [carbondata] CarbonDataQA1 commented on pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

CarbonDataQA1 commented on pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#issuecomment-683593554


   Build Failed  with Spark 2.4.5, Please check CI http://121.244.95.60:12545/job/ApacheCarbon_PR_Builder_2.4.5/2185/
   


----------------------------------------------------------------



[GitHub] [carbondata] CarbonDataQA1 commented on pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

CarbonDataQA1 commented on pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#issuecomment-683594434


   Build Failed  with Spark 2.3.4, Please check CI http://121.244.95.60:12545/job/ApacheCarbonPRBuilder2.3/3926/
   


----------------------------------------------------------------



[GitHub] [carbondata] CarbonDataQA1 commented on pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

CarbonDataQA1 commented on pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#issuecomment-684022546


   Build Success with Spark 2.3.4, Please check CI http://121.244.95.60:12545/job/ApacheCarbonPRBuilder2.3/3937/
   


----------------------------------------------------------------



[GitHub] [carbondata] CarbonDataQA1 commented on pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

CarbonDataQA1 commented on pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#issuecomment-684031807


   Build Success with Spark 2.4.5, Please check CI http://121.244.95.60:12545/job/ApacheCarbon_PR_Builder_2.4.5/2197/
   


----------------------------------------------------------------



[GitHub] [carbondata] kunal642 commented on pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

kunal642 commented on pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#issuecomment-688461087


   retest this please


----------------------------------------------------------------



[GitHub] [carbondata] Karan-c980 commented on a change in pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

Karan-c980 commented on a change in pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#discussion_r484537947



##########
File path: hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonFileInputFormat.java
##########
@@ -189,16 +196,14 @@ public CarbonTable getOrCreateCarbonTable(Configuration configuration) throws IO
         info.setBlockSize(carbonFile.getLength());
         info.setVersionNumber(split.getVersion().number());
         info.setUseMinMaxForPruning(false);
+        if(allDeleteDeltaFiles.size() != 0) {

Review comment:
       Done




----------------------------------------------------------------



[GitHub] [carbondata] Karan-c980 commented on a change in pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

Karan-c980 commented on a change in pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#discussion_r484538733



##########
File path: hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonFileInputFormat.java
##########
@@ -260,7 +265,50 @@ public boolean accept(CarbonFile file) {
         getDataBlocksOfSegment(job, carbonTable, indexFilter, validSegments,
             new ArrayList<Segment>(), new ArrayList<String>());
     numBlocks = dataBlocksOfSegment.size();
-    result.addAll(dataBlocksOfSegment);
-    return result;
+    List<String> allDeleteDeltaFiles = getAllDeleteDeltaFiles(carbonTable.getTablePath());
+    if(allDeleteDeltaFiles.size() > 0) {

Review comment:
       Done

##########
File path: hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonFileInputFormat.java
##########
@@ -260,7 +265,50 @@ public boolean accept(CarbonFile file) {
+
+  private List<String> getAllDeleteDeltaFiles(String path) {
+    List<String> deltaFiles = null;
+    try (Stream<Path> walk = Files.walk(Paths.get(path))) {
+      deltaFiles = walk.map(x -> x.toString())
+          .filter(f -> f.endsWith(CarbonCommonConstants.DELETE_DELTA_FILE_EXT))
+          .collect(Collectors.toList());
+    } catch (IOException e) {
+      e.printStackTrace();

Review comment:
       Done

##########
File path: hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonFileInputFormat.java
##########
@@ -260,7 +265,50 @@ public boolean accept(CarbonFile file) {
+
+  private String[] getDeleteDeltaFiles(String segmentPath, List<String> allDeleteDeltaFiles) {
+    ArrayList<String> deleteDeltaFiles = new ArrayList<>();
+    String[] pathElements = segmentPath.split(CarbonCommonConstants.FILE_SEPARATOR);

Review comment:
       Done

##########
File path: hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonFileInputFormat.java
##########
@@ -260,7 +265,50 @@ public boolean accept(CarbonFile file) {
+  private String[] getDeleteDeltaFiles(String segmentPath, List<String> allDeleteDeltaFiles) {
+    ArrayList<String> deleteDeltaFiles = new ArrayList<>();
+    String[] pathElements = segmentPath.split(CarbonCommonConstants.FILE_SEPARATOR);
+    String segmentFileName = pathElements[pathElements.length - 1];

Review comment:
       Done

##########
File path: hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonFileInputFormat.java
##########
@@ -260,7 +265,50 @@ public boolean accept(CarbonFile file) {
+  private String[] getDeleteDeltaFiles(String segmentPath, List<String> allDeleteDeltaFiles) {
+    ArrayList<String> deleteDeltaFiles = new ArrayList<>();
+    String[] pathElements = segmentPath.split(CarbonCommonConstants.FILE_SEPARATOR);
+    String segmentFileName = pathElements[pathElements.length - 1];
+    String ExpectedDeleteDeltaFileName = segmentFileName
+        .substring(segmentFileName.indexOf(CarbonCommonConstants.HYPHEN) + 1,
+            segmentFileName.indexOf(CarbonCommonConstants.UNDERSCORE));
+
+    for (String deltaFile : allDeleteDeltaFiles) {
+      String[] deleteDeltapathElements =
+          deltaFile.split(CarbonCommonConstants.FILE_SEPARATOR);

Review comment:
       Done
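
As the later revisions quoted in this thread show, the printStackTrace() calls flagged above were replaced with exception propagation. A standalone sketch of that corrected pattern (the literal ".deletedelta" stands in for CarbonCommonConstants.DELETE_DELTA_FILE_EXT):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.List;
    import java.util.stream.Collectors;
    import java.util.stream.Stream;

    public final class DeleteDeltaScan {
      // Walk the table path recursively and collect every delete-delta file.
      static List<String> getAllDeleteDeltaFiles(String tablePath) {
        try (Stream<Path> walk = Files.walk(Paths.get(tablePath))) {
          return walk.map(Path::toString)
              .filter(f -> f.endsWith(".deletedelta"))
              .collect(Collectors.toList());
        } catch (IOException e) {
          // Propagate the failure instead of swallowing it with printStackTrace().
          throw new RuntimeException(e);
        }
      }
    }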




----------------------------------------------------------------



[GitHub] [carbondata] Karan-c980 commented on a change in pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

Karan-c980 commented on a change in pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#discussion_r484539275



##########
File path: hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonFileInputFormat.java
##########
@@ -260,7 +265,50 @@ public boolean accept(CarbonFile file) {
+  private String[] getDeleteDeltaFiles(String segmentPath, List<String> allDeleteDeltaFiles) {
+    ArrayList<String> deleteDeltaFiles = new ArrayList<>();
+    String[] pathElements = segmentPath.split(CarbonCommonConstants.FILE_SEPARATOR);
+    String segmentFileName = pathElements[pathElements.length - 1];
+    String ExpectedDeleteDeltaFileName = segmentFileName
+        .substring(segmentFileName.indexOf(CarbonCommonConstants.HYPHEN) + 1,
+            segmentFileName.indexOf(CarbonCommonConstants.UNDERSCORE));
+
+    for (String deltaFile : allDeleteDeltaFiles) {
+      String[] deleteDeltapathElements =
+          deltaFile.split(CarbonCommonConstants.FILE_SEPARATOR);
+      String deleteDeltaFullFileName = deleteDeltapathElements[deleteDeltapathElements.length - 1];
+      String deleteDeltaFileName = deleteDeltaFullFileName
+          .substring(0, deleteDeltaFullFileName.indexOf(CarbonCommonConstants.UNDERSCORE));

Review comment:
       Handled

##########
File path: hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonTableOutputFormat.java
##########
@@ -559,4 +564,50 @@ public void close(TaskAttemptContext taskAttemptContext) throws InterruptedExcep
       super.close(taskAttemptContext);
     }
   }
+
+  public static RecordWriter<NullWritable, ObjectArrayWritable> getDeleteDeltaRecordWriter(String path) {
+    return (new RecordWriter<NullWritable, ObjectArrayWritable>() {
+      private final ArrayList<String> tupleId = new ArrayList<>();
+
+      @Override
+      public void write(NullWritable aVoid, ObjectArrayWritable objects) {
+        this.tupleId.add((String) objects.get()[0]);
+      }
+
+      @Override
+      public void close(TaskAttemptContext taskAttemptContext) throws IOException {
+        Map<String, DeleteDeltaBlockDetails> blockToDeleteDeltaBlockMapping = new HashMap<>();
+        DeleteDeltaBlockDetails blockDetails;
+        String blockName;
+        for (String tuple : tupleId) {
+          blockName = CarbonUpdateUtil.getBlockName(
+              (tuple.split(CarbonCommonConstants.FILE_SEPARATOR)[TupleIdEnum.BLOCK_ID

Review comment:
       Done

##########
File path: hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonTableOutputFormat.java
##########
@@ -559,4 +564,50 @@ public void close(TaskAttemptContext taskAttemptContext) throws InterruptedExcep
+      @Override
+      public void close(TaskAttemptContext taskAttemptContext) throws IOException {
+        Map<String, DeleteDeltaBlockDetails> blockToDeleteDeltaBlockMapping = new HashMap<>();
+        DeleteDeltaBlockDetails blockDetails;
+        String blockName;
+        for (String tuple : tupleId) {
+          blockName = CarbonUpdateUtil.getBlockName(
+              (tuple.split(CarbonCommonConstants.FILE_SEPARATOR)[TupleIdEnum.BLOCK_ID
+                  .getTupleIdIndex()]));
+
+          if (!blockToDeleteDeltaBlockMapping.containsKey(blockName)) {
+            blockDetails = new DeleteDeltaBlockDetails(blockName);
+            blockToDeleteDeltaBlockMapping.put(blockName, blockDetails);
+          }
+          blockDetails = blockToDeleteDeltaBlockMapping.get(blockName);
+          try {
+            blockDetails.addBlocklet(
+                CarbonUpdateUtil.getRequiredFieldFromTID(tuple, TupleIdEnum.BLOCKLET_ID),
+                CarbonUpdateUtil.getRequiredFieldFromTID(tuple, TupleIdEnum.OFFSET), Integer
+                    .parseInt(
+                        CarbonUpdateUtil.getRequiredFieldFromTID(tuple, TupleIdEnum.PAGE_ID)));
+          } catch (Exception e) {
+            e.printStackTrace();

Review comment:
       Done
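
Putting the two sides together: the SDK delete path quoted earlier reads only the implicit tuple-id column and hands every matching row to this writer, whose close() then groups the tuple ids per block and writes the delete-delta files. A condensed sketch of that flow, restricted to calls already quoted in this thread:

    // Condensed from CarbonIUD.delete(path, filterExpression) quoted earlier.
    CarbonReader reader = CarbonReader.builder(path, "_temp")
        .projection(new String[] { CarbonCommonConstants.CARBON_IMPLICIT_COLUMN_TUPLEID })
        .filter(filterExpression).build();
    RecordWriter<NullWritable, ObjectArrayWritable> deltaWriter =
        CarbonTableOutputFormat.getDeleteDeltaRecordWriter(path);
    ObjectArrayWritable writable = new ObjectArrayWritable();
    while (reader.hasNext()) {
      // Each projected row holds just the tuple id for one matching record.
      writable.set((Object[]) reader.readNextRow());
      deltaWriter.write(NullWritable.get(), writable);
    }
    deltaWriter.close(null);
    reader.close();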




----------------------------------------------------------------



[GitHub] [carbondata] Karan-c980 commented on a change in pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

Karan-c980 commented on a change in pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#discussion_r484539366



##########
File path: sdk/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonIUD.java
##########
@@ -0,0 +1,275 @@
+  /**
+   * @param path             is the segment path on which delete is performed
+   * @param filterExpression is the expression to delete the records
+   * @throws IOException
+   * @throws InterruptedException
+   */
+  public void delete(String path, Expression filterExpression)
+      throws IOException, InterruptedException {
+    CarbonReader reader = CarbonReader.builder(path, "_temp")

Review comment:
       Removed table name
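
For the expression-based overloads, a filter can be assembled from the SDK's expression classes. A small sketch with placeholder path, column, and values, assuming DataTypes.STRING for the filtered column:

    import java.util.HashMap;
    import java.util.Map;

    import org.apache.carbondata.core.metadata.datatype.DataTypes;
    import org.apache.carbondata.core.scan.expression.ColumnExpression;
    import org.apache.carbondata.core.scan.expression.LiteralExpression;
    import org.apache.carbondata.core.scan.expression.conditional.EqualToExpression;
    import org.apache.carbondata.sdk.file.CarbonIUD;

    public class ExpressionIUDUsage {
      public static void main(String[] args) throws Exception {
        // Filter equivalent to: WHERE name = 'karan'
        EqualToExpression filter = new EqualToExpression(
            new ColumnExpression("name", DataTypes.STRING),
            new LiteralExpression("karan", DataTypes.STRING));

        // DELETE the matching rows at the given path.
        CarbonIUD.getInstance().delete("/path/to/segment", filter);

        // UPDATE age = '30' for the matching rows.
        Map<String, String> updatedColumns = new HashMap<>();
        updatedColumns.put("age", "30");
        CarbonIUD.getInstance().update("/path/to/segment", filter, updatedColumns);
      }
    }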




----------------------------------------------------------------



[GitHub] [carbondata] Karan-c980 commented on a change in pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

Karan-c980 commented on a change in pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#discussion_r484539395



##########
File path: sdk/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonIUD.java
##########
@@ -0,0 +1,275 @@
+  private int getColumnIndex(Field[] fields, String column) {
+    int index = -1;
+    for (Field field : fields) {
+      if (field.getFieldName().equals(column)) {
+        index = field.getSchemaOrdinal();
+        break;
+      }
+    }
+    if (index == -1) {

Review comment:
       Done




----------------------------------------------------------------



[GitHub] [carbondata] kunal642 commented on a change in pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

kunal642 commented on a change in pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#discussion_r484549160



##########
File path: hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonFileInputFormat.java
##########
@@ -239,6 +255,61 @@ public void setAllColumnProjectionIfNotConfigured(JobContext job, CarbonTable ca
     List<CarbonInputSplit> dataBlocksOfSegment = getDataBlocksOfSegment(job, carbonTable,
         indexFilter, validSegments, new ArrayList<>(), new ArrayList<>());
     numBlocks = dataBlocksOfSegment.size();
+    List<String> allDeleteDeltaFiles = getAllDeleteDeltaFiles(carbonTable.getTablePath());
+    if (CollectionUtils.isNotEmpty(allDeleteDeltaFiles)) {
+      for (CarbonInputSplit split : dataBlocksOfSegment) {
+        split.setDeleteDeltaFiles(getDeleteDeltaFiles(split.getFilePath(), allDeleteDeltaFiles));
+      }
+    }
     return new LinkedList<>(dataBlocksOfSegment);
   }
+
+  private List<String> getAllDeleteDeltaFiles(String path) {
+    List<String> deltaFiles = null;
+    try (Stream<Path> walk = Files.walk(Paths.get(path))) {
+      deltaFiles = walk.map(x -> x.toString())
+          .filter(f -> f.endsWith(CarbonCommonConstants.DELETE_DELTA_FILE_EXT))
+          .collect(Collectors.toList());
+    } catch (IOException e) {
+      throw new RuntimeException(e);
+    }
+    return deltaFiles;
+  }
+
+  private String[] getDeleteDeltaFiles(String segmentPath, List<String> allDeleteDeltaFiles) {

Review comment:
       change variable name from segmentPath to tablePath




----------------------------------------------------------------



[GitHub] [carbondata] CarbonDataQA1 commented on pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

CarbonDataQA1 commented on pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#issuecomment-688494345


   Build Success with Spark 2.3.4, Please check CI http://121.244.95.60:12545/job/ApacheCarbonPRBuilder2.3/3994/
   


----------------------------------------------------------------



[GitHub] [carbondata] CarbonDataQA1 commented on pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

CarbonDataQA1 commented on pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#issuecomment-688495259


   Build Success with Spark 2.4.5, Please check CI http://121.244.95.60:12545/job/ApacheCarbon_PR_Builder_2.4.5/2254/
   


----------------------------------------------------------------



[GitHub] [carbondata] xubo245 commented on pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

xubo245 commented on pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#issuecomment-688564092


   LGTM


----------------------------------------------------------------




[GitHub] [carbondata] CarbonDataQA1 commented on pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

CarbonDataQA1 commented on pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#issuecomment-688694994


   Build Success with Spark 2.4.5, Please check CI http://121.244.95.60:12545/job/ApacheCarbon_PR_Builder_2.4.5/2257/
   


----------------------------------------------------------------



[GitHub] [carbondata] CarbonDataQA1 commented on pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

CarbonDataQA1 commented on pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#issuecomment-688698823


   Build Success with Spark 2.3.4, Please check CI http://121.244.95.60:12545/job/ApacheCarbonPRBuilder2.3/3997/
   


----------------------------------------------------------------



[GitHub] [carbondata] Karan-c980 commented on a change in pull request #3834: [CARBONDATA-3865] Implementation of delete/update feature in carbondata SDK.

GitBox
In reply to this post by GitBox

Karan-c980 commented on a change in pull request #3834:
URL: https://github.com/apache/carbondata/pull/3834#discussion_r485093341



##########
File path: README.md
##########
@@ -100,3 +100,4 @@ To get involved in CarbonData:
 ## About
 Apache CarbonData is an open source project of The Apache Software Foundation (ASF).
 
+## PR

Review comment:
       Removed




----------------------------------------------------------------

