GitHub user jackylk opened a pull request:
https://github.com/apache/carbondata/pull/1240

[CARBONDATA-1365] Add RLE codec implementation

Be sure to do all of the following to help us incorporate your contribution quickly and easily:

- [ ] Make sure the PR title is formatted like: `[CARBONDATA-<Jira issue #>] Description of pull request`
- [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable Travis-CI on your fork and ensure the whole test matrix passes.)
- [ ] Replace `<Jira issue #>` in the title with the actual Jira issue number, if there is one.
- [ ] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.txt).
- [ ] Testing done. Please provide details on:
  - whether new unit test cases have been added, or why no new tests are required;
  - what manual testing you have done;
  - any additional information to help reviewers in testing this change.
- [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.

---

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/jackylk/incubator-carbondata int_rle

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/1240.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #1240

----

commit 924c43bb8856560350722e47aa60a3a64634fecf
Author: Jacky Li <[hidden email]>
Date:   2017-08-06T16:02:13Z

    add RLE codec

---

If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is enabled but not working, please contact infrastructure at [hidden email] or file a JIRA ticket with INFRA.
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1240 Build Failed with Spark 1.6, Please check CI http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/799/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1240 Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/3397/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1240 SDV Build Success with Spark 2.1, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/126/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1240 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/3398/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1240 SDV Build Failed with Spark 2.1, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/127/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/1240 Build Success with Spark 1.6, Please check CI http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/800/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1240 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/3401/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1240 Build Success with Spark 1.6, Please check CI http://144.76.159.231:8080/job/ApacheCarbonPRBuilder/803/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1240 SDV Build Success with Spark 2.1, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/130/
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1240#discussion_r132898313

--- Diff: core/src/main/java/org/apache/carbondata/core/datastore/page/encoding/RLECodec.java ---
@@ -0,0 +1,417 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.core.datastore.page.encoding;
+
+import java.io.ByteArrayInputStream;
+import java.io.ByteArrayOutputStream;
+import java.io.DataInputStream;
+import java.io.DataOutputStream;
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.List;
+
+import org.apache.carbondata.core.datastore.page.ColumnPage;
+import org.apache.carbondata.core.datastore.page.ComplexColumnPage;
+import org.apache.carbondata.core.datastore.page.statistics.SimpleStatsResult;
+import org.apache.carbondata.core.memory.MemoryException;
+import org.apache.carbondata.core.metadata.CodecMetaFactory;
+import org.apache.carbondata.core.metadata.datatype.DataType;
+
+/**
+ * RLE encoding implementation for integral column page.
+ * This encoding keeps track of repeated-run and non-repeated-run, and make use
+ * of the highest bit of the length field to indicate the type of run.
+ * The length field is encoded as 32bits value.
+ *
+ * For example: input data {5, 5, 1, 2, 3, 3, 3, 3, 3} will be encoded to
+ * {0x00, 0x00, 0x00, 0x02, 0x05,             (repeated-run, 2 values of 5)
+ *  0x80, 0x00, 0x00, 0x03, 0x01, 0x02, 0x03, (non-repeated-run, 3 values: 1, 2, 3)
+ *  0x00, 0x00, 0x00, 0x04, 0x03}             (repeated-run, 4 values of 3)
+ */
+public class RLECodec implements ColumnPageCodec {
+
+  enum RUN_STATE { INIT, START, REPEATED_RUN, NONREPEATED_RUN }
+
+  private DataType dataType;
+  private int pageSize;
+
+  /**
+   * New RLECodec
+   * @param dataType data type of the raw column page before encode
+   * @param pageSize page size of the raw column page before encode
+   */
+  RLECodec(DataType dataType, int pageSize) {
+    this.dataType = dataType;
+    this.pageSize = pageSize;
+  }
+
+  @Override
+  public String getName() {
+    return "RLECodec";
+  }
+
+  @Override
+  public EncodedColumnPage encode(ColumnPage input) throws MemoryException, IOException {
+    Encoder encoder = new Encoder();
+    return encoder.encode(input);
+  }
+
+  @Override
+  public EncodedColumnPage[] encodeComplexColumn(ComplexColumnPage input) {
+    throw new UnsupportedOperationException("complex column does not support RLE encoding");
+  }
+
+  @Override
+  public ColumnPage decode(byte[] input, int offset, int length) throws MemoryException,
+      IOException {
+    Decoder decoder = new Decoder(dataType, pageSize);
+    return decoder.decode(input, offset, length);
+  }
+
+  // This codec supports integral type only
+  private void validateDataType(DataType dataType) {
+    switch (dataType) {
+      case BYTE:
+      case SHORT:
+      case INT:
+      case LONG:
+        break;
+      default:
+        throw new UnsupportedOperationException(dataType + " is not supported for RLE");
+    }
+  }
+
+  private class Encoder {
+    // While encoding RLE, this class internally work as a state machine
+    // INIT state is the initial state before any value comes
+    // START state is the start for each run
+    // REPEATED_RUN state means it is collecting repeated values (`lastValue`)
+    // NONREPEATED_RUN state means it is collecting non-repeated values (`nonRepeatValues`)
+    private RUN_STATE runState;
+
+    // count for each run, either REPEATED_RUN or NONREPEATED_RUN
+    private int valueCount;
+
+    // collected value for REPEATED_RUN
+    private Object lastValue;
+
+    // collected value for NONREPEATED_RUN
+    private List<Object> nonRepeatValues;
+
+    // data type of input page
+    private DataType dataType;
+
+    // output stream for encoded data
+    private ByteArrayOutputStream bao;
+    private DataOutputStream stream;
+
+    private Encoder() {
+      this.runState = RUN_STATE.INIT;
+      this.valueCount = 0;
+      this.nonRepeatValues = new ArrayList<>();
+      this.bao = new ByteArrayOutputStream();
+      this.stream = new DataOutputStream(bao);
+    }
+
+    private EncodedColumnPage encode(ColumnPage input) throws MemoryException, IOException {
+      validateDataType(input.getDataType());
+      this.dataType = input.getDataType();
+      switch (dataType) {
+        case BYTE:
+          byte[] bytePage = input.getBytePage();
+          for (int i = 0; i < bytePage.length; i++) {
+            putValue(bytePage[i]);
+          }
+          break;
+        case SHORT:
+          short[] shortPage = input.getShortPage();
+          for (int i = 0; i < shortPage.length; i++) {
+            putValue(shortPage[i]);
+          }
+          break;
+        case INT:
+          int[] intPage = input.getIntPage();
+          for (int i = 0; i < intPage.length; i++) {
+            putValue(intPage[i]);
+          }
+          break;
+        case LONG:
+          long[] longPage = input.getLongPage();
+          for (int i = 0; i < longPage.length; i++) {
+            putValue(longPage[i]);
+          }
+          break;
+        default:
+          throw new UnsupportedOperationException(input.getDataType() +
+              " does not support RLE encoding");
+      }
+      byte[] encoded = collectResult();
+      SimpleStatsResult stats = (SimpleStatsResult) input.getStatistics();
+      return new EncodedMeasurePage(
+          input.getPageSize(),
+          encoded,
+          CodecMetaFactory.createMeta(stats, input.getDataType()),
+          stats.getNullBits());
+    }
+
+    private void putValue(Object value) throws IOException {
+      if (runState == RUN_STATE.INIT) {
+        startNewRun(value);
+      } else {
+        if (lastValue.equals(value)) {
+          putRepeatValue(value);
+        } else {
+          putNonRepeatValue(value);
+        }
+      }
+    }
+
+    // when last row is reached, write out all collected data
+    private byte[] collectResult() throws IOException {
+      switch (runState) {
+        case REPEATED_RUN:
+          writeRunLength(valueCount);
+          writeRunValue(lastValue);
+          break;
+        case NONREPEATED_RUN:
+          writeRunLength(valueCount | 0x80000000);

--- End diff --

I think writing short is enough as of now. Because we are restricting page size in short only.
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1240#discussion_r132899540

--- Diff: core/src/main/java/org/apache/carbondata/core/datastore/page/encoding/RLECodec.java ---
+  // This codec supports integral type only
+  private void validateDataType(DataType dataType) {
+    switch (dataType) {
+      case BYTE:
+      case SHORT:
+      case INT:
+      case LONG:

--- End diff --

ideally double and decimal also can be supported right with this encoding, why those are not supported
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1240#discussion_r133606239

--- Diff: core/src/main/java/org/apache/carbondata/core/datastore/page/encoding/RLECodec.java ---
+    // when last row is reached, write out all collected data
+    private byte[] collectResult() throws IOException {
+      switch (runState) {
+        case REPEATED_RUN:
+          writeRunLength(valueCount);
+          writeRunValue(lastValue);
+          break;
+        case NONREPEATED_RUN:
+          writeRunLength(valueCount | 0x80000000);

--- End diff --

Ok, I will change to short now. Later we need to change to support page more than 32000 rows, like array<int>.
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/1240#discussion_r133606402

--- Diff: core/src/main/java/org/apache/carbondata/core/datastore/page/encoding/RLECodec.java ---
+  // This codec supports integral type only
+  private void validateDataType(DataType dataType) {
+    switch (dataType) {
+      case BYTE:
+      case SHORT:
+      case INT:
+      case LONG:

--- End diff --

I think we better make it for integral value only. For double and decimal, their distinct value are more in most of the case. Anyway, we can add it in future if it is required in future PR.
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1240 SDV Build Success with Spark 2.1, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/204/
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/1240 LGTM
Github user asfgit closed the pull request at:
https://github.com/apache/carbondata/pull/1240