[GitHub] carbondata pull request #1065: Add short int type support


[GitHub] carbondata pull request #1065: Add short int type support

GitHub user jackylk opened a pull request:

    https://github.com/apache/carbondata/pull/1065

    Add short int type support

    This PR adds 3-byte data type support in value compression. It can be used to compress int/long values into 3-byte values and decode them back to int/long.
    In a subsequent PR, it will be used for value compression of dictionary dimension columns.

    Two test cases are added to test the boundary values of the 3-byte type.
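
    As a rough illustration of the encode/decode round trip described above, here is a minimal standalone sketch. It is not the PR's code; the class and helper names are made up, and it assumes the sign-and-magnitude layout shown in the diffs later in this thread, with the symmetric range [-8388607, 8388607].

    // Minimal sketch: low 23 bits hold |val|, the high bit of the third byte holds the sign.
    public class ThreeByteSketch {

      static final int MAX_3BYTES = (1 << 23) - 1;    //  8388607
      static final int MIN_3BYTES = -MAX_3BYTES;      // -8388607

      static byte[] to3Bytes(int val) {
        int v = Math.abs(val);
        byte[] b = new byte[3];
        b[0] = (byte) (v & 0xFF);
        b[1] = (byte) ((v >>> 8) & 0xFF);
        b[2] = (byte) ((v >>> 16) & 0x7F);
        if (val < 0) {
          b[2] |= 0x80;                               // record the sign in the 24th bit
        }
        return b;
      }

      static int valueOf3Bytes(byte[] b, int offset) {
        int out = (b[offset] & 0xFF)
            | ((b[offset + 1] & 0xFF) << 8)
            | ((b[offset + 2] & 0x7F) << 16);
        return (b[offset + 2] & 0x80) != 0 ? -out : out;
      }

      public static void main(String[] args) {
        for (int v : new int[] {MIN_3BYTES, -1, 0, 1, MAX_3BYTES}) {
          if (valueOf3Bytes(to3Bytes(v), 0) != v) {
            throw new AssertionError("round trip failed for " + v);
          }
        }
        System.out.println("boundary round trips OK");
      }
    }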

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/jackylk/incubator-carbondata 3bytes

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/1065.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #1065
   
----
commit 25d47ec3a7faa416f97340128536e0aac278ad49
Author: jackylk <[hidden email]>
Date:   2017-06-20T06:32:38Z

    add short int type support

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [hidden email] or file a JIRA ticket
with INFRA.
---

[GitHub] carbondata issue #1065: Add short int type support

Github user asfgit commented on the issue:

    https://github.com/apache/carbondata/pull/1065
 
   
    Refer to this link for build results (access rights to CI server needed):
    https://builds.apache.org/job/carbondata-pr-spark-1.6/505/




[GitHub] carbondata issue #1065: [CARBONDATA-1196] Add 3 bytes data type support in v...

Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1065
 
    Build failed with Spark 2.1.0. Please check CI: http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2602/




[GitHub] carbondata issue #1065: [CARBONDATA-1196] Add 3 bytes data type support in v...

Github user asfgit commented on the issue:

    https://github.com/apache/carbondata/pull/1065
 
   
    Refer to this link for build results (access rights to CI server needed):
    https://builds.apache.org/job/carbondata-pr-spark-1.6/506/

    Build result: FAILURE
    [...truncated 1013.30 KB...] java.io.IOException: Remote call on ubuntu-2 failed (hudson.remoting.Channel.call)
    Setting status of de7075b129d6ae52b0c9d2205fe1f5ffcfac7b4d to FAILURE with url https://builds.apache.org/job/carbondata-pr-spark-1.6/506/ and message: 'Tests Failed for Spark1.6'
    Using context: Jenkins(Spark1.6): mvn clean test -Pspark-1.6




[GitHub] carbondata issue #1065: [CARBONDATA-1196] Add 3 bytes data type support in v...

Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1065
 
    Build failed with Spark 2.1.0. Please check CI: http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2603/




[GitHub] carbondata issue #1065: [CARBONDATA-1196] Add 3 bytes data type support in v...

Github user asfgit commented on the issue:

    https://github.com/apache/carbondata/pull/1065
 
   
    Refer to this link for build results (access rights to CI server needed):
    https://builds.apache.org/job/carbondata-pr-spark-1.6/510/




[GitHub] carbondata issue #1065: [CARBONDATA-1196] Add 3 bytes data type support in v...

Github user asfgit commented on the issue:

    https://github.com/apache/carbondata/pull/1065
 
   
    Refer to this link for build results (access rights to CI server needed):
    https://builds.apache.org/job/carbondata-pr-spark-1.6/512/




[GitHub] carbondata issue #1065: [CARBONDATA-1196] Add 3 bytes data type support in v...

Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1065
 
    Build failed with Spark 2.1.0. Please check CI: http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2607/




[GitHub] carbondata issue #1065: [CARBONDATA-1196] Add 3 bytes data type support in v...

Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1065
 
    Build failed with Spark 2.1.0. Please check CI: http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2609/




[GitHub] carbondata issue #1065: [CARBONDATA-1196] Add 3 bytes data type support in v...

Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1065
 
    Build failed with Spark 2.1.0. Please check CI: http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2616/




[GitHub] carbondata issue #1065: [CARBONDATA-1196] Add 3 bytes data type support in v...

Github user asfgit commented on the issue:

    https://github.com/apache/carbondata/pull/1065
 
   
    Refer to this link for build results (access rights to CI server needed):
    https://builds.apache.org/job/carbondata-pr-spark-1.6/521/

    Build result: FAILURE. Failed Tests: 68
    org.apache.carbondata:carbondata-spark (3 failures): TestComplexPrimitiveTimestampDirectDictionary, InsertIntoCarbonTableSpark1TestCase
    org.apache.carbondata:carbondata-spark-common-test (65 failures): TestComplexTypeQuery, TestCreateTableWithDouble, TestLoadDataGeneral, FloatDataTypeTestCase, InsertIntoCarbonTableTestCase, TestLoadDataWithHiveSyntaxDefaultFormat, TestLoadDataWithHiveSyntaxUnsafe, TestLoadDataWithHiveSyntaxV1Format, TestLoadDataWithHiveSyntaxV2Format, ValueCompressionDataTypeTestCase, TestAllDataTypeForPartitionTable, WindowsExprTestCase




[GitHub] carbondata issue #1065: [CARBONDATA-1196] Add 3 bytes data type support in v...

Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1065
 
    Build failed with Spark 2.1.0. Please check CI: http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2617/




[GitHub] carbondata issue #1065: [CARBONDATA-1196] Add 3 bytes data type support in v...

Github user asfgit commented on the issue:

    https://github.com/apache/carbondata/pull/1065
 
   
    Refer to this link for build results (access rights to CI server needed):
    https://builds.apache.org/job/carbondata-pr-spark-1.6/522/




[GitHub] carbondata pull request #1065: [CARBONDATA-1196] Add 3 bytes data type suppo...

Github user QiangCai commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1065#discussion_r123148990
 
    --- Diff: core/src/main/java/org/apache/carbondata/core/util/ByteUtil.java ---
    @@ -465,6 +465,39 @@ public static short toShort(byte[] bytes, int offset, final int length) {
       }
     
       /**
    +   * int => byte[3]
     +   * supported range is [-8388607, 8388607], note that Math.pow(2, 23) == 8388608
    +   */
    +  public static byte[] to3Bytes(int val) {
    +    assert val <= Math.pow(2, 23) && val >= (-Math.pow(2, 23) - 1);
    +
    +    int value = val < 0 ? -val : val;
    +    byte[] b = new byte[3];
    +    b[0] = (byte) (value & 0xFF);
    +    b[1] = (byte) ((value >>> 8) & 0xFF);
    +    b[2] = (byte) ((value >>> 16) & 0x7F);
    --- End diff --
   
    I think we can do it as follows; there is no need to handle the sign separately:
    b[0] = (byte) (x >> 16)
    b[1] = (byte) (x >> 8)
    b[2] = (byte) (x)
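
    A small standalone sketch of what this suggestion amounts to (one reading of it, not code from the PR; the names are hypothetical): store the low 24 bits most-significant byte first and let sign extension of the top byte restore negatives on read, which covers the asymmetric two's-complement range [-8388608, 8388607] with no explicit sign handling.

    public class TwosComplement3Bytes {

      static byte[] to3Bytes(int x) {
        // keep only the low 24 bits, most significant byte first
        return new byte[] { (byte) (x >> 16), (byte) (x >> 8), (byte) x };
      }

      static int from3Bytes(byte[] b, int offset) {
        // b[offset] is sign-extended by the byte-to-int promotion, which
        // restores the upper 8 bits of a negative value
        return (b[offset] << 16)
            | ((b[offset + 1] & 0xFF) << 8)
            | (b[offset + 2] & 0xFF);
      }

      public static void main(String[] args) {
        for (int v : new int[] {-8388608, -1, 0, 1, 8388607}) {
          if (from3Bytes(to3Bytes(v), 0) != v) {
            throw new AssertionError("round trip failed for " + v);
          }
        }
        System.out.println("two's-complement round trips OK");
      }
    }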



[GitHub] carbondata pull request #1065: [CARBONDATA-1196] Add 3 bytes data type suppo...

Github user jackylk commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1065#discussion_r123154384
 
    --- Diff: core/src/main/java/org/apache/carbondata/core/util/ByteUtil.java ---
    @@ -465,6 +465,39 @@ public static short toShort(byte[] bytes, int offset, final int length) {
       }
     
       /**
    +   * int => byte[3]
     +   * supported range is [-8388607, 8388607], note that Math.pow(2, 23) == 8388608
    +   */
    +  public static byte[] to3Bytes(int val) {
    +    assert val <= Math.pow(2, 23) && val >= (-Math.pow(2, 23) - 1);
    +
    +    int value = val < 0 ? -val : val;
    +    byte[] b = new byte[3];
    +    b[0] = (byte) (value & 0xFF);
    +    b[1] = (byte) ((value >>> 8) & 0xFF);
    +    b[2] = (byte) ((value >>> 16) & 0x7F);
    --- End diff --
   
    Because the value is converted to its absolute value in line 474, line 479 needs to keep the "sign bit"; otherwise the sign information will be lost.
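
    A tiny runnable illustration of this point (standalone, not from the PR): once the value is reduced to its 23-bit magnitude, -100 and 100 become indistinguishable unless the extra sign bit is stored.

    public class SignBitDemo {
      public static void main(String[] args) {
        int val = -100;
        int magnitude = Math.abs(val) & 0x7FFFFF;         // the low 23 bits stored in b[0..2] above
        boolean negative = val < 0;                       // the sign flag kept in the high bit of b[2]
        int withoutSign = magnitude;                      // decodes as +100: the sign is lost
        int withSign = negative ? -magnitude : magnitude; // decodes as -100: the sign is preserved
        System.out.println(withoutSign + " vs " + withSign);  // prints "100 vs -100"
      }
    }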



[GitHub] carbondata issue #1065: [CARBONDATA-1196] Add 3 bytes data type support in v...

Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1065
 
    Build failed with Spark 2.1.0. Please check CI: http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/2619/




[GitHub] carbondata issue #1065: [CARBONDATA-1196] Add 3 bytes data type support in v...

Github user asfgit commented on the issue:

    https://github.com/apache/carbondata/pull/1065
 
   
    Refer to this link for build results (access rights to CI server needed):
    https://builds.apache.org/job/carbondata-pr-spark-1.6/525/

    Build result: FAILURE. Failed Tests: 1
    org.apache.carbondata:carbondata-spark-common-test: org.apache.carbondata.spark.testsuite.allqueries.InsertIntoCarbonTableTestCase.insert into carbon table from carbon table union query




[GitHub] carbondata pull request #1065: [CARBONDATA-1196] Add 3 bytes data type suppo...

Github user QiangCai commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1065#discussion_r123169106
 
    --- Diff: core/src/main/java/org/apache/carbondata/core/util/ByteUtil.java ---
    @@ -465,6 +465,39 @@ public static short toShort(byte[] bytes, int offset, final int length) {
       }
     
       /**
    +   * int => byte[3]
     +   * supported range is [-8388607, 8388607], note that Math.pow(2, 23) == 8388608
    +   */
    +  public static byte[] to3Bytes(int val) {
    +    assert val <= Math.pow(2, 23) && val >= (-Math.pow(2, 23) - 1);
    +
    +    int value = val < 0 ? -val : val;
    +    byte[] b = new byte[3];
    +    b[0] = (byte) (value & 0xFF);
    +    b[1] = (byte) ((value >>> 8) & 0xFF);
    +    b[2] = (byte) ((value >>> 16) & 0x7F);
    --- End diff --
   
    Because we already check the range, the 'sign bit' will not be lost.
    That is, if the value is negative, the 24th bit is 1; if the value is positive, the 24th bit is 0.
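
    A quick standalone illustration of why that range check matters: as soon as the magnitude itself needs the 24th bit, it would collide with the sign flag.

    public class RangeCheckDemo {
      public static void main(String[] args) {
        int inRange = (1 << 23) - 1;   // 8388607: magnitude fits in 23 bits, 24th bit is free for the sign
        int tooLarge = 1 << 23;        // 8388608: the magnitude itself needs the 24th bit
        System.out.println(Integer.toBinaryString(inRange));   // 23 ones
        System.out.println(Integer.toBinaryString(tooLarge));  // a 1 followed by 23 zeros
      }
    }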



[GitHub] carbondata pull request #1065: [CARBONDATA-1196] Add 3 bytes data type suppo...

Github user QiangCai commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1065#discussion_r123173346
 
    --- Diff: core/src/main/java/org/apache/carbondata/core/util/ByteUtil.java ---
    @@ -465,6 +465,39 @@ public static short toShort(byte[] bytes, int offset, final int length) {
       }
     
       /**
    +   * int => byte[3]
     +   * supported range is [-8388607, 8388607], note that Math.pow(2, 23) == 8388608
    +   */
    +  public static byte[] to3Bytes(int val) {
    +    assert val <= (Math.pow(2, 23) - 1) && val >= (-Math.pow(2, 23) + 1);
    +
    +    int value = val < 0 ? -val : val;
    +    byte[] b = new byte[3];
    +    b[0] = (byte) (value & 0xFF);
    +    b[1] = (byte) ((value >>> 8) & 0xFF);
    +    b[2] = (byte) ((value >>> 16) & 0x7F);
    +    if (val < 0) {
    +      b[2] |= 0x80;
    +    }
    +    return b;
    +  }
    +
    +  /**
    +   * convert 3 bytes to int
    +   */
    +  public static int valueOf3Bytes(byte[] val, int offset) {
    +    assert val.length >= offset + 3;
    +    int out = (val[offset] & 0xFF);
    +    out |= ((val[offset + 1] & 0xFF) << 8);
    +    out |= ((val[offset + 2] & 0x7F) << 16);
    +    if ((val[offset + 2] & 0x80) != 0) {
    +      return -out;
    +    } else {
    +      return out;
    +    }
    --- End diff --
   
    if (val[offset] > 0) {
      return ((val[offset]     & 0xff) << 16) |
             ((val[offset + 1] & 0xff) <<  8) |
              (val[offset + 2] & 0xff);
    } else {
      return ((val[offset]     & 0xffff) << 16) |
             ((val[offset + 1] & 0xff)   <<  8) |
              (val[offset + 2] & 0xff);
    }
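
    For reference, a small standalone check of the '& 0xffff' masking in the else branch above (assuming the top byte carries a two's-complement sign, as in this suggestion): the byte is sign-extended when promoted to int, and keeping 16 of those bits before the shift yields a correctly sign-extended 32-bit value.

    public class SignExtensionDemo {
      public static void main(String[] args) {
        byte top = (byte) 0x80;                       // top byte of a negative 24-bit value
        int promoted = top;                           // 0xFFFFFF80 after byte-to-int promotion
        int upperBits = (promoted & 0xFFFF) << 16;    // 0xFF800000: 8 sign bits plus the byte itself
        System.out.printf("0x%08X%n", upperBits);     // prints 0xFF800000
      }
    }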
   




[GitHub] carbondata pull request #1065: [CARBONDATA-1196] Add 3 bytes data type suppo...

Github user QiangCai commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1065#discussion_r123176128
 
    --- Diff: core/src/main/java/org/apache/carbondata/core/datastore/page/encoding/DefaultEncodingStrategy.java ---
    @@ -29,24 +29,31 @@
     
       private static final Compressor compressor = CompressorFactory.getInstance().getCompressor();
     
    +  private static final int THREE_BYTES_MAX = (int) Math.pow(2, 23) - 1;
    +  private static final int THREE_BYTES_MIN = - THREE_BYTES_MAX;
    --- End diff --
   
    THREE_BYTES_MIN = - (THREE_BYTES_MAX + 1)
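
    A short sketch of the two ranges being weighed here (illustrative constants only, not the merged code): the sign-and-magnitude layout used in the diff has a symmetric range, while a plain two's-complement 3-byte value gains one extra negative value, which is where -(THREE_BYTES_MAX + 1) comes from.

    public class ThreeByteRanges {
      // symmetric range of the sign-and-magnitude layout
      static final int SIGN_MAGNITUDE_MAX = (1 << 23) - 1;         //  8388607
      static final int SIGN_MAGNITUDE_MIN = -SIGN_MAGNITUDE_MAX;   // -8388607
      // asymmetric range of a plain two's-complement 24-bit value
      static final int TWOS_COMPLEMENT_MAX = (1 << 23) - 1;        //  8388607
      static final int TWOS_COMPLEMENT_MIN = -(1 << 23);           // -8388608, i.e. -(MAX + 1)

      public static void main(String[] args) {
        System.out.println(SIGN_MAGNITUDE_MIN + " .. " + SIGN_MAGNITUDE_MAX);
        System.out.println(TWOS_COMPLEMENT_MIN + " .. " + TWOS_COMPLEMENT_MAX);
      }
    }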

