Question

lionel061201
Hi dev,
Does anyone know why the decimal type in the compaction flow is processed as below
in CarbonFactDataHandlerColumnar?
I can't understand it from the comments.

// convert measure columns
for (int i = 0; i < type.length; i++) {
  Object value = rows[i];

  // in compaction flow the measure with decimal type will come as
  // spark decimal.
  // need to convert it to byte array.
  if (type[i] == DataType.DECIMAL && compactionFlow) {
    BigDecimal bigDecimal = ((Decimal) rows[i]).toJavaBigDecimal();
    value = DataTypeUtil.bigDecimalToByte(bigDecimal);
  }
  measurePage[i].putData(rowId, value);
}
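
To make sure I'm reading it right, here is a minimal standalone sketch of what that branch seems to do for a single value. This is only my own illustration, not CarbonData code; I'm only guessing at the byte layout by using the unscaled value, and the real DataTypeUtil.bigDecimalToByte may encode it differently (for example it may also store the scale):

import java.math.BigDecimal;
import java.math.BigInteger;

// Standalone illustration only -- not CarbonData code. It mimics the DECIMAL
// branch above for one measure value, using just the unscaled value as the
// byte encoding.
public class DecimalMeasureSketch {

  // In the real compaction path the BigDecimal comes from
  // ((Decimal) rows[i]).toJavaBigDecimal(); this sketch starts from the
  // resulting java.math.BigDecimal so it does not need Spark on the classpath.
  static byte[] toBytes(BigDecimal value) {
    return value.unscaledValue().toByteArray();
  }

  static BigDecimal fromBytes(byte[] bytes, int scale) {
    return new BigDecimal(new BigInteger(bytes), scale);
  }

  public static void main(String[] args) {
    BigDecimal original = new BigDecimal("1234.56");
    byte[] encoded = toBytes(original);
    BigDecimal decoded = fromBytes(encoded, original.scale());
    System.out.println(original + " -> " + decoded); // prints 1234.56 -> 1234.56
  }
}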


Thanks!
Lionel

Re: Question

ravipesala
Hi,

It is because the compaction flow uses the query flow: it queries the data from the
segments that need to be compacted and sends it for merge sort. So the writer
step gets Spark row data, which is why it sees a Spark Decimal during compaction.
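
Roughly, the writer step can see the same measure in two shapes depending on which flow produced the row. A small sketch to illustrate the point (my illustration only, not actual CarbonData code; the assumption that the normal load flow already hands over a java.math.BigDecimal is just for this example):

import java.math.BigDecimal;
import org.apache.spark.sql.types.Decimal;

// Hypothetical helper, only to illustrate why the compactionFlow guard exists.
final class MeasureValueSketch {

  static BigDecimal asJavaBigDecimal(Object measureValue, boolean compactionFlow) {
    if (compactionFlow) {
      // row came from the query flow, so the decimal is wrapped in Spark's Decimal
      return ((Decimal) measureValue).toJavaBigDecimal();
    }
    // normal load flow: assumed here to already carry a java.math.BigDecimal
    return (BigDecimal) measureValue;
  }
}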

Regards,
Ravindra.

--
Thanks & Regards,
Ravi

Re: Question

Cao Lu 曹鲁
Got it, thank you Ravi!



