Hi dev,
Hi dev,
Does anyone know why the decimal type in the compaction flow is processed as below in CarbonFactDataHandlerColumnar? I can't understand it from the comments.

// convert measure columns
for (int i = 0; i < type.length; i++) {
  Object value = rows[i];

  // in compaction flow the measure with decimal type will come as spark decimal.
  // need to convert it to byte array.
  if (type[i] == DataType.DECIMAL && compactionFlow) {
    BigDecimal bigDecimal = ((Decimal) rows[i]).toJavaBigDecimal();
    value = DataTypeUtil.bigDecimalToByte(bigDecimal);
  }
  measurePage[i].putData(rowId, value);
}

Thanks!
Lionel
Hi,

It is because the compaction flow uses the query flow: it queries the data from the segments that need to be compacted and sends it for merge sort. So the writer step receives Spark row data, and that is why it sees a Spark Decimal during compaction.

Regards,
Ravindra.

On 20 June 2017 at 16:06, Lu Cao <[hidden email]> wrote:
> Hi dev,
> Does anyone know why the decimal type in the compaction flow is processed
> as below in CarbonFactDataHandlerColumnar?

--
Thanks & Regards,
Ravi
Got it, thank you Ravi!
On 6/20/17, 11:11 PM, "Ravindra Pesala" <[hidden email]> wrote:
> it is because compaction flow uses query flow, It queries the data from the
> segments which needs to be compacted and sends for merge sort.