[jira] [Updated] (CARBONDATA-591) Remove unused code for spark 2.0 datatype utils


Akash R Nilugal (Jira)

     [ https://issues.apache.org/jira/browse/CARBONDATA-591?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

sandeep purohit updated CARBONDATA-591:
---------------------------------------
    Description:
Remove unused data type utility code for Spark 2.0. While debugging, I found the following Spark 2.x data type conversion code in DataTypeConverterUtil.scala.

{code:title=DataTypeConverterUtil.scala|language=scala}
  def convertToCarbonTypeForSpark2(dataType: String): DataType = {
    dataType.toLowerCase match {
      case "stringtype" => DataType.STRING
      case "inttype" => DataType.INT
      case "integertype" => DataType.INT
      case "tinyinttype" => DataType.SHORT
      case "shorttype" => DataType.SHORT
      case "longtype" => DataType.LONG
      case "biginttype" => DataType.LONG
      case "numerictype" => DataType.DOUBLE
      case "doubletype" => DataType.DOUBLE
      case "decimaltype" => DataType.DECIMAL
      case "timestamptype" => DataType.TIMESTAMP
      case "datetype" => DataType.DATE
      case "arraytype" => DataType.ARRAY
      case "structtype" => DataType.STRUCT
      case _ => sys.error(s"Unsupported data type: $dataType")
    }
  }
{code}

In Spark 2.x, types such as StringType and IntegerType exist as API classes, not as literals in the query itself, so this conversion code is unused.
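For context, Spark 2.x's {{DataType.simpleString}} yields SQL-style names such as "string", "int", and "bigint" rather than class-name strings like "stringtype", which is why the cases above are never matched. A minimal sketch of matching on the simple names instead; {{CarbonType}} is a hypothetical stand-in for CarbonData's DataType enum, which is not shown in this issue:

```scala
// Hypothetical stand-in for CarbonData's DataType enum (illustration only).
object CarbonType extends Enumeration {
  val STRING, INT, SHORT, LONG, DOUBLE, DECIMAL, TIMESTAMP, DATE = Value
}

// Sketch: convert Spark 2.x simple type names (DataType.simpleString values,
// e.g. IntegerType.simpleString == "int") to Carbon types.
def convertSimpleName(name: String): CarbonType.Value =
  name.toLowerCase match {
    case "string"               => CarbonType.STRING
    case "int" | "integer"      => CarbonType.INT
    case "tinyint" | "smallint" => CarbonType.SHORT
    case "bigint" | "long"      => CarbonType.LONG
    case "double"               => CarbonType.DOUBLE
    // DecimalType.simpleString includes precision/scale, e.g. "decimal(10,2)".
    case d if d.startsWith("decimal") => CarbonType.DECIMAL
    case "timestamp"            => CarbonType.TIMESTAMP
    case "date"                 => CarbonType.DATE
    case other => sys.error(s"Unsupported data type: $other")
  }

println(convertSimpleName("bigint")) // LONG
```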

  was:
Remove unused data type utility code for Spark 2.0. While debugging, I found the following Spark 2.x data type conversion code in DataTypeConverterUtil.scala.

  def convertToCarbonTypeForSpark2(dataType: String): DataType = {
    dataType.toLowerCase match {
      case "stringtype" => DataType.STRING
      case "inttype" => DataType.INT
      case "integertype" => DataType.INT
      case "tinyinttype" => DataType.SHORT
      case "shorttype" => DataType.SHORT
      case "longtype" => DataType.LONG
      case "biginttype" => DataType.LONG
      case "numerictype" => DataType.DOUBLE
      case "doubletype" => DataType.DOUBLE
      case "decimaltype" => DataType.DECIMAL
      case "timestamptype" => DataType.TIMESTAMP
      case "datetype" => DataType.DATE
      case "arraytype" => DataType.ARRAY
      case "structtype" => DataType.STRUCT
      case _ => sys.error(s"Unsupported data type: $dataType")
    }
}

In Spark 2.x, types such as StringType and IntegerType exist as API classes, not as literals in the query itself, so this conversion code is unused.


> Remove unused code for spark 2.0 datatype utils
> -----------------------------------------------
>
>                 Key: CARBONDATA-591
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-591
>             Project: CarbonData
>          Issue Type: Improvement
>            Reporter: sandeep purohit
>            Priority: Trivial
>         Attachments: screenshot-1.png
>
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)