> Remove unused code for spark 2.0 datatype utils
> -----------------------------------------------
>
> Key: CARBONDATA-591
>                 URL: https://issues.apache.org/jira/browse/CARBONDATA-591
>             Project: CarbonData
> Issue Type: Improvement
> Reporter: sandeep purohit
> Priority: Trivial
> Attachments: screenshot-1.png
>
> Time Spent: 0.5h
> Remaining Estimate: 0h
>
> Remove unused data type utility code for Spark 2.0. Looking at the snippet below from DataTypeConverterUtil.scala and debugging, the Spark 2.x data type conversion code there is never used.
> {code:title=DataTypeConverterUtil.scala|theme=FadeToGrey|linenumbers=true|language=scala|firstline=0001|collapse=true}
>   def convertToCarbonTypeForSpark2(dataType: String): DataType = {
>     dataType.toLowerCase match {
>       case "stringtype" => DataType.STRING
>       case "inttype" => DataType.INT
>       case "integertype" => DataType.INT
>       case "tinyinttype" => DataType.SHORT
>       case "shorttype" => DataType.SHORT
>       case "longtype" => DataType.LONG
>       case "biginttype" => DataType.LONG
>       case "numerictype" => DataType.DOUBLE
>       case "doubletype" => DataType.DOUBLE
>       case "decimaltype" => DataType.DECIMAL
>       case "timestamptype" => DataType.TIMESTAMP
>       case "datetype" => DataType.DATE
>       case "arraytype" => DataType.ARRAY
>       case "structtype" => DataType.STRUCT
>       case _ => sys.error(s"Unsupported data type: $dataType")
>     }
>   }
> {code}
> In Spark 2.x, types such as StringType and IntType exist as API objects (org.apache.spark.sql.types), not as names written in the query itself, so these match branches are never reached.
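To illustrate why the branches are dead, here is a minimal, self-contained sketch (the object and value names below are hypothetical, not CarbonData code). Spark 2.x type objects report lowercase names such as "string" and "integer" via DataType.typeName, which can never equal the "stringtype"-style patterns matched above:

```scala
// Hypothetical sketch -- not CarbonData's actual code.
object UnusedBranchCheck {
  // Type names as Spark 2.x reports them via DataType.typeName
  // (e.g. StringType.typeName == "string", IntegerType.typeName == "integer").
  val spark2TypeNames: Seq[String] =
    Seq("string", "integer", "long", "double", "date", "timestamp")

  // The patterns matched by convertToCarbonTypeForSpark2 above.
  val deadPatterns: Set[String] =
    Set("stringtype", "inttype", "integertype", "longtype",
      "doubletype", "datetype", "timestamptype")

  def main(args: Array[String]): Unit = {
    // No Spark 2.x typeName ever hits one of the dead patterns.
    val hits = spark2TypeNames.filter(n => deadPatterns.contains(n.toLowerCase))
    assert(hits.isEmpty)
    println(s"matches against the unused branches: ${hits.size}")
  }
}
```

Any caller that already holds a Spark 2.x DataType object has no reason to render it as "StringType" and re-parse it, which is why the conversion function can be removed.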