[GitHub] carbondata pull request #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exce...


[GitHub] carbondata pull request #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exce...

qiuchenjian-2
GitHub user SteNicholas opened a pull request:

    https://github.com/apache/carbondata/pull/2969

    [CARBONDATA-3127]Fix the TestCarbonSerde exception

    [CARBONDATA-3127] The Hive module test case has been commented out and cannot run.
    This pull request fixes the TestCarbonSerde test-case compilation failure under Maven. The deserializeAndSerializeLazySimple method of the TestCarbonSerde class calls a method named serializeStartKey, which does not exist in the CarbonHiveSerDe class. The fix replaces serializeStartKey with serialize and uncomments TestCarbonSerde.
   
     - [x] Any interfaces changed?
            No
     - [x] Any backward compatibility impacted?
           No
     - [x] Document update required?
           No
     - [x] Testing done
           Modify TestCarbonSerde test
     - [x] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.
           No
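The round-trip check at the heart of the test (deserialize a row, re-serialize it, then deep-compare the nested arrays) can be sketched without the Hive/Hadoop dependencies. The class and method names below are hypothetical stand-ins for `ArrayWritable` and the test's `arrayWritableEquals` helper:

```java
// Standalone sketch of TestCarbonSerde's arrayWritableEquals: a recursive,
// element-wise comparison that descends into nested arrays, mirroring how the
// test verifies that serialize(deserialize(row)) reproduces the original row.
public class DeepArrayEquals {
  public static boolean deepEquals(Object[] a1, Object[] a2) {
    if (a1.length != a2.length) {
      return false;
    }
    for (int i = 0; i < a1.length; i++) {
      if (a1[i] instanceof Object[]) {
        // Nested "ArrayWritable": both sides must be arrays and match recursively.
        if (!(a2[i] instanceof Object[])
            || !deepEquals((Object[]) a1[i], (Object[]) a2[i])) {
          return false;
        }
      } else if (!a1[i].equals(a2[i])) {
        return false;
      }
    }
    return true;
  }

  public static void main(String[] args) {
    // Same shape of data the test builds: primitives plus a nested int array.
    Object[] row1 = { (short) 456, 789, 1000L, 5.3, new Object[] { 0, 1, 2 } };
    Object[] row2 = { (short) 456, 789, 1000L, 5.3, new Object[] { 0, 1, 2 } };
    System.out.println(deepEquals(row1, row2)); // prints "true"
  }
}
```

In the real test the same comparison runs over Hadoop `Writable` trees, so the recursion into `ArrayWritable` is what makes the `array<int>` column comparable.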

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/SteNicholas/carbondata CARBONDATA-3127

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/2969.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #2969
   
----
commit b53a09c3e76442ebfb868104478a31cca8b23c52
Author: Nicholas Jiang <programgeek@...>
Date:   2018-11-27T06:51:56Z

    update maven profile

commit 06bce3309ded88a9551371a19b3979c6e9ace66d
Author: Nicholas Jiang <programgeek@...>
Date:   2018-11-27T06:57:59Z

    update maven profile

commit 45b3f9cab524188751bbf2c6ec87dd3db4457990
Author: Nicholas Jiang <programgeek@...>
Date:   2018-12-01T12:07:57Z

    merge master branch

commit c13e8e3309fa1a09e26437085cab33ddb65dfc01
Author: Nicholas Jiang <programgeek@...>
Date:   2018-12-01T12:11:48Z

    merge master branch

commit ceeab809493623f92f0166ae1a32424221ea21e0
Author: Nicholas Jiang <programgeek@...>
Date:   2018-12-02T01:06:21Z

    update TestCarbonSerde test

----


---

[GitHub] carbondata issue #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exception

qiuchenjian-2
Github user SteNicholas commented on the issue:

    https://github.com/apache/carbondata/pull/2969
 
    @xubo245 @zzcclp Please review the small update.


---

[GitHub] carbondata issue #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exception

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/2969
 
    Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1609/



---

[GitHub] carbondata issue #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exception

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/2969
 
    Build Success with Spark 2.3.1, Please check CI http://136.243.101.176:8080/job/carbondataprbuilder2.3/9869/



---

[GitHub] carbondata issue #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exception

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/2969
 
    Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1820/



---

[GitHub] carbondata pull request #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exce...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user xubo245 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2969#discussion_r238085487
 
    --- Diff: integration/hive/src/test/java/org/apache/carbondata/hive/TestCarbonSerde.java ---
    @@ -1,133 +1,133 @@
    -///*
    -// * Licensed to the Apache Software Foundation (ASF) under one or more
    -// * contributor license agreements.  See the NOTICE file distributed with
    -// * this work for additional information regarding copyright ownership.
    -// * The ASF licenses this file to You under the Apache License, Version 2.0
    -// * (the "License"); you may not use this file except in compliance with
    -// * the License.  You may obtain a copy of the License at
    -// *
    -// *    http://www.apache.org/licenses/LICENSE-2.0
    -// *
    -// * Unless required by applicable law or agreed to in writing, software
    -// * distributed under the License is distributed on an "AS IS" BASIS,
    -// * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    -// * See the License for the specific language governing permissions and
    -// * limitations under the License.
    -// */
    -//package org.apache.carbondata.hive;
    -//
    -//import junit.framework.TestCase;
    -//import org.apache.hadoop.conf.Configuration;
    -//import org.apache.hadoop.hive.common.type.HiveDecimal;
    -//import org.apache.hadoop.hive.serde2.SerDeException;
    -//import org.apache.hadoop.hive.serde2.SerDeUtils;
    -//import org.apache.hadoop.hive.serde2.io.DoubleWritable;
    -//import org.apache.hadoop.hive.serde2.io.HiveDecimalWritable;
    -//import org.apache.hadoop.hive.serde2.io.ShortWritable;
    -//import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
    -//import org.apache.hadoop.io.*;
    -//import org.junit.Test;
    -//
    -//import java.util.Properties;
    -//
    -//public class TestCarbonSerde extends TestCase {
    -//  @Test
    -//  public void testCarbonHiveSerDe() throws Throwable {
    -//    try {
    -//      // Create the SerDe
    -//      System.out.println("test: testCarbonHiveSerDe");
    -//
    -//      final CarbonHiveSerDe serDe = new CarbonHiveSerDe();
    -//      final Configuration conf = new Configuration();
    -//      final Properties tbl = createProperties();
    -//      SerDeUtils.initializeSerDe(serDe, conf, tbl, null);
    -//
    -//      // Data
    -//      final Writable[] arr = new Writable[7];
    -//
    -//      //primitive types
    -//      arr[0] = new ShortWritable((short) 456);
    -//      arr[1] = new IntWritable(789);
    -//      arr[2] = new LongWritable(1000l);
    -//      arr[3] = new DoubleWritable((double) 5.3);
    -//      arr[4] = new HiveDecimalWritable(HiveDecimal.create(1));
    -//      arr[5] = new Text("carbonSerde binary".getBytes("UTF-8"));
    -//
    -//      final Writable[] arrayContainer = new Writable[1];
    -//      final Writable[] array = new Writable[5];
    -//      for (int i = 0; i < 5; ++i) {
    -//        array[i] = new IntWritable(i);
    -//      }
    -//      arrayContainer[0] = new ArrayWritable(Writable.class, array);
    -//      arr[6] = new ArrayWritable(Writable.class, arrayContainer);
    -//
    -//      final ArrayWritable arrWritable = new ArrayWritable(Writable.class, arr);
    -//      // Test
    -//      deserializeAndSerializeLazySimple(serDe, arrWritable);
    -//      System.out.println("test: testCarbonHiveSerDe - OK");
    -//
    -//    } catch (final Throwable e) {
    -//      e.printStackTrace();
    -//      throw e;
    -//    }
    -//  }
    -//
    -//  private void deserializeAndSerializeLazySimple(final CarbonHiveSerDe serDe,
    -//      final ArrayWritable t) throws SerDeException {
    -//
    -//    // Get the row structure
    -//    final StructObjectInspector oi = (StructObjectInspector) serDe.getObjectInspector();
    -//
    -//    // Deserialize
    -//    final Object row = serDe.deserialize(t);
    -//    assertEquals("deserialization gives the wrong object class", row.getClass(),
    -//        ArrayWritable.class);
    -//    assertEquals("size correct after deserialization",
    -//        serDe.getSerDeStats().getRawDataSize(), t.get().length);
    -//    assertEquals("deserialization gives the wrong object", t, row);
    -//
    -//    // Serialize
    -//    final ArrayWritable serializedArr = (ArrayWritable) serDe.serializeStartKey(row, oi);
    -//    assertEquals("size correct after serialization", serDe.getSerDeStats().getRawDataSize(),
    -//        serializedArr.get().length);
    -//    assertTrue("serialized object should be equal to starting object",
    -//        arrayWritableEquals(t, serializedArr));
    -//  }
    -//
    -//  private Properties createProperties() {
    -//    final Properties tbl = new Properties();
    -//
    -//    // Set the configuration parameters
    -//    tbl.setProperty("columns", "ashort,aint,along,adouble,adecimal,astring,alist");
    -//    tbl.setProperty("columns.types",
    -//        "smallint:int:bigint:double:decimal:string:array<int>");
    -//    tbl.setProperty(org.apache.hadoop.hive.serde.serdeConstants.SERIALIZATION_NULL_FORMAT, "NULL");
    -//    return tbl;
    -//  }
    -//
    -//  public static boolean arrayWritableEquals(final ArrayWritable a1, final ArrayWritable a2) {
    -//    final Writable[] a1Arr = a1.get();
    -//    final Writable[] a2Arr = a2.get();
    -//
    -//    if (a1Arr.length != a2Arr.length) {
    -//      return false;
    -//    }
    -//
    -//    for (int i = 0; i < a1Arr.length; ++i) {
    -//      if (a1Arr[i] instanceof ArrayWritable) {
    -//        if (!(a2Arr[i] instanceof ArrayWritable)) {
    -//          return false;
    -//        }
    -//        if (!arrayWritableEquals((ArrayWritable) a1Arr[i], (ArrayWritable) a2Arr[i])) {
    -//          return false;
    -//        }
    -//      } else {
    -//        if (!a1Arr[i].equals(a2Arr[i])) {
    -//          return false;
    -//        }
    -//      }
    -//
    -//    }
    -//    return true;
    -//  }
    -//}
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.hive;
    +
    +import junit.framework.TestCase;
    +import org.apache.hadoop.conf.Configuration;
    +import org.apache.hadoop.hive.common.type.HiveDecimal;
    +import org.apache.hadoop.hive.serde2.SerDeException;
    +import org.apache.hadoop.hive.serde2.SerDeUtils;
    +import org.apache.hadoop.hive.serde2.io.DoubleWritable;
    +import org.apache.hadoop.hive.serde2.io.HiveDecimalWritable;
    +import org.apache.hadoop.hive.serde2.io.ShortWritable;
    +import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
    +import org.apache.hadoop.io.*;
    +import org.junit.Test;
    +
    +import java.util.Properties;
    +
    +public class TestCarbonSerde extends TestCase {
    +  @Test
    +  public void testCarbonHiveSerDe() throws Throwable {
    +    try {
    +      // Create the SerDe
    +      System.out.println("test: testCarbonHiveSerDe");
    +
    +      final CarbonHiveSerDe serDe = new CarbonHiveSerDe();
    +      final Configuration conf = new Configuration();
    +      final Properties tbl = createProperties();
    +      SerDeUtils.initializeSerDe(serDe, conf, tbl, null);
    +
    +      // Data
    +      final Writable[] arr = new Writable[7];
    +
    +      //primitive types
    +      arr[0] = new ShortWritable((short) 456);
    +      arr[1] = new IntWritable(789);
    +      arr[2] = new LongWritable(1000l);
    +      arr[3] = new DoubleWritable((double) 5.3);
    +      arr[4] = new HiveDecimalWritable(HiveDecimal.create(1));
    +      arr[5] = new Text("carbonSerde binary".getBytes("UTF-8"));
    +
    +      final Writable[] arrayContainer = new Writable[1];
    +      final Writable[] array = new Writable[5];
    +      for (int i = 0; i < 5; ++i) {
    +        array[i] = new IntWritable(i);
    +      }
    +      arrayContainer[0] = new ArrayWritable(Writable.class, array);
    +      arr[6] = new ArrayWritable(Writable.class, arrayContainer);
    +
    +      final ArrayWritable arrWritable = new ArrayWritable(Writable.class, arr);
    +      // Test
    +      deserializeAndSerializeLazySimple(serDe, arrWritable);
    +      System.out.println("test: testCarbonHiveSerDe - OK");
    +
    +    } catch (final Throwable e) {
    +      e.printStackTrace();
    +      throw e;
    +    }
    +  }
    +
    +  private void deserializeAndSerializeLazySimple(final CarbonHiveSerDe serDe,
    +      final ArrayWritable t) throws SerDeException {
    +
    +    // Get the row structure
    +    final StructObjectInspector oi = (StructObjectInspector) serDe.getObjectInspector();
    +
    +    // Deserialize
    +    final Object row = serDe.deserialize(t);
    +    assertEquals("deserialization gives the wrong object class", row.getClass(),
    +        ArrayWritable.class);
    +    assertEquals("size correct after deserialization",
    +        serDe.getSerDeStats().getRawDataSize(), t.get().length);
    +    assertEquals("deserialization gives the wrong object", t, row);
    +
    +    // Serialize
    +    final ArrayWritable serializedArr = (ArrayWritable) serDe.serialize(row, oi);
    +    assertEquals("size correct after serialization", serDe.getSerDeStats().getRawDataSize(),
    +        serializedArr.get().length);
    +    assertTrue("serialized object should be equal to starting object",
    +        arrayWritableEquals(t, serializedArr));
    +  }
    +
    +  private Properties createProperties() {
    +    final Properties tbl = new Properties();
    +
    +    // Set the configuration parameters
    +    tbl.setProperty("columns", "ashort,aint,along,adouble,adecimal,astring,alist");
    --- End diff --
   
     Please take care with spelling and letter case; for example: aShort


---

[GitHub] carbondata pull request #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exce...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user xubo245 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2969#discussion_r238085516
 
    --- Diff: integration/hive/src/test/java/org/apache/carbondata/hive/TestCarbonSerde.java ---
    +
    +public class TestCarbonSerde extends TestCase {
    --- End diff --
   
     Please take care with spelling and letter case; for example: TestCarbonSerDe.
    
     Please check the rest of the code as well.


---

[GitHub] carbondata pull request #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exce...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user xubo245 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2969#discussion_r238085518
 
    --- Diff: integration/hive/src/test/java/org/apache/carbondata/hive/TestCarbonSerde.java ---
    +      arr[0] = new ShortWritable((short) 456);
    +      arr[1] = new IntWritable(789);
    +      arr[2] = new LongWritable(1000l);
    +      arr[3] = new DoubleWritable((double) 5.3);
    --- End diff --
   
     No need for the (double) cast; please remove it.
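The reviewer's point can be verified in isolation: `5.3` is already a `double` literal, so casting it to `double` is a no-op and the cast can simply be dropped:

```java
public class RedundantCastDemo {
  public static void main(String[] args) {
    double withCast = (double) 5.3;  // redundant: 5.3 is already a double literal
    double withoutCast = 5.3;
    System.out.println(withCast == withoutCast); // prints "true"
  }
}
```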


---

[GitHub] carbondata issue #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exception

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user SteNicholas commented on the issue:

    https://github.com/apache/carbondata/pull/2969
 
    @zzcclp Please review this update.


---

[GitHub] carbondata issue #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exception

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/2969
 
    Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1610/



---

[GitHub] carbondata issue #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exception

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/2969
 
    Build Success with Spark 2.3.1, Please check CI http://136.243.101.176:8080/job/carbondataprbuilder2.3/9870/



---

[GitHub] carbondata issue #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exception

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/2969
 
    Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1821/



---

[GitHub] carbondata pull request #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exce...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user SteNicholas commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2969#discussion_r238089109
 
    --- Diff: integration/hive/src/test/java/org/apache/carbondata/hive/TestCarbonSerde.java ---
    @@ -1,133 +1,133 @@
    -///*
    -// * Licensed to the Apache Software Foundation (ASF) under one or more
    -// * contributor license agreements.  See the NOTICE file distributed with
    -// * this work for additional information regarding copyright ownership.
    -// * The ASF licenses this file to You under the Apache License, Version 2.0
    -// * (the "License"); you may not use this file except in compliance with
    -// * the License.  You may obtain a copy of the License at
    -// *
    -// *    http://www.apache.org/licenses/LICENSE-2.0
    -// *
    -// * Unless required by applicable law or agreed to in writing, software
    -// * distributed under the License is distributed on an "AS IS" BASIS,
    -// * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    -// * See the License for the specific language governing permissions and
    -// * limitations under the License.
    -// */
    -//package org.apache.carbondata.hive;
    -//
    -//import junit.framework.TestCase;
    -//import org.apache.hadoop.conf.Configuration;
    -//import org.apache.hadoop.hive.common.type.HiveDecimal;
    -//import org.apache.hadoop.hive.serde2.SerDeException;
    -//import org.apache.hadoop.hive.serde2.SerDeUtils;
    -//import org.apache.hadoop.hive.serde2.io.DoubleWritable;
    -//import org.apache.hadoop.hive.serde2.io.HiveDecimalWritable;
    -//import org.apache.hadoop.hive.serde2.io.ShortWritable;
    -//import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
    -//import org.apache.hadoop.io.*;
    -//import org.junit.Test;
    -//
    -//import java.util.Properties;
    -//
    -//public class TestCarbonSerde extends TestCase {
    -//  @Test
    -//  public void testCarbonHiveSerDe() throws Throwable {
    -//    try {
    -//      // Create the SerDe
    -//      System.out.println("test: testCarbonHiveSerDe");
    -//
    -//      final CarbonHiveSerDe serDe = new CarbonHiveSerDe();
    -//      final Configuration conf = new Configuration();
    -//      final Properties tbl = createProperties();
    -//      SerDeUtils.initializeSerDe(serDe, conf, tbl, null);
    -//
    -//      // Data
    -//      final Writable[] arr = new Writable[7];
    -//
    -//      //primitive types
    -//      arr[0] = new ShortWritable((short) 456);
    -//      arr[1] = new IntWritable(789);
    -//      arr[2] = new LongWritable(1000l);
    -//      arr[3] = new DoubleWritable((double) 5.3);
    -//      arr[4] = new HiveDecimalWritable(HiveDecimal.create(1));
    -//      arr[5] = new Text("carbonSerde binary".getBytes("UTF-8"));
    -//
    -//      final Writable[] arrayContainer = new Writable[1];
    -//      final Writable[] array = new Writable[5];
    -//      for (int i = 0; i < 5; ++i) {
    -//        array[i] = new IntWritable(i);
    -//      }
    -//      arrayContainer[0] = new ArrayWritable(Writable.class, array);
    -//      arr[6] = new ArrayWritable(Writable.class, arrayContainer);
    -//
    -//      final ArrayWritable arrWritable = new ArrayWritable(Writable.class, arr);
    -//      // Test
    -//      deserializeAndSerializeLazySimple(serDe, arrWritable);
    -//      System.out.println("test: testCarbonHiveSerDe - OK");
    -//
    -//    } catch (final Throwable e) {
    -//      e.printStackTrace();
    -//      throw e;
    -//    }
    -//  }
    -//
    -//  private void deserializeAndSerializeLazySimple(final CarbonHiveSerDe serDe,
    -//      final ArrayWritable t) throws SerDeException {
    -//
    -//    // Get the row structure
    -//    final StructObjectInspector oi = (StructObjectInspector) serDe.getObjectInspector();
    -//
    -//    // Deserialize
    -//    final Object row = serDe.deserialize(t);
    -//    assertEquals("deserialization gives the wrong object class", row.getClass(),
    -//        ArrayWritable.class);
    -//    assertEquals("size correct after deserialization",
    -//        serDe.getSerDeStats().getRawDataSize(), t.get().length);
    -//    assertEquals("deserialization gives the wrong object", t, row);
    -//
    -//    // Serialize
    -//    final ArrayWritable serializedArr = (ArrayWritable) serDe.serializeStartKey(row, oi);
    -//    assertEquals("size correct after serialization", serDe.getSerDeStats().getRawDataSize(),
    -//        serializedArr.get().length);
    -//    assertTrue("serialized object should be equal to starting object",
    -//        arrayWritableEquals(t, serializedArr));
    -//  }
    -//
    -//  private Properties createProperties() {
    -//    final Properties tbl = new Properties();
    -//
    -//    // Set the configuration parameters
    -//    tbl.setProperty("columns", "ashort,aint,along,adouble,adecimal,astring,alist");
    -//    tbl.setProperty("columns.types",
    -//        "smallint:int:bigint:double:decimal:string:array<int>");
    -//    tbl.setProperty(org.apache.hadoop.hive.serde.serdeConstants.SERIALIZATION_NULL_FORMAT, "NULL");
    -//    return tbl;
    -//  }
    -//
    -//  public static boolean arrayWritableEquals(final ArrayWritable a1, final ArrayWritable a2) {
    -//    final Writable[] a1Arr = a1.get();
    -//    final Writable[] a2Arr = a2.get();
    -//
    -//    if (a1Arr.length != a2Arr.length) {
    -//      return false;
    -//    }
    -//
    -//    for (int i = 0; i < a1Arr.length; ++i) {
    -//      if (a1Arr[i] instanceof ArrayWritable) {
    -//        if (!(a2Arr[i] instanceof ArrayWritable)) {
    -//          return false;
    -//        }
    -//        if (!arrayWritableEquals((ArrayWritable) a1Arr[i], (ArrayWritable) a2Arr[i])) {
    -//          return false;
    -//        }
    -//      } else {
    -//        if (!a1Arr[i].equals(a2Arr[i])) {
    -//          return false;
    -//        }
    -//      }
    -//
    -//    }
    -//    return true;
    -//  }
    -//}
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.hive;
    +
    +import junit.framework.TestCase;
    +import org.apache.hadoop.conf.Configuration;
    +import org.apache.hadoop.hive.common.type.HiveDecimal;
    +import org.apache.hadoop.hive.serde2.SerDeException;
    +import org.apache.hadoop.hive.serde2.SerDeUtils;
    +import org.apache.hadoop.hive.serde2.io.DoubleWritable;
    +import org.apache.hadoop.hive.serde2.io.HiveDecimalWritable;
    +import org.apache.hadoop.hive.serde2.io.ShortWritable;
    +import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
    +import org.apache.hadoop.io.*;
    +import org.junit.Test;
    +
    +import java.util.Properties;
    +
    +public class TestCarbonSerde extends TestCase {
    +  @Test
    +  public void testCarbonHiveSerDe() throws Throwable {
    +    try {
    +      // Create the SerDe
    +      System.out.println("test: testCarbonHiveSerDe");
    +
    +      final CarbonHiveSerDe serDe = new CarbonHiveSerDe();
    +      final Configuration conf = new Configuration();
    +      final Properties tbl = createProperties();
    +      SerDeUtils.initializeSerDe(serDe, conf, tbl, null);
    +
    +      // Data
    +      final Writable[] arr = new Writable[7];
    +
    +      //primitive types
    +      arr[0] = new ShortWritable((short) 456);
    +      arr[1] = new IntWritable(789);
    +      arr[2] = new LongWritable(1000l);
    +      arr[3] = new DoubleWritable((double) 5.3);
    --- End diff --
   
    @xubo245 I have already updated this cast.
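
    The `arrayWritableEquals` helper in the quoted test performs a recursive, element-wise comparison of nested arrays. The same pattern can be sketched without any Hadoop dependency using plain `Object` arrays (`deepEquals` here is a hypothetical stand-in, not part of the PR):

    ```java
    public class DeepEqualsSketch {
        // Mirrors arrayWritableEquals: recurse into nested arrays, compare leaves with equals()
        static boolean deepEquals(Object[] a1, Object[] a2) {
            if (a1.length != a2.length) {
                return false;
            }
            for (int i = 0; i < a1.length; i++) {
                if (a1[i] instanceof Object[]) {
                    if (!(a2[i] instanceof Object[])) {
                        return false;
                    }
                    if (!deepEquals((Object[]) a1[i], (Object[]) a2[i])) {
                        return false;
                    }
                } else if (!a1[i].equals(a2[i])) {
                    return false;
                }
            }
            return true;
        }

        public static void main(String[] args) {
            Object[] nested1 = {1, new Object[]{2, 3}};
            Object[] nested2 = {1, new Object[]{2, 3}};
            System.out.println(deepEquals(nested1, nested2)); // prints "true"
        }
    }
    ```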


---

[GitHub] carbondata pull request #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exce...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user SteNicholas commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2969#discussion_r238089137
 
    --- Diff: integration/hive/src/test/java/org/apache/carbondata/hive/TestCarbonSerde.java ---
    @@ -1,133 +1,133 @@
    +import org.junit.Test;
    +
    +import java.util.Properties;
    +
    +public class TestCarbonSerde extends TestCase {
    --- End diff --
   
    @xubo245 I have already fixed the spelling format you referred to.


---

[GitHub] carbondata pull request #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exce...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user SteNicholas commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2969#discussion_r238089149
 
    --- Diff: integration/hive/src/test/java/org/apache/carbondata/hive/TestCarbonSerde.java ---
    @@ -1,133 +1,133 @@
    +      arr[4] = new HiveDecimalWritable(HiveDecimal.create(1));
    +      arr[5] = new Text("carbonSerde binary".getBytes("UTF-8"));
    +
    +      final Writable[] arrayContainer = new Writable[1];
    +      final Writable[] array = new Writable[5];
    +      for (int i = 0; i < 5; ++i) {
    +        array[i] = new IntWritable(i);
    +      }
    +      arrayContainer[0] = new ArrayWritable(Writable.class, array);
    +      arr[6] = new ArrayWritable(Writable.class, arrayContainer);
    +
    +      final ArrayWritable arrWritable = new ArrayWritable(Writable.class, arr);
    +      // Test
    +      deserializeAndSerializeLazySimple(serDe, arrWritable);
    +      System.out.println("test: testCarbonHiveSerDe - OK");
    +
    +    } catch (final Throwable e) {
    +      e.printStackTrace();
    +      throw e;
    +    }
    +  }
    +
    +  private void deserializeAndSerializeLazySimple(final CarbonHiveSerDe serDe,
    +      final ArrayWritable t) throws SerDeException {
    +
    +    // Get the row structure
    +    final StructObjectInspector oi = (StructObjectInspector) serDe.getObjectInspector();
    +
    +    // Deserialize
    +    final Object row = serDe.deserialize(t);
    +    assertEquals("deserialization gives the wrong object class", row.getClass(),
    +        ArrayWritable.class);
    +    assertEquals("size correct after deserialization",
    +        serDe.getSerDeStats().getRawDataSize(), t.get().length);
    +    assertEquals("deserialization gives the wrong object", t, row);
    +
    +    // Serialize
    +    final ArrayWritable serializedArr = (ArrayWritable) serDe.serialize(row, oi);
    +    assertEquals("size correct after serialization", serDe.getSerDeStats().getRawDataSize(),
    +        serializedArr.get().length);
    +    assertTrue("serialized object should be equal to starting object",
    +        arrayWritableEquals(t, serializedArr));
    +  }
    +
    +  private Properties createProperties() {
    +    final Properties tbl = new Properties();
    +
    +    // Set the configuration parameters
    +    tbl.setProperty("columns", "ashort,aint,along,adouble,adecimal,astring,alist");
    --- End diff --
   
     @xubo245 I have already fixed the spelling format you referred to.
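
    The substantive fix in this diff is calling the standard `serialize(Object, ObjectInspector)` instead of the non-existent `serializeStartKey`. The round-trip shape the test verifies — deserialize, re-serialize, compare against the original — is a general pattern; here is a hedged, Hive-free sketch of it using `java.io` serialization (names and the chosen serialization mechanism are illustrative, not from the PR):

    ```java
    import java.io.*;

    public class RoundTripSketch {
        static byte[] serialize(Object o) throws IOException {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
                oos.writeObject(o);
            }
            return bos.toByteArray();
        }

        static Object deserialize(byte[] bytes) throws IOException, ClassNotFoundException {
            try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
                return ois.readObject();
            }
        }

        public static void main(String[] args) throws Exception {
            int[] original = {456, 789, 1000};
            // Round trip: serialize, deserialize, and the result must equal the start
            int[] restored = (int[]) deserialize(serialize(original));
            System.out.println(java.util.Arrays.equals(original, restored)); // prints "true"
        }
    }
    ```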


---

[GitHub] carbondata issue #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exception

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user zzcclp commented on the issue:

    https://github.com/apache/carbondata/pull/2969
 
    LGTM


---

[GitHub] carbondata issue #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exception

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user xubo245 commented on the issue:

    https://github.com/apache/carbondata/pull/2969
 
    LGTM


---

[GitHub] carbondata pull request #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exce...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user xubo245 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2969#discussion_r238154723
 
    --- Diff: integration/hive/src/test/java/org/apache/carbondata/hive/TestCarbonSerDe.java ---
    @@ -0,0 +1,133 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +package org.apache.carbondata.hive;
    +
    +import junit.framework.TestCase;
    +import org.apache.hadoop.conf.Configuration;
    +import org.apache.hadoop.hive.common.type.HiveDecimal;
    +import org.apache.hadoop.hive.serde2.SerDeException;
    +import org.apache.hadoop.hive.serde2.SerDeUtils;
    +import org.apache.hadoop.hive.serde2.io.DoubleWritable;
    +import org.apache.hadoop.hive.serde2.io.HiveDecimalWritable;
    +import org.apache.hadoop.hive.serde2.io.ShortWritable;
    +import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
    +import org.apache.hadoop.io.*;
    +import org.junit.Test;
    +
    +import java.util.Properties;
    +
    +public class TestCarbonSerDe extends TestCase {
    +    @Test
    +    public void testCarbonHiveSerDe() throws Throwable {
    +        try {
    +            // Create the SerDe
    +            System.out.println("test: testCarbonHiveSerDe");
    +
    +            final CarbonHiveSerDe serDe = new CarbonHiveSerDe();
    +            final Configuration conf = new Configuration();
    +            final Properties tbl = createProperties();
    +            SerDeUtils.initializeSerDe(serDe, conf, tbl, null);
    +
    +            // Data
    +            final Writable[] arr = new Writable[7];
    +
    +            //primitive types
    +            arr[0] = new ShortWritable((short) 456);
    +            arr[1] = new IntWritable(789);
    +            arr[2] = new LongWritable(1000L);
    +            arr[3] = new DoubleWritable(5.3);
    +            arr[4] = new HiveDecimalWritable(HiveDecimal.create(1));
    +            arr[5] = new Text("carbonSerde binary".getBytes("UTF-8"));
    --- End diff --
   
     Please take care of the spelling and case sensitivity, for example: CarbonSerDe
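
    For readers unfamiliar with the Writable values the test builds (ShortWritable, IntWritable, and so on), the idea behind them is a symmetric write/readFields round trip. The sketch below is not the Hadoop classes themselves (the real SerDe test in this PR needs Hadoop and Hive on the classpath); it is a stdlib-only illustration of that contract, with the names `WritableRoundTrip` and `roundTripShort` invented for the example.

```java
import java.io.*;

// A minimal stand-in for Hadoop's Writable contract (write/readFields),
// illustrating the round trip that values like ShortWritable go through.
public class WritableRoundTrip {

    interface Writable {
        void write(DataOutput out) throws IOException;
        void readFields(DataInput in) throws IOException;
    }

    // Simplified analogue of org.apache.hadoop.io.ShortWritable.
    static class ShortWritable implements Writable {
        short value;
        ShortWritable(short v) { value = v; }
        public void write(DataOutput out) throws IOException { out.writeShort(value); }
        public void readFields(DataInput in) throws IOException { value = in.readShort(); }
    }

    // Serialize a short through the Writable contract and read it back.
    static short roundTripShort(short v) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        new ShortWritable(v).write(new DataOutputStream(bos));
        ShortWritable copy = new ShortWritable((short) 0);
        copy.readFields(new DataInputStream(new ByteArrayInputStream(bos.toByteArray())));
        return copy.value;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(roundTripShort((short) 456)); // prints 456
    }
}
```

    A serialize/deserialize test like the one in the diff follows the same shape, except the bytes flow through the SerDe and an ObjectInspector instead of plain streams.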


---

[GitHub] carbondata pull request #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exce...

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user SteNicholas commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/2969#discussion_r238936639
 
    --- Diff: integration/hive/src/test/java/org/apache/carbondata/hive/TestCarbonSerDe.java ---
    @@ -0,0 +1,133 @@
    +            arr[5] = new Text("carbonSerde binary".getBytes("UTF-8"));
    --- End diff --
   
     @xubo245 Sorry, I forgot about the string spelling and case format; please review it. Sorry for the inconvenience.


---

[GitHub] carbondata issue #2969: [CARBONDATA-3127]Fix the TestCarbonSerde exception

qiuchenjian-2
In reply to this post by qiuchenjian-2
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/2969
 
    Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1640/



---