[GitHub] carbondata pull request #1332: [WIP]Regenerate hive saved data incase test c...


[GitHub] carbondata pull request #1332: [WIP]Regenerate hive saved data incase test c...

GitHub user sraghunandan opened a pull request:

    https://github.com/apache/carbondata/pull/1332

    [WIP]Regenerate hive saved data incase test case fails

   

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/sraghunandan/carbondata-1 disable_hive_result_caching

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/1332.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #1332
   
----
commit 19950412b431cc96a346a5f02fe65f4dfd66c7c9
Author: sraghunandan <[hidden email]>
Date:   2017-09-06T09:39:44Z

    Regenerate hive saved data incase test case fails
    Reasons:
    1.May be the test case changed
    2.May be the input data changed
    3.May be the environment changed

----
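The caching scheme this commit builds on (save hive query results to disk once via Java serialization, reuse them on later runs, and delete the file to force regeneration) can be sketched roughly as follows. This is a simplified illustration, not the project's actual API: `getOrCompute`, `invalidate`, and the use of `Array[String]` as the row type are stand-ins for the PR's `FileFactory`/`Array[Row]` handling.

```scala
import java.io.{File, FileInputStream, FileOutputStream, ObjectInputStream, ObjectOutputStream}

// Minimal sketch: cache query results on disk with Java serialization,
// mirroring the ObjectInputStream/ObjectOutputStream usage in the diff.
object HiveResultCache {
  // Return cached rows if the file exists; otherwise compute, store, and return them.
  def getOrCompute(path: String, compute: () => Array[String]): Array[String] = {
    val f = new File(path)
    if (f.exists()) {
      val in = new ObjectInputStream(new FileInputStream(f))
      try in.readObject().asInstanceOf[Array[String]] finally in.close()
    } else {
      val rows = compute()
      val out = new ObjectOutputStream(new FileOutputStream(f))
      try out.writeObject(rows) finally out.close()
      rows
    }
  }

  // Drop a stale cache entry so the next run regenerates it.
  def invalidate(path: String): Unit = new File(path).delete()
}
```

Deleting the cached file (as the commit message's three reasons suggest: the test case, the input data, or the environment may have changed) is what makes the next comparison rebuild the expected results from hive.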


---

[GitHub] carbondata issue #1332: [WIP]Regenerate hive saved data incase test case fai...

Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1332
 
    SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/559/



---

[GitHub] carbondata issue #1332: [CARBONDATA-1456]Regenerate cached hive results if c...

Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1332
 
    SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/565/



---

[GitHub] carbondata issue #1332: [CARBONDATA-1456]Regenerate cached hive results if c...

Github user sraghunandan commented on the issue:

    https://github.com/apache/carbondata/pull/1332
 
    ok to test


---

[GitHub] carbondata issue #1332: [CARBONDATA-1456]Regenerate cached hive results if c...

Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1332
 
    Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/3434/



---

[GitHub] carbondata issue #1332: [CARBONDATA-1456]Regenerate cached hive results if c...

Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1332
 
    SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/577/



---

[GitHub] carbondata pull request #1332: [CARBONDATA-1456]Regenerate cached hive resul...

Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1332#discussion_r137439030
 
    --- Diff: integration/spark-common-cluster-test/src/test/scala/org/apache/spark/sql/common/util/QueryTest.scala ---
    @@ -84,22 +82,34 @@ class QueryTest extends PlanTest with Suite {
         checkAnswer(df, expectedAnswer.collect())
       }
     
    -  protected def checkAnswer(carbon: String, hive: String, uniqueIdentifier:String): Unit = {
    -    val path = TestQueryExecutor.hiveresultpath + "/"+uniqueIdentifier
    +  protected def checkAnswer(carbon: String, hive: String, uniqueIdentifier: String): Unit = {
    +    val path = TestQueryExecutor.hiveresultpath + "/" + uniqueIdentifier
         if (FileFactory.isFileExist(path, FileFactory.getFileType(path))) {
    -      val objinp = new ObjectInputStream(FileFactory.getDataInputStream(path, FileFactory.getFileType(path)))
    +      val objinp = new ObjectInputStream(FileFactory
    +        .getDataInputStream(path, FileFactory.getFileType(path)))
           val rows = objinp.readObject().asInstanceOf[Array[Row]]
           objinp.close()
    -      checkAnswer(sql(carbon), rows)
    +      QueryTest.checkAnswer(sql(carbon), rows) match {
    +        case Some(errorMessage) => {
    +          FileFactory.deleteFile(path, FileFactory.getFileType(path))
    +          writeAndCheckAnswer(carbon, hive, path)
    --- End diff --
   
    Doesn't it go into an endless loop when the test fails?


---

[GitHub] carbondata issue #1332: [CARBONDATA-1456]Regenerate cached hive results if c...

Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1332
 
    SDV Build Fail , Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/585/



---

[GitHub] carbondata pull request #1332: [CARBONDATA-1456]Regenerate cached hive resul...

Github user sraghunandan commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1332#discussion_r137459712
 
    --- Diff: integration/spark-common-cluster-test/src/test/scala/org/apache/spark/sql/common/util/QueryTest.scala ---
    @@ -84,22 +82,34 @@ class QueryTest extends PlanTest with Suite {
         checkAnswer(df, expectedAnswer.collect())
       }
     
    -  protected def checkAnswer(carbon: String, hive: String, uniqueIdentifier:String): Unit = {
    -    val path = TestQueryExecutor.hiveresultpath + "/"+uniqueIdentifier
    +  protected def checkAnswer(carbon: String, hive: String, uniqueIdentifier: String): Unit = {
    +    val path = TestQueryExecutor.hiveresultpath + "/" + uniqueIdentifier
         if (FileFactory.isFileExist(path, FileFactory.getFileType(path))) {
    -      val objinp = new ObjectInputStream(FileFactory.getDataInputStream(path, FileFactory.getFileType(path)))
    +      val objinp = new ObjectInputStream(FileFactory
    +        .getDataInputStream(path, FileFactory.getFileType(path)))
           val rows = objinp.readObject().asInstanceOf[Array[Row]]
           objinp.close()
    -      checkAnswer(sql(carbon), rows)
    +      QueryTest.checkAnswer(sql(carbon), rows) match {
    +        case Some(errorMessage) => {
    +          FileFactory.deleteFile(path, FileFactory.getFileType(path))
    +          writeAndCheckAnswer(carbon, hive, path)
    --- End diff --
   
    I couldn't understand your comment. How would it go into an infinite loop?
    We are not using a recursive call.


---

[GitHub] carbondata pull request #1332: [CARBONDATA-1456]Regenerate cached hive resul...

Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1332#discussion_r137919083
 
    --- Diff: integration/spark-common-cluster-test/src/test/scala/org/apache/spark/sql/common/util/QueryTest.scala ---
    @@ -84,22 +82,34 @@ class QueryTest extends PlanTest with Suite {
         checkAnswer(df, expectedAnswer.collect())
       }
     
    -  protected def checkAnswer(carbon: String, hive: String, uniqueIdentifier:String): Unit = {
    -    val path = TestQueryExecutor.hiveresultpath + "/"+uniqueIdentifier
    +  protected def checkAnswer(carbon: String, hive: String, uniqueIdentifier: String): Unit = {
    +    val path = TestQueryExecutor.hiveresultpath + "/" + uniqueIdentifier
         if (FileFactory.isFileExist(path, FileFactory.getFileType(path))) {
    -      val objinp = new ObjectInputStream(FileFactory.getDataInputStream(path, FileFactory.getFileType(path)))
    +      val objinp = new ObjectInputStream(FileFactory
    +        .getDataInputStream(path, FileFactory.getFileType(path)))
           val rows = objinp.readObject().asInstanceOf[Array[Row]]
           objinp.close()
    -      checkAnswer(sql(carbon), rows)
    +      QueryTest.checkAnswer(sql(carbon), rows) match {
    +        case Some(errorMessage) => {
    +          FileFactory.deleteFile(path, FileFactory.getFileType(path))
    +          writeAndCheckAnswer(carbon, hive, path)
    --- End diff --
   
    Got it, my misunderstanding.


---

[GitHub] carbondata issue #1332: [CARBONDATA-1456]Regenerate cached hive results if c...

Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1332
 
    LGTM


---

[GitHub] carbondata pull request #1332: [CARBONDATA-1456]Regenerate cached hive resul...

Github user asfgit closed the pull request at:

    https://github.com/apache/carbondata/pull/1332


---