[GitHub] carbondata pull request #1255: [CARBONDATA-1375] clean hive pom


[GitHub] carbondata pull request #1255: [CARBONDATA-1375] clean hive pom

GitHub user cenyuhai opened a pull request:

    https://github.com/apache/carbondata/pull/1255

    [CARBONDATA-1375] clean hive pom

    clean hive pom

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/cenyuhai/incubator-carbondata CARBONDATA-1375

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/1255.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #1255
   
----
commit ec46a98d75263240e9ed3060c8dd3148bd33f7f4
Author: CenYuhai <[hidden email]>
Date:   2017-08-13T12:21:34Z

    clean hive pom

----



[GitHub] carbondata issue #1255: [CARBONDATA-1375] clean hive pom

Github user cenyuhai commented on the issue:

    https://github.com/apache/carbondata/pull/1255
 
    retest please



[GitHub] carbondata issue #1255: [CARBONDATA-1375] clean hive pom

Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1255
 
    SDV Build Success with Spark 2.1, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/193/




[GitHub] carbondata issue #1255: [CARBONDATA-1375] clean hive pom

Github user chenliang613 commented on the issue:

    https://github.com/apache/carbondata/pull/1255
 
    retest this please



[GitHub] carbondata issue #1255: [CARBONDATA-1375] clean hive pom

Github user chenliang613 commented on the issue:

    https://github.com/apache/carbondata/pull/1255
 
    @cenyuhai can you explain this PR: which parts did you remove, which parts did you add, and why?
    From a quick review it looks like some pom entries only changed position, with very few actual changes.



[GitHub] carbondata issue #1255: [CARBONDATA-1375] clean hive pom

Github user cenyuhai commented on the issue:

    https://github.com/apache/carbondata/pull/1255
 
    1. Use ${hadoop.version} instead of the hardcoded 2.6.0 (sketched below)
    2. Use ${hive.version} instead of the hardcoded 1.2.1 (sketched below)
    3. Remove the thrift (libfb303) dependency
    4. Remove the zookeeper dependency
    5. Remove the spark-hive and spark-sql dependencies
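
    A minimal sketch of items 1 and 2 as they appear in integration/hive/pom.xml after this change, assuming the hadoop.version and hive.version properties are defined in the parent carbondata pom (not shown here):

        <!-- sketch only: ${hadoop.version} and ${hive.version} are assumed to
             be inherited from the parent pom rather than hardcoded here -->
        <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-client</artifactId>
          <version>${hadoop.version}</version>  <!-- previously 2.6.0 -->
        </dependency>
        <dependency>
          <groupId>org.apache.hive</groupId>
          <artifactId>hive-jdbc</artifactId>
          <version>${hive.version}</version>    <!-- previously 1.2.1 -->
        </dependency>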



[GitHub] carbondata issue #1255: [CARBONDATA-1375] clean hive pom

Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1255
 
    SDV Build Success with Spark 2.1, Please check CI http://144.76.159.231:8080/job/ApacheSDVTests/240/




[GitHub] carbondata pull request #1255: [CARBONDATA-1375] clean hive pom

Github user anubhav100 commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1255#discussion_r134084229
 
    --- Diff: integration/hive/pom.xml ---
    @@ -36,122 +36,96 @@
     
         <dependencies>
             <dependency>
    -            <groupId>org.apache.hive</groupId>
    -            <artifactId>hive-exec</artifactId>
    -            <version>${hive.version}</version>
    -            <exclusions>
    -                <exclusion>
    -                    <groupId>org.apache.hive</groupId>
    -                    <artifactId>hive-metastore</artifactId>
    -                </exclusion>
    -                <exclusion>
    -                    <groupId>org.apache.hive</groupId>
    -                    <artifactId>hive-shims</artifactId>
    -                </exclusion>
    -                <exclusion>
    -                    <groupId>org.apache.hive</groupId>
    -                    <artifactId>hive-ant</artifactId>
    -                </exclusion>
    -                <exclusion>
    -                    <groupId>org.apache.hive</groupId>
    -                    <artifactId>spark-client</artifactId>
    -                </exclusion>
    -                <exclusion>
    -                    <artifactId>kryo</artifactId>
    -                    <groupId>com.esotericsoftware.kryo</groupId>
    -                </exclusion>
    -            </exclusions>
    -            <scope>compile</scope>
    +          <groupId>org.apache.carbondata</groupId>
    +          <artifactId>carbondata-hadoop</artifactId>
    +          <version>${project.version}</version>
    +          <exclusions>
    +            <exclusion>
    +              <groupId>org.apache.spark</groupId>
    +              <artifactId>spark-sql_2.10</artifactId>
    +            </exclusion>
    +          </exclusions>
             </dependency>
             <dependency>
    -            <groupId>org.apache.zookeeper</groupId>
    -            <artifactId>zookeeper</artifactId>
    -            <version>3.4.7</version>
    -            <exclusions>
    -                <exclusion>
    -                    <groupId>jline</groupId>
    -                    <artifactId>jline</artifactId>
    -                </exclusion>
    -            </exclusions>
    +          <groupId>org.apache.carbondata</groupId>
    +          <artifactId>carbondata-spark2</artifactId>
    +          <version>${project.version}</version>
    +          <exclusions>
    +            <exclusion>
    +              <groupId>org.apache.spark</groupId>
    +              <artifactId>spark-hive-thriftserver_2.10</artifactId>
    +            </exclusion>
    +            <exclusion>
    +              <groupId>org.apache.spark</groupId>
    +              <artifactId>spark-repl_2.10</artifactId>
    +            </exclusion>
    +            <exclusion>
    +              <groupId>org.apache.spark</groupId>
    +              <artifactId>spark-sql_2.10</artifactId>
    +            </exclusion>
    +          </exclusions>
             </dependency>
             <dependency>
    -            <groupId>org.apache.carbondata</groupId>
    -            <artifactId>carbondata-spark2</artifactId>
    -            <version>${project.version}</version>
    -            <exclusions>
    -                <exclusion>
    -                    <groupId>org.apache.spark</groupId>
    -                    <artifactId>spark-hive-thriftserver_2.10</artifactId>
    -                </exclusion>
    -                <exclusion>
    -                    <groupId>org.apache.spark</groupId>
    -                    <artifactId>spark-repl_2.10</artifactId>
    -                </exclusion>
    -                <exclusion>
    -                    <groupId>org.apache.spark</groupId>
    -                    <artifactId>spark-sql_2.10</artifactId>
    -                </exclusion>
    -            </exclusions>
    +          <groupId>org.apache.hive</groupId>
    +          <artifactId>hive-exec</artifactId>
    +          <version>${hive.version}</version>
    +          <exclusions>
    +            <exclusion>
    +              <groupId>org.apache.hive</groupId>
    +              <artifactId>hive-metastore</artifactId>
    +            </exclusion>
    +            <exclusion>
    +              <groupId>org.apache.hive</groupId>
    +              <artifactId>hive-shims</artifactId>
    +            </exclusion>
    +            <exclusion>
    +              <groupId>org.apache.hive</groupId>
    +              <artifactId>hive-ant</artifactId>
    +            </exclusion>
    +            <exclusion>
    +              <groupId>org.apache.hive</groupId>
    +              <artifactId>spark-client</artifactId>
    +            </exclusion>
    +            <exclusion>
    +              <artifactId>kryo</artifactId>
    +              <groupId>com.esotericsoftware.kryo</groupId>
    +            </exclusion>
    +          </exclusions>
    +          <scope>compile</scope>
             </dependency>
             <dependency>
    -            <groupId>org.apache.hive</groupId>
    -            <artifactId>hive-service</artifactId>
    -            <version>1.2.1</version>
    +          <groupId>org.apache.hive</groupId>
    +          <artifactId>hive-service</artifactId>
    +          <version>${hive.version}</version>
             </dependency>
             <dependency>
    -            <groupId>org.apache.hive</groupId>
    -            <artifactId>hive-jdbc</artifactId>
    -            <version>1.2.1</version>
    -        </dependency>
    -        <dependency>
    -            <groupId>org.apache.spark</groupId>
    -            <artifactId>spark-hive_${scala.binary.version}</artifactId>
    -            <version>${spark.version}</version>
    -        </dependency>
    -        <dependency>
    -            <groupId>org.apache.spark</groupId>
    -            <artifactId>spark-sql_${scala.binary.version}</artifactId>
    -            <version>${spark.version}</version>
    +          <groupId>org.apache.hive</groupId>
    +          <artifactId>hive-jdbc</artifactId>
    +          <version>${hive.version}</version>
             </dependency>
             <dependency>
               <groupId>org.apache.spark</groupId>
               <artifactId>spark-catalyst_${scala.binary.version}</artifactId>
               <version>${spark.version}</version>
             </dependency>
             <dependency>
    -            <groupId>org.apache.httpcomponents</groupId>
    -            <artifactId>httpclient</artifactId>
    -            <version>4.3.4</version>
    -        </dependency>
    -        <dependency>
    -            <groupId>org.apache.httpcomponents</groupId>
    -            <artifactId>httpcore</artifactId>
    -            <version>4.3-alpha1</version>
    -        </dependency>
    -        <dependency>
    -            <groupId>org.apache.hadoop</groupId>
    -            <artifactId>hadoop-client</artifactId>
    -            <version>2.6.0</version>
    +          <groupId>org.apache.httpcomponents</groupId>
    +          <artifactId>httpclient</artifactId>
    +          <version>4.3.4</version>
             </dependency>
             <dependency>
    -            <groupId>org.apache.thrift</groupId>
    -            <artifactId>libfb303</artifactId>
    -            <version>0.9.3</version>
    +          <groupId>org.apache.httpcomponents</groupId>
    +          <artifactId>httpcore</artifactId>
    +          <version>4.3-alpha1</version>
             </dependency>
             <dependency>
    -            <groupId>org.apache.carbondata</groupId>
    -            <artifactId>carbondata-hadoop</artifactId>
    -            <version>${project.version}</version>
    -            <exclusions>
    -              <exclusion>
    -                <groupId>org.apache.spark</groupId>
    -                <artifactId>spark-sql_2.10</artifactId>
    -              </exclusion>
    -            </exclusions>
    +          <groupId>org.apache.hadoop</groupId>
    +          <artifactId>hadoop-client</artifactId>
    +          <version>${hadoop.version}</version>
             </dependency>
             <dependency>
    -            <groupId>junit</groupId>
    -            <artifactId>junit</artifactId>
    +          <groupId>junit</groupId>
    +          <artifactId>junit</artifactId>
    --- End diff --
   
    @cenyuhai you can remove junit as well, since there are no test cases.



[GitHub] carbondata pull request #1255: [CARBONDATA-1375] clean hive pom

Github user cenyuhai commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1255#discussion_r134194865
 
    --- Diff: integration/hive/pom.xml ---
    (same integration/hive/pom.xml hunk as quoted in the previous review comment)
    --- End diff --
   
    ok



[GitHub] carbondata issue #1255: [CARBONDATA-1375] clean hive pom

Github user cenyuhai commented on the issue:

    https://github.com/apache/carbondata/pull/1255
 
    retest please



[GitHub] carbondata issue #1255: [CARBONDATA-1375] clean hive pom

Github user chenliang613 commented on the issue:

    https://github.com/apache/carbondata/pull/1255
 
    LGTM



[GitHub] carbondata pull request #1255: [CARBONDATA-1375] clean hive pom

Github user asfgit closed the pull request at:

    https://github.com/apache/carbondata/pull/1255

