Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2738
Build Failed with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/510/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2738
Build Failed with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/690/
Github user xubo245 commented on the issue:
https://github.com/apache/carbondata/pull/2738
retest this please
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2738
Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/702/
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2738
Build Success with Spark 2.3.1, Please check CI http://136.243.101.176:8080/job/carbondataprbuilder2.3/8769/
Github user KanakaKumar commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2738#discussion_r220558655

--- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonReaderBuilder.java ---
@@ -101,6 +101,20 @@ public CarbonReaderBuilder withHadoopConf(Configuration conf) {
     return this;
   }
+
+  public CarbonReaderBuilder withHadoopConf(String[] args) {
--- End diff --

I think this API is not clear. Instead, we can support a method like withHadoopConf(String key, String value); then we don't have to parse anything in the SDK Java layer, and the C/JNI layer can simply call it for every property the user wants to overwrite.
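A minimal sketch of how the suggested key/value variant might look from the caller's side (the builder chain and S3 property keys here are illustrative assumptions, not the final API):

    CarbonReader reader = CarbonReader
        .builder("s3a://bucket/path", "_temp")
        .withHadoopConf("fs.s3a.access.key", "myAccessKey")
        .withHadoopConf("fs.s3a.secret.key", "mySecretKey")
        .build();

Each call sets exactly one Hadoop property, so the C/JNI layer never has to marshal or parse a combined argument array.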
Github user KanakaKumar commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2738#discussion_r220559298

--- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonReaderBuilder.java ---
@@ -101,6 +101,20 @@ public CarbonReaderBuilder withHadoopConf(Configuration conf) {
     return this;
   }
+
+  public CarbonReaderBuilder withHadoopConf(String[] args) {
+    Configuration configuration = new Configuration();
--- End diff --

We should not create a new Configuration every time. It should be created once, on the first call, and then updated with the new value on every further call to this method.
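A minimal sketch of the create-once pattern being asked for, assuming a class-level Configuration field inside CarbonReaderBuilder (field and method names are illustrative):

    import org.apache.hadoop.conf.Configuration;

    // Inside CarbonReaderBuilder: one Configuration per builder instance.
    private Configuration hadoopConf;

    public CarbonReaderBuilder withHadoopConf(String key, String value) {
      if (hadoopConf == null) {
        hadoopConf = new Configuration();  // created once, on the first call
      }
      hadoopConf.set(key, value);          // later calls only update properties
      return this;
    }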
Github user KanakaKumar commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2738#discussion_r220560348

--- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonReader.java ---
@@ -90,6 +91,33 @@ public T readNextRow() throws IOException, InterruptedException {
     return currentReader.getCurrentValue();
   }
+
+  /**
+   * Read and return next string row object
+   */
+  public Object[] readNextStringRow() throws IOException, InterruptedException {
--- End diff --

Converting every Object to String is not recommended: the textual representation of values such as double and date may change. Please use an adapter in the JNI layer that maps each Java data type to the corresponding C data type.
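To make the adapter idea concrete, a hypothetical Java-side sketch that branches on each column's runtime type so the JNI layer can pick a matching typed accessor instead of a pre-stringified value (the helper name and mapping are assumptions):

    // Hypothetical helper: report which C type a column value should map to.
    static String cTypeFor(Object value) {
      if (value instanceof Integer) return "int";        // jint in JNI
      if (value instanceof Long)    return "long long";  // jlong in JNI
      if (value instanceof Double)  return "double";     // jdouble in JNI
      if (value instanceof String)  return "char *";     // jstring via GetStringUTFChars
      return "unsupported";  // date/decimal types would need their own adapters
    }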
Github user KanakaKumar commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2738#discussion_r220562265

--- Diff: store/sdk/pom.xml ---
@@ -39,6 +39,16 @@
       <artifactId>hadoop-aws</artifactId>
       <version>${hadoop.version}</version>
     </dependency>
+    <dependency>
+      <groupId>org.apache.httpcomponents</groupId>
+      <artifactId>httpclient</artifactId>
--- End diff --

The httpclient version this depends on may differ across Hadoop versions, so it should be managed through the carbondata parent POM. Can you verify and avoid declaring it here?
Github user KanakaKumar commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2738#discussion_r220565596

--- Diff: store/sdk/pom.xml ---
@@ -39,6 +39,16 @@
       <artifactId>hadoop-aws</artifactId>
       <version>${hadoop.version}</version>
     </dependency>
+    <dependency>
+      <groupId>org.apache.httpcomponents</groupId>
+      <artifactId>httpclient</artifactId>
+      <version>4.2</version>
+    </dependency>
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-common</artifactId>
--- End diff --

carbondata-hadoop already depends on Hadoop. Why add it explicitly here?
Github user KanakaKumar commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2738#discussion_r220566235

--- Diff: store/CSDK/CarbonReader.cpp ---
@@ -0,0 +1,97 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+#include "CarbonReader.h"
+#include <jni.h>
+
+jobject CarbonReader::builder(JNIEnv *env, char *path, char *tableName) {
+
+    jniEnv = env;
+    jclass carbonReaderClass = env->FindClass("org/apache/carbondata/sdk/file/CarbonReader");
+    jmethodID carbonReaderBuilderID = env->GetStaticMethodID(carbonReaderClass, "builder",
+        "(Ljava/lang/String;Ljava/lang/String;)Lorg/apache/carbondata/sdk/file/CarbonReaderBuilder;");
+    jstring jpath = env->NewStringUTF(path);
+    jstring jtableName = env->NewStringUTF(tableName);
+    jvalue args[2];
+    args[0].l = jpath;
+    args[1].l = jtableName;
+    carbonReaderBuilderObject = env->CallStaticObjectMethodA(carbonReaderClass, carbonReaderBuilderID, args);
+    return carbonReaderBuilderObject;
+}
+
+jobject CarbonReader::projection(int argc, char *argv[]) {
+    jclass carbonReaderBuilderClass = jniEnv->GetObjectClass(carbonReaderBuilderObject);
+    jmethodID buildID = jniEnv->GetMethodID(carbonReaderBuilderClass, "projection",
+        "([Ljava/lang/String;)Lorg/apache/carbondata/sdk/file/CarbonReaderBuilder;");
+    jclass objectArrayClass = jniEnv->FindClass("Ljava/lang/String;");
+    jobjectArray array = jniEnv->NewObjectArray(argc, objectArrayClass, NULL);
+    for (int i = 0; i < argc; ++i) {
+        jstring value = jniEnv->NewStringUTF(argv[i]);
+        jniEnv->SetObjectArrayElement(array, i, value);
+    }
+
+    jvalue args[1];
+    args[0].l = array;
+    carbonReaderBuilderObject = jniEnv->CallObjectMethodA(carbonReaderBuilderObject, buildID, args);
+    return carbonReaderBuilderObject;
+}
+
+jobject CarbonReader::withHadoopConf(int argc, char *argv[]) {
--- End diff --

As suggested in another comment, let's add a simple API that accepts a key and a value.
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2738
Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/524/
Github user xubo245 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2738#discussion_r220773227

--- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonReaderBuilder.java ---
@@ -101,6 +101,20 @@ public CarbonReaderBuilder withHadoopConf(Configuration conf) {
     return this;
   }
+
+  public CarbonReaderBuilder withHadoopConf(String[] args) {
+    Configuration configuration = new Configuration();
--- End diff --

This is meant to replace withHadoopConf(Configuration conf) on the Java side; the old configuration is replaced by the new one.
Github user xubo245 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2738#discussion_r220774262

--- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonReaderBuilder.java ---
@@ -101,6 +101,20 @@ public CarbonReaderBuilder withHadoopConf(Configuration conf) {
     return this;
   }
+
+  public CarbonReaderBuilder withHadoopConf(String[] args) {
--- End diff --

Yeah, but the Java layer doesn't support withHadoopConf(String key, String value) yet; I think we should support it in both the Java and C layers. Also, config(key: String, value: String) is a better name than withHadoopConf(String key, String value).
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2738
Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/786/
Github user xubo245 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2738#discussion_r220784150

--- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonReader.java ---
@@ -90,6 +91,33 @@ public T readNextRow() throws IOException, InterruptedException {
     return currentReader.getCurrentValue();
   }
+
+  /**
+   * Read and return next string row object
+   */
+  public Object[] readNextStringRow() throws IOException, InterruptedException {
--- End diff --

I don't know how the C layer can read int, double, and date values out of a mixed Object[] that contains string, int, and date; for now it can only read the data as strings.
Github user xubo245 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2738#discussion_r220784376

--- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonReader.java ---
@@ -90,6 +91,33 @@ public T readNextRow() throws IOException, InterruptedException {
     return currentReader.getCurrentValue();
   }
+
+  /**
+   * Read and return next string row object
+   */
+  public Object[] readNextStringRow() throws IOException, InterruptedException {
--- End diff --

I convert double and date to string first and then return a String[], so the values themselves don't change.
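As a quick illustration of the representation concern raised above, Java's default double-to-string conversion can change the textual form even though the numeric value is intact:

    double d = 0.0000001;
    System.out.println(String.valueOf(d));  // prints "1.0E-7", not "0.0000001"

A C client that re-parses or compares such strings would have to handle the scientific-notation form, which is why typed accessors are the safer route.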
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2738
Build Success with Spark 2.3.1, Please check CI http://136.243.101.176:8080/job/carbondataprbuilder2.3/8853/
Github user KanakaKumar commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2738#discussion_r220802212

--- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonReaderBuilder.java ---
@@ -101,6 +101,20 @@ public CarbonReaderBuilder withHadoopConf(Configuration conf) {
     return this;
   }
+
+  public CarbonReaderBuilder withHadoopConf(String[] args) {
--- End diff --

Yes, we can add withHadoopConf(String key, String value) to the Java API as well.
Github user KanakaKumar commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2738#discussion_r220802829

--- Diff: store/sdk/src/main/java/org/apache/carbondata/sdk/file/CarbonReaderBuilder.java ---
@@ -101,6 +101,20 @@ public CarbonReaderBuilder withHadoopConf(Configuration conf) {
     return this;
   }
+
+  public CarbonReaderBuilder withHadoopConf(String[] args) {
+    Configuration configuration = new Configuration();
--- End diff --

My comment stands: we should not replace the entire conf object every time. Change this method to withHadoopConf(String key, String value) and set the property on a class-level hadoopConfiguration variable.