GitHub user lamber-ken opened a pull request:
https://github.com/apache/carbondata/pull/3030 [HOTFIX] Optimize the code style in csdk/sdk markdown doc Be sure to do all of the following checklist to help us incorporate your contribution quickly and easily: - [x] Any interfaces changed? NO - [x] Any backward compatibility impacted? NO - [x] Document update required? YES - [x] Testing done YES Please provide details on - Whether new unit test cases have been added or why no new tests are required? - How it is tested? Please attach test report. - Is it a performance related change? Please attach the performance test report. - Any additional information to help reviewers in testing this change. - [x] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA. OK You can merge this pull request into a Git repository by running: $ git pull https://github.com/BigDataArtisans/carbondata fix-code-style Alternatively you can review and apply these changes as the patch at: https://github.com/apache/carbondata/pull/3030.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #3030 ---- commit e136d7a8433b99027b3e4338ae5ef8d5156788a1 Author: lamber-ken <2217232293@...> Date: 2018-12-27T13:52:21Z [HOTFIX] Optimize the code style in csdk/sdk markdown doc ---- --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3030 Build Failed with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/2253/ --- |
In reply to this post by qiuchenjian-2
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3030 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/2046/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3030 Build Success with Spark 2.3.2, Please check CI http://136.243.101.176:8080/job/carbondataprbuilder2.3/10298/ --- |
Github user xubo245 commented on the issue:
https://github.com/apache/carbondata/pull/3030 retest this please --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3030 Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/2265/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3030 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/2061/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3030 Build Success with Spark 2.3.2, Please check CI http://136.243.101.176:8080/job/carbondataprbuilder2.3/10314/ --- |
Github user xubo245 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/3030#discussion_r244438958 --- Diff: docs/csdk-guide.md --- @@ -172,361 +174,360 @@ release the memory and destroy JVM. ## API List ### CarbonWriter ``` - /** - * create a CarbonWriterBuilder object for building carbonWriter, - * CarbonWriterBuilder object can configure different parameter - * - * @param env JNIEnv - * @return CarbonWriterBuilder object - */ - void builder(JNIEnv *env); -``` - -``` - /** - * Sets the output path of the writer builder - * - * @param path is the absolute path where output files are written - * This method must be called when building CarbonWriterBuilder - * @return updated CarbonWriterBuilder - */ - void outputPath(char *path); -``` - -``` - /** - * sets the list of columns that needs to be in sorted order - * - * @param argc argc argument counter, the number of projection column - * @param argv argv is a string array of columns that needs to be sorted. - * If it is null or by default all dimensions are selected for sorting - * If it is empty array, no columns are sorted - */ - void sortBy(int argc, char *argv[]); -``` - -``` - /** - * configure the schema with json style schema - * - * @param jsonSchema json style schema - * @return updated CarbonWriterBuilder - */ - void withCsvInput(char *jsonSchema); -``` - -``` - /** - * Updates the hadoop configuration with the given key value - * - * @param key key word - * @param value value - * @return CarbonWriterBuilder object - */ - void withHadoopConf(char *key, char *value); -``` - -``` - /** - * To support the table properties for writer - * - * @param key properties key - * @param value properties value - */ - void withTableProperty(char *key, char *value); -``` - -``` - /** - * To support the load options for C++ sdk writer - * - * @param options key,value pair of load options. - * supported keys values are - * a. bad_records_logger_enable -- true (write into separate logs), false - * b. 
bad_records_action -- FAIL, FORCE, IGNORE, REDIRECT - * c. bad_record_path -- path - * d. dateformat -- same as JAVA SimpleDateFormat - * e. timestampformat -- same as JAVA SimpleDateFormat - * f. complex_delimiter_level_1 -- value to Split the complexTypeData - * g. complex_delimiter_level_2 -- value to Split the nested complexTypeData - * h. quotechar - * i. escapechar - * - * Default values are as follows. - * - * a. bad_records_logger_enable -- "false" - * b. bad_records_action -- "FAIL" - * c. bad_record_path -- "" - * d. dateformat -- "" , uses from carbon.properties file - * e. timestampformat -- "", uses from carbon.properties file - * f. complex_delimiter_level_1 -- "$" - * g. complex_delimiter_level_2 -- ":" - * h. quotechar -- "\"" - * i. escapechar -- "\\" - * - * @return updated CarbonWriterBuilder - */ - void withLoadOption(char *key, char *value); +/** + * create a CarbonWriterBuilder object for building carbonWriter, + * CarbonWriterBuilder object can configure different parameter + * + * @param env JNIEnv + * @return CarbonWriterBuilder object + */ +void builder(JNIEnv *env); +``` + +``` +/** + * Sets the output path of the writer builder + * + * @param path is the absolute path where output files are written + * This method must be called when building CarbonWriterBuilder + * @return updated CarbonWriterBuilder + */ +void outputPath(char *path); +``` + +``` +/** + * sets the list of columns that needs to be in sorted order + * + * @param argc argc argument counter, the number of projection column + * @param argv argv is a string array of columns that needs to be sorted. 
+ * If it is null or by default all dimensions are selected for sorting + * If it is empty array, no columns are sorted + */ +void sortBy(int argc, char *argv[]); +``` + +``` +/** + * configure the schema with json style schema + * + * @param jsonSchema json style schema + * @return updated CarbonWriterBuilder + */ +void withCsvInput(char *jsonSchema); +``` + +``` +/** +* Updates the hadoop configuration with the given key value --- End diff -- Please optimize it. --- |
Github user xubo245 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/3030#discussion_r244439663 --- Diff: docs/csdk-guide.md --- @@ -43,114 +43,116 @@ C++ SDK support read batch row. User can set batch by using withBatch(int batch) ## API List ### CarbonReader ``` - /** - * create a CarbonReaderBuilder object for building carbonReader, - * CarbonReaderBuilder object can configure different parameter - * - * @param env JNIEnv - * @param path data store path - * @param tableName table name - * @return CarbonReaderBuilder object - */ - jobject builder(JNIEnv *env, char *path, char *tableName); +/** + * create a CarbonReaderBuilder object for building carbonReader, + * CarbonReaderBuilder object can configure different parameter + * + * @param env JNIEnv + * @param path data store path + * @param tableName table name + * @return CarbonReaderBuilder object + */ +jobject builder(JNIEnv *env, char *path, char *tableName); ``` ``` - /** - * create a CarbonReaderBuilder object for building carbonReader, - * CarbonReaderBuilder object can configure different parameter - * - * @param env JNIEnv - * @param path data store path - * */ - void builder(JNIEnv *env, char *path); +/** + * create a CarbonReaderBuilder object for building carbonReader, + * CarbonReaderBuilder object can configure different parameter + * + * @param env JNIEnv + * @param path data store path + * + */ +void builder(JNIEnv *env, char *path); ``` ``` - /** - * Configure the projection column names of carbon reader - * - * @param argc argument counter - * @param argv argument vector - * @return CarbonReaderBuilder object - */ - jobject projection(int argc, char *argv[]); +/** + * Configure the projection column names of carbon reader + * + * @param argc argument counter + * @param argv argument vector + * @return CarbonReaderBuilder object + */ +jobject projection(int argc, char *argv[]); ``` ``` - /** - * build carbon reader with argument vector - * it support multiple parameter - * like: key=value - * for example: 
fs.s3a.access.key=XXXX, XXXX is user's access key value - * - * @param argc argument counter - * @param argv argument vector - * @return CarbonReaderBuilder object - **/ - jobject withHadoopConf(int argc, char *argv[]); +/** + * build carbon reader with argument vector + * it support multiple parameter + * like: key=value + * for example: fs.s3a.access.key=XXXX, XXXX is user's access key value + * + * @param argc argument counter + * @param argv argument vector + * @return CarbonReaderBuilder object + * + */ +jobject withHadoopConf(int argc, char *argv[]); ``` ``` - /** - * Sets the batch size of records to read - * - * @param batch batch size - * @return CarbonReaderBuilder object - */ - void withBatch(int batch); +/** + * Sets the batch size of records to read + * + * @param batch batch size + * @return CarbonReaderBuilder object + */ +void withBatch(int batch); ``` ``` - /** - * Configure Row Record Reader for reading. - */ - void withRowRecordReader(); +/** + * Configure Row Record Reader for reading. + */ +void withRowRecordReader(); ``` ``` - /** - * build carbonReader object for reading data - * it support read data from load disk - * - * @return carbonReader object - */ - jobject build(); +/** + * build carbonReader object for reading data + * it support read data from load disk + * + * @return carbonReader object + */ +jobject build(); ``` ``` - /** - * Whether it has next row data - * - * @return boolean value, if it has next row, return true. if it hasn't next row, return false. - */ - jboolean hasNext(); +/** + * Whether it has next row data + * + * @return boolean value, if it has next row, return true. if it hasn't next row, return false. 
+ */ +jboolean hasNext(); ``` ``` - /** - * read next carbonRow from data - * @return carbonRow object of one row - */ - jobject readNextRow(); +/** + * read next carbonRow from data + * @return carbonRow object of one row + */ +jobject readNextRow(); ``` ``` - /** - * read Next Batch Row - * - * @return rows - */ - jobjectArray readNextBatchRow(); +/** + * read Next Batch Row --- End diff -- Please optimize this one. --- |
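[Editor's note] To make the CarbonReader diff above easier to follow, here is how the quoted builder methods chain together. This is a hedged sketch based only on the signatures shown in the diff: the `JNIEnv` handle must come from a JVM embedded via the JNI invocation API, the header name and the data path, table name, and column names are illustrative assumptions, and the `CarbonRow` extraction step is documented elsewhere in csdk-guide.md.

```cpp
// Sketch only: requires the CarbonData C++ SDK and an embedded JVM; not standalone-compilable.
#include <jni.h>
#include "CarbonReader.h"   // assumed SDK header name

void readExample(JNIEnv *env) {
    CarbonReader reader;

    // Configure the builder: data store path, table name, projection, batch size.
    reader.builder(env, "/path/to/carbon/data", "test_table");
    char *columns[] = {"name", "age"};
    reader.projection(2, columns);
    reader.withBatch(1000);

    // Build the reader, then iterate row by row until no rows remain.
    reader.build();
    while (reader.hasNext()) {
        jobject row = reader.readNextRow();
        // ... extract fields from `row` via the SDK's CarbonRow helpers ...
    }
}
```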
Github user xubo245 commented on the issue:
https://github.com/apache/carbondata/pull/3030 add to whitelist --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3030 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/2073/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3030 Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/2278/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3030 Build Success with Spark 2.3.2, Please check CI http://136.243.101.176:8080/job/carbondataprbuilder2.3/10327/ --- |
Github user lamber-ken commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/3030#discussion_r244449499 --- Diff: docs/csdk-guide.md --- @@ -43,114 +43,116 @@ C++ SDK support read batch row. User can set batch by using withBatch(int batch) ## API List ### CarbonReader ``` - /** - * create a CarbonReaderBuilder object for building carbonReader, - * CarbonReaderBuilder object can configure different parameter - * - * @param env JNIEnv - * @param path data store path - * @param tableName table name - * @return CarbonReaderBuilder object - */ - jobject builder(JNIEnv *env, char *path, char *tableName); +/** + * create a CarbonReaderBuilder object for building carbonReader, + * CarbonReaderBuilder object can configure different parameter + * + * @param env JNIEnv + * @param path data store path + * @param tableName table name + * @return CarbonReaderBuilder object + */ +jobject builder(JNIEnv *env, char *path, char *tableName); ``` ``` - /** - * create a CarbonReaderBuilder object for building carbonReader, - * CarbonReaderBuilder object can configure different parameter - * - * @param env JNIEnv - * @param path data store path - * */ - void builder(JNIEnv *env, char *path); +/** + * create a CarbonReaderBuilder object for building carbonReader, + * CarbonReaderBuilder object can configure different parameter + * + * @param env JNIEnv + * @param path data store path + * + */ +void builder(JNIEnv *env, char *path); ``` ``` - /** - * Configure the projection column names of carbon reader - * - * @param argc argument counter - * @param argv argument vector - * @return CarbonReaderBuilder object - */ - jobject projection(int argc, char *argv[]); +/** + * Configure the projection column names of carbon reader + * + * @param argc argument counter + * @param argv argument vector + * @return CarbonReaderBuilder object + */ +jobject projection(int argc, char *argv[]); ``` ``` - /** - * build carbon reader with argument vector - * it support multiple parameter - * like: key=value - * for example: 
fs.s3a.access.key=XXXX, XXXX is user's access key value - * - * @param argc argument counter - * @param argv argument vector - * @return CarbonReaderBuilder object - **/ - jobject withHadoopConf(int argc, char *argv[]); +/** + * build carbon reader with argument vector + * it support multiple parameter + * like: key=value + * for example: fs.s3a.access.key=XXXX, XXXX is user's access key value + * + * @param argc argument counter + * @param argv argument vector + * @return CarbonReaderBuilder object + * + */ +jobject withHadoopConf(int argc, char *argv[]); ``` ``` - /** - * Sets the batch size of records to read - * - * @param batch batch size - * @return CarbonReaderBuilder object - */ - void withBatch(int batch); +/** + * Sets the batch size of records to read + * + * @param batch batch size + * @return CarbonReaderBuilder object + */ +void withBatch(int batch); ``` ``` - /** - * Configure Row Record Reader for reading. - */ - void withRowRecordReader(); +/** + * Configure Row Record Reader for reading. + */ +void withRowRecordReader(); ``` ``` - /** - * build carbonReader object for reading data - * it support read data from load disk - * - * @return carbonReader object - */ - jobject build(); +/** + * build carbonReader object for reading data + * it support read data from load disk + * + * @return carbonReader object + */ +jobject build(); ``` ``` - /** - * Whether it has next row data - * - * @return boolean value, if it has next row, return true. if it hasn't next row, return false. - */ - jboolean hasNext(); +/** + * Whether it has next row data + * + * @return boolean value, if it has next row, return true. if it hasn't next row, return false. 
+ */ +jboolean hasNext(); ``` ``` - /** - * read next carbonRow from data - * @return carbonRow object of one row - */ - jobject readNextRow(); +/** + * read next carbonRow from data + * @return carbonRow object of one row + */ +jobject readNextRow(); ``` ``` - /** - * read Next Batch Row - * - * @return rows - */ - jobjectArray readNextBatchRow(); +/** + * read Next Batch Row --- End diff -- > Please optimize this one. Thanks for the review, but I can't get your point. --- |
Github user chenliang613 commented on the issue:
https://github.com/apache/carbondata/pull/3030 LGTM --- |
Github user lamber-ken commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/3030#discussion_r244478288 --- Diff: docs/csdk-guide.md --- @@ -172,361 +174,360 @@ release the memory and destroy JVM. ## API List ### CarbonWriter ``` - /** - * create a CarbonWriterBuilder object for building carbonWriter, - * CarbonWriterBuilder object can configure different parameter - * - * @param env JNIEnv - * @return CarbonWriterBuilder object - */ - void builder(JNIEnv *env); -``` - -``` - /** - * Sets the output path of the writer builder - * - * @param path is the absolute path where output files are written - * This method must be called when building CarbonWriterBuilder - * @return updated CarbonWriterBuilder - */ - void outputPath(char *path); -``` - -``` - /** - * sets the list of columns that needs to be in sorted order - * - * @param argc argc argument counter, the number of projection column - * @param argv argv is a string array of columns that needs to be sorted. - * If it is null or by default all dimensions are selected for sorting - * If it is empty array, no columns are sorted - */ - void sortBy(int argc, char *argv[]); -``` - -``` - /** - * configure the schema with json style schema - * - * @param jsonSchema json style schema - * @return updated CarbonWriterBuilder - */ - void withCsvInput(char *jsonSchema); -``` - -``` - /** - * Updates the hadoop configuration with the given key value - * - * @param key key word - * @param value value - * @return CarbonWriterBuilder object - */ - void withHadoopConf(char *key, char *value); -``` - -``` - /** - * To support the table properties for writer - * - * @param key properties key - * @param value properties value - */ - void withTableProperty(char *key, char *value); -``` - -``` - /** - * To support the load options for C++ sdk writer - * - * @param options key,value pair of load options. - * supported keys values are - * a. bad_records_logger_enable -- true (write into separate logs), false - * b. 
bad_records_action -- FAIL, FORCE, IGNORE, REDIRECT - * c. bad_record_path -- path - * d. dateformat -- same as JAVA SimpleDateFormat - * e. timestampformat -- same as JAVA SimpleDateFormat - * f. complex_delimiter_level_1 -- value to Split the complexTypeData - * g. complex_delimiter_level_2 -- value to Split the nested complexTypeData - * h. quotechar - * i. escapechar - * - * Default values are as follows. - * - * a. bad_records_logger_enable -- "false" - * b. bad_records_action -- "FAIL" - * c. bad_record_path -- "" - * d. dateformat -- "" , uses from carbon.properties file - * e. timestampformat -- "", uses from carbon.properties file - * f. complex_delimiter_level_1 -- "$" - * g. complex_delimiter_level_2 -- ":" - * h. quotechar -- "\"" - * i. escapechar -- "\\" - * - * @return updated CarbonWriterBuilder - */ - void withLoadOption(char *key, char *value); +/** + * create a CarbonWriterBuilder object for building carbonWriter, + * CarbonWriterBuilder object can configure different parameter + * + * @param env JNIEnv + * @return CarbonWriterBuilder object + */ +void builder(JNIEnv *env); +``` + +``` +/** + * Sets the output path of the writer builder + * + * @param path is the absolute path where output files are written + * This method must be called when building CarbonWriterBuilder + * @return updated CarbonWriterBuilder + */ +void outputPath(char *path); +``` + +``` +/** + * sets the list of columns that needs to be in sorted order + * + * @param argc argc argument counter, the number of projection column + * @param argv argv is a string array of columns that needs to be sorted. 
+ * If it is null or by default all dimensions are selected for sorting + * If it is empty array, no columns are sorted + */ +void sortBy(int argc, char *argv[]); +``` + +``` +/** + * configure the schema with json style schema + * + * @param jsonSchema json style schema + * @return updated CarbonWriterBuilder + */ +void withCsvInput(char *jsonSchema); +``` + +``` +/** +* Updates the hadoop configuration with the given key value --- End diff -- > Please optimize it. thanks :+1: --- |
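[Editor's note] The CarbonWriter builder methods quoted in the diff above would be chained along these lines. Again a hedged sketch drawn only from the quoted signatures: the header name, output path, column names, and JSON schema string are illustrative assumptions, and the `build`/`write`/`close` steps that follow are documented elsewhere in csdk-guide.md, not in this diff.

```cpp
// Sketch only: requires the CarbonData C++ SDK and an embedded JVM; not standalone-compilable.
#include <jni.h>
#include "CarbonWriter.h"   // assumed SDK header name

void writeExample(JNIEnv *env) {
    CarbonWriter writer;
    writer.builder(env);
    writer.outputPath("/path/to/output");

    // The schema is passed as a JSON-style string (illustrative columns).
    writer.withCsvInput("[{\"name\":\"string\"},{\"age\":\"int\"}]");

    // Load options and table properties use the documented keys;
    // the values shown here match the documented defaults.
    writer.withLoadOption("complex_delimiter_level_1", "$");
    writer.withLoadOption("bad_records_action", "FAIL");
    writer.withTableProperty("sort_columns", "name");

    // ... build(), write(row), and close() follow, per the rest of the guide ...
}
```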
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3030 Build Success with Spark 2.1.0, Please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/2080/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3030 Build Success with Spark 2.2.1, Please check CI http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/2285/ --- |
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/3030 Build Success with Spark 2.3.2, Please check CI http://136.243.101.176:8080/job/carbondataprbuilder2.3/10334/ --- |