GitHub user xubo245 opened a pull request:
https://github.com/apache/carbondata/pull/2318

[CARBONDATA-2491] Fix the error when the reader reads twice with the SDK CarbonReader

1. Fix the out-of-bound error when the reader reads twice with the SDK CarbonReader
2. Fix the java.lang.NegativeArraySizeException
3. Add timestamp and bad-record test cases

This PR depends on https://github.com/apache/carbondata/pull/2246

Be sure to do all of the following checklist to help us incorporate your contribution quickly and easily:

- [ ] Any interfaces changed? No
- [ ] Any backward compatibility impacted? No
- [ ] Document update required? No
- [ ] Testing done. Please provide details on:
  - Whether new unit test cases have been added or why no new tests are required.
  - How it is tested. Please attach the test report.
  - Is it a performance-related change? Please attach the performance test report.
  - Any additional information to help reviewers in testing this change. No
- [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA. No

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/xubo245/carbondata CARBONDATA-2491-OutOfBoundAndBadRecord

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/2318.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #2318

----

commit 8ae64ed8b3991753d2745f5e0e4a83b2bf970dd2
Author: xubo245 <xubo29@...>
Date: 2018-05-17T04:13:01Z

    [CARBONDATA-2413] After running CarbonWriter, there is a null directory for the datamap.
    If the segment id is null, then don't register any datamap writer and don't create the null directory; remove the line.

commit 19043b6579d98908b1f9052b8ae4de8c9881161e
Author: xubo245 <xubo29@...>
Date: 2018-05-18T07:40:16Z

    [CARBONDATA-2491] Fix the error when the reader reads twice with the SDK CarbonReader

----
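The out-of-bound and NegativeArraySizeException failures the description mentions are the signature of a second read pass reusing state left over from the first. A minimal, self-contained Java sketch of that failure class (hypothetical code, not the actual CarbonData reader; class and field names are illustrative):

```java
// Hypothetical sketch: a reader whose internal cursor survives into a
// second read pass. On the second pass the stale cursor makes the
// "remaining" count negative, so allocating the result buffer throws
// java.lang.NegativeArraySizeException -- the same failure class seen
// when one cached reader state is used for two reads.
public class StaleCursorReader {
    private final int[] data = {1, 2, 3};
    private int cursor = 0;              // shared state, never reset between passes

    int[] readAll() {
        int remaining = data.length - cursor;
        int[] out = new int[remaining];  // negative size on a stale cursor -> throws
        System.arraycopy(data, cursor, out, 0, remaining);
        cursor = data.length + 1;        // bug: overshoots instead of resetting
        return out;
    }

    public static void main(String[] args) {
        StaleCursorReader reader = new StaleCursorReader();
        System.out.println(reader.readAll().length);   // first pass works: 3
        try {
            reader.readAll();                          // second pass fails
        } catch (NegativeArraySizeException e) {
            System.out.println("second read failed: " + e);
        }
    }
}
```

The fix in a real reader is to reset (or rebuild) the per-read state before each pass rather than letting the first pass's cursor leak into the second.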
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2318 SDV Build Success, please check CI http://144.76.159.231:8080/job/ApacheSDVTests/4986/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2318 Build Success with Spark 2.2.1, please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4802/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2318 Build Failed with Spark 2.1.0, please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5959/ ---
Github user xubo245 commented on the issue:
https://github.com/apache/carbondata/pull/2318 retest this please ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2318 Build Failed with Spark 2.2.1, please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4805/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2318 Build Failed with Spark 2.1.0, please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5963/ ---
Github user xubo245 commented on the issue:
https://github.com/apache/carbondata/pull/2318 retest this please ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2318 Build Failed with Spark 2.1.0, please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5975/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2318 Build Success with Spark 2.2.1, please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4818/ ---
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2318#discussion_r189421068

--- Diff: core/src/main/java/org/apache/carbondata/core/datamap/DataMapStoreManager.java ---
@@ -267,6 +267,8 @@ public TableDataMap getDataMap(CarbonTable table, DataMapSchema dataMapSchema) {
         }
       }
     }
+  } else {
+    dataMap.clear();
--- End diff --

Why is it required? Please add a comment.

---
Github user jackylk commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2318#discussion_r189421188

--- Diff: store/sdk/src/test/java/org/apache/carbondata/sdk/file/AvroCarbonWriterTest.java ---
@@ -29,16 +29,27 @@
 import org.apache.avro.generic.GenericData;
 import org.apache.commons.io.FileUtils;
 import org.apache.commons.lang.CharEncoding;
+import org.junit.After;
 import org.junit.Assert;
+import org.junit.Before;
 import org.junit.Test;
 import tech.allegro.schema.json2avro.converter.JsonAvroConverter;
 import org.apache.avro.Schema;
-
 public class AvroCarbonWriterTest {
   private String path = "./AvroCarbonWriterSuiteWriteFiles";

+  @Before
+  public void cleanFile() {
+    assert (TestUtil.cleanMdtFile());
--- End diff --

Is there another PR to remove the creation of the system folder when the user uses the SDK to write data?

---
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2318#discussion_r189422955

--- Diff: store/sdk/src/test/java/org/apache/carbondata/sdk/file/AvroCarbonWriterTest.java ---
+  @Before
+  public void cleanFile() {
+    assert (TestUtil.cleanMdtFile());
--- End diff --

I think PR 2246 can solve it.

---
Github user ravipesala commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2318#discussion_r189423096

--- Diff: core/src/main/java/org/apache/carbondata/core/datamap/DataMapStoreManager.java ---
+  } else {
+    dataMap.clear();
--- End diff --

I think it defeats the purpose of the cache if you clear it on every retrieval. clear() should be called only to flush the cache. Why do you need to flush the cache on every call?

---
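The concern in this review comment can be illustrated with a minimal cache sketch (hypothetical code, not DataMapStoreManager itself; names are illustrative): clearing on every retrieval turns each lookup into a cold load, so the cache stops caching.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of why clear() on every retrieval defeats a cache:
// with clearing enabled, every get() triggers the expensive load
// again, so the load count grows with each call instead of staying
// at one per key.
public class ClearingCache {
    private final Map<String, Object> cache = new HashMap<>();
    static int loads = 0;                 // counts expensive cold loads

    Object get(String key, boolean clearEveryTime) {
        if (clearEveryTime) {
            cache.clear();                // flushes the whole cache on each call
        }
        return cache.computeIfAbsent(key, k -> {
            loads++;                      // simulate the expensive build step
            return new Object();
        });
    }

    public static void main(String[] args) {
        ClearingCache good = new ClearingCache();
        good.get("dm", false);
        good.get("dm", false);
        System.out.println("loads without clearing: " + loads);   // 1

        loads = 0;
        ClearingCache bad = new ClearingCache();
        bad.get("dm", true);
        bad.get("dm", true);
        System.out.println("loads with clear-per-get: " + loads); // 2
    }
}
```

A cache should be invalidated only when its contents may be stale (e.g. on a flush or schema change), not on every lookup.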
Github user xubo245 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2318#discussion_r189480675

--- Diff: store/sdk/src/test/java/org/apache/carbondata/sdk/file/AvroCarbonWriterTest.java ---
+  @Before
+  public void cleanFile() {
+    assert (TestUtil.cleanMdtFile());
--- End diff --

PR 2246 fixes a different problem; it can't solve this one. @ravipesala

---
Github user xubo245 commented on a diff in the pull request:
https://github.com/apache/carbondata/pull/2318#discussion_r189480763

--- Diff: store/sdk/src/test/java/org/apache/carbondata/sdk/file/AvroCarbonWriterTest.java ---
+  @Before
+  public void cleanFile() {
+    assert (TestUtil.cleanMdtFile());
--- End diff --

@jackylk This PR depends on PR 2246; the first commit is cherry-picked from PR 2246. After PR 2246 is merged, this PR needs a rebase.

---
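The per-test cleanup being discussed can be sketched in plain Java (hypothetical file name and helper; the real test uses JUnit's @Before/@After annotations and TestUtil.cleanMdtFile()): deleting leftover metadata before and after each test keeps one test's on-disk state from leaking into the next.

```java
import java.io.File;
import java.io.IOException;

// Hypothetical sketch of the setup/teardown pattern the diff adds:
// clean the metadata (mdt) file before each test (the @Before role)
// and again afterwards (the @After role), so a test that writes the
// file cannot affect the tests that follow it.
public class CleanupPerTest {
    static final File MDT = new File("./example.mdtfile");   // illustrative name

    static boolean cleanMdtFile() {                          // plays the role of TestUtil.cleanMdtFile()
        return !MDT.exists() || MDT.delete();
    }

    static void runTest(Runnable test) {
        if (!cleanMdtFile()) {
            throw new IllegalStateException("setup failed"); // @Before equivalent
        }
        try {
            test.run();
        } finally {
            cleanMdtFile();                                  // @After equivalent
        }
    }

    public static void main(String[] args) {
        runTest(() -> {
            try {
                MDT.createNewFile();                         // the "test" dirties the disk
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        });
        System.out.println("mdt file cleaned: " + !MDT.exists());  // true
    }
}
```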
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2318 Build Failed with Spark 2.1.0, please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/5985/ ---
Github user CarbonDataQA commented on the issue:
https://github.com/apache/carbondata/pull/2318 Build Success with Spark 2.2.1, please check CI http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/4827/ ---
Github user ravipesala commented on the issue:
https://github.com/apache/carbondata/pull/2318 SDV Build Success, please check CI http://144.76.159.231:8080/job/ApacheSDVTests/5007/ ---
Github user xubo245 commented on the issue:
https://github.com/apache/carbondata/pull/2318 retest this please ---