This post was updated on Dec 10, 2018; 1:38pm.
Hi all,
CarbonData already supports integration with Spark Streaming and Structured Streaming. For streaming warehousing needs, CarbonData should also integrate with Flink streaming. I have created a new feature issue for the Flink integration: CARBONDATA-3130 <https://issues.apache.org/jira/browse/CARBONDATA-3130>. The proposed implementation is to create a flink directory to keep the integration separate, with a CarbonToFlinkAdapter object under that integration directory. Does anyone have suggestions on the Flink streaming integration?
-- Sent from: http://apache-carbondata-dev-mailing-list-archive.1130556.n5.nabble.com/
+1,
Some users have asked for this feature, so we can implement it.
In reply to this post by Nicholas
Hi, guys!
I think this should cover two aspects: write and read.
1. Write data from Flink to CarbonData.
2. Read data from CarbonData into a Flink table.
If you only consider writing to CarbonData, you just need to implement the write path: customize a Flink source and sink, and call the CarbonData SDK API: http://carbondata.apache.org/sdk-guide.html
Reference links for customizing Flink sources and sinks: https://ci.apache.org/projects/flink/flink-docs-master/dev/table/sourceSinks.html
Type mapping between Flink tables and CarbonData also needs to be considered.
Flink table data types: https://ci.apache.org/projects/flink/flink-docs-master/dev/table/tableApi.html#data-types
CarbonData supported data types: http://carbondata.apache.org/supported-data-types-in-carbondata.html
Table format support:
CarbonData streaming formats: CSV, JSON, text, Parquet, Kafka, socket — http://carbondata.apache.org/streaming-guide.html
Flink table formats: CSV, JSON, Avro — https://ci.apache.org/projects/flink/flink-docs-master/dev/table/connect.html#table-formats
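To make the type-mapping concern above concrete, here is a minimal sketch of a Flink-to-CarbonData type lookup. The type names below are illustrative strings chosen for the sketch, not the actual enum or class names used by either project's API; a real implementation would map Flink's TypeInformation/DataType objects to CarbonData DataTypes.

```java
import java.util.Map;

// Illustrative mapping from Flink table type names to CarbonData type names.
// The string names are assumptions for sketching purposes only.
public class TypeMappingSketch {
    static final Map<String, String> FLINK_TO_CARBON = Map.of(
            "STRING", "STRING",
            "BOOLEAN", "BOOLEAN",
            "INT", "INT",
            "BIGINT", "BIGINT",
            "DOUBLE", "DOUBLE",
            "DECIMAL", "DECIMAL",
            "DATE", "DATE",
            "TIMESTAMP", "TIMESTAMP");

    // Resolve a Flink type name to its CarbonData counterpart,
    // failing fast on types the integration does not support.
    static String toCarbonType(String flinkType) {
        String carbonType = FLINK_TO_CARBON.get(flinkType);
        if (carbonType == null) {
            throw new IllegalArgumentException("Unsupported Flink type: " + flinkType);
        }
        return carbonType;
    }

    public static void main(String[] args) {
        System.out.println(toCarbonType("BIGINT"));
    }
}
```

Failing fast on unmapped types keeps schema mismatches visible at job-submission time rather than surfacing as bad data later.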
In reply to this post by Nicholas
Hi, the Flink integration designs are shown in the diagrams below.
<http://apache-carbondata-dev-mailing-list-archive.1130556.n5.nabble.com/file/t306/Flink_Integration.jpg> <http://apache-carbondata-dev-mailing-list-archive.1130556.n5.nabble.com/file/t306/Graph.png> <http://apache-carbondata-dev-mailing-list-archive.1130556.n5.nabble.com/file/t306/Flink_SQL.jpg>
Yes, I think we can integrate like this by using the SDK. But before that, Carbon should support transactional tables in the SDK. I have sent another mail describing the idea (Support transactional table in SDK).
Regards, Jacky
> On Dec 6, 2018, at 8:44 AM, Nicholas <[hidden email]> wrote:
> Hi, the Flink integration designs are shown in the diagrams.
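To sketch the SDK-based integration discussed above, here is a self-contained illustration of the sink-side adapter idea: a Flink-style sink buffers incoming rows and flushes them in batches through a writer. CarbonRowWriter is a stand-in interface for whatever writer the CarbonData SDK provides (the real SDK class and method names may differ), so the sketch compiles and runs without either dependency.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a buffering sink adapter between a Flink-style per-record
// callback and a batch-oriented writer. CarbonRowWriter is a hypothetical
// stand-in for the CarbonData SDK writer, not its actual API.
public class SinkAdapterSketch {

    interface CarbonRowWriter {
        void write(Object[] row);  // append one row
        void close();              // commit and release resources
    }

    static class BufferingCarbonSink {
        private final CarbonRowWriter writer;
        private final int batchSize;
        private final List<Object[]> buffer = new ArrayList<>();

        BufferingCarbonSink(CarbonRowWriter writer, int batchSize) {
            this.writer = writer;
            this.batchSize = batchSize;
        }

        // Called once per record, analogous to a Flink sink's invoke().
        void invoke(Object[] row) {
            buffer.add(row);
            if (buffer.size() >= batchSize) {
                flush();
            }
        }

        void flush() {
            for (Object[] row : buffer) {
                writer.write(row);
            }
            buffer.clear();
        }

        void close() {
            flush();  // drain any remaining buffered rows before closing
            writer.close();
        }
    }

    public static void main(String[] args) {
        List<Object[]> written = new ArrayList<>();
        CarbonRowWriter mockWriter = new CarbonRowWriter() {
            public void write(Object[] row) { written.add(row); }
            public void close() { }
        };
        BufferingCarbonSink sink = new BufferingCarbonSink(mockWriter, 2);
        sink.invoke(new Object[]{1, "a"});
        sink.invoke(new Object[]{2, "b"});  // batchSize reached, flush
        sink.invoke(new Object[]{3, "c"});
        sink.close();                       // drains the last row
        System.out.println(written.size());
    }
}
```

Batching matters here because a transactional SDK writer (as proposed above) would likely commit per flush; the buffer boundary is the natural place to tie into Flink checkpointing for exactly-once semantics.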