Posted by Jacky Li on Dec 10, 2018; 3:04pm
URL: http://apache-carbondata-dev-mailing-list-archive.168.s1.nabble.com/DISCUSS-Support-transactional-table-in-SDK-tp69878p70110.html
> On Dec 8, 2018, at 3:53 PM, Liang Chen <[hidden email]> wrote:
>
> Hi
>
> Good idea, thank you for starting this discussion.
>
> Agree with Ravi's comments; we need to double-check some limitations after
> introducing this feature.
>
> Flink and Kafka integration can be discussed later.
> For using the SDK to write new data to an existing carbondata table, some
> questions:
> 1. How do we ensure the new data is written with the same index, dictionary,
> etc. policy as the existing table?
Likun: The SDK uses the same writer provided in the carbondata-core module, so it follows the same “policy” you mentioned.
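For reference, here is a minimal sketch of writing files through the SDK today (the builder method names such as withCsvInput have shifted slightly across releases, so treat this as illustrative rather than exact):

    import org.apache.carbondata.core.metadata.datatype.DataTypes;
    import org.apache.carbondata.sdk.file.CarbonWriter;
    import org.apache.carbondata.sdk.file.Field;
    import org.apache.carbondata.sdk.file.Schema;

    public class SdkWriteExample {
      public static void main(String[] args) throws Exception {
        // Schema of the output files; the SDK reuses the writer from carbondata-core,
        // so encoding and index behaviour follow the normal table writer.
        Field[] fields = new Field[] {
            new Field("name", DataTypes.STRING),
            new Field("age", DataTypes.INT)
        };
        CarbonWriter writer = CarbonWriter.builder()
            .outputPath("/tmp/carbon_sdk_output")   // today: a flat output folder
            .withCsvInput(new Schema(fields))
            .build();
        writer.write(new String[] {"alice", "30"});
        writer.close();
      }
    }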
> 2. Can you please help me understand this proposal further: what valuable
> scenarios require this feature?
Likun: Currently, the SDK writes carbondata files into a flat folder and loses all the features built on top of the segment concept, such as show segment, delete segment, compaction, datamap, MV, data update, delete, streaming, global dictionary, etc.
By introducing this feature (support for transactional tables in the SDK), an application can use the SDK in a non-Spark environment to write new carbondata files and still enjoy a transactional table with segment support and all of the features listed above.
Basically, these new SDK APIs add a new way to write data into an existing carbondata table. They are intended for non-Spark environments such as Flink, Kafka Streams, Cassandra, or any other Java application.
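To make the intent concrete, a purely hypothetical usage sketch; the forTable option below is an invented name for illustration, not the actual proposed API:

    // Purely hypothetical sketch: forTable(...) is an invented name, NOT the proposed API.
    // The idea is that the SDK would write into an existing transactional table and
    // commit the new files as a segment, instead of dropping them into a flat folder.
    CarbonWriter writer = CarbonWriter.builder()
        .forTable("/warehouse/db/sales")        // hypothetical: path of an existing table
        .withCsvInput(schema)
        .build();
    writer.write(new String[] {"2018-12-10", "100"});
    writer.close();   // hypothetical: the new segment would become visible on commit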
>
> ------------------------------------------------------------------------------------------------
> After having online segments, one can use this feature to implement
> Apache Flink-CarbonData integration, or Apache KafkaStream-CarbonData
> integration, or just use the SDK to write new data to an existing
> CarbonData table; the integration level can be the same as the current
> Spark-CarbonData integration.
>
> Regards
> Liang
>
>