refresh table [CARBON-1.3.1]


孙而焓
hello,
        I have a table created and loaded under carbon1.3.0 and I'm upgrading to carbon1.3.1 using refresh table.
     Following are my steps:
    1. Copy the old table's HDFS location to a new directory: /user/xx/prod_inst_cab --> /user/xx/prod_inst_cab_backup
    2. hive -e "drop table xx.prod_inst_cab"
    3. cc.sql("drop table xx.prod_inst_cab")
    (PS: the CarbonSession was initialized with the '/user/xx' location, like val cc = SparkSession.builder().config(sc.getConf).getOrCreateCarbonSession("hdfs://ns1/user/xx"))
    4. hdfs dfs -rmr -skipTrash /user/xx/prod_inst_cab, then copy /user/xx/prod_inst_cab_backup --> /user/xx/prod_inst_cab
    5. cc.sql("refresh table xx.prod_inst_cab").show, as well as cc.sql("refresh table xx.prod_inst_cab_backup").show

When I then perform:
    1. cc.sql("select count(1) from xx.prod_inst_cab").show — the result is 0
    2. cc.sql("select count(1) from xx.prod_inst_cab_backup").show — the count is right, but all columns of all records are "null"
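Condensed, the sequence above looks like the sketch below. The paths are copied from the steps as written; the hdfs/hive/spark-shell invocations are left as comments since they need a live cluster, and the variable names are my own shorthand:

```shell
# Paths as written in the steps above (namenode ns1, database xx assumed)
SRC=/user/xx/prod_inst_cab
BAK=/user/xx/prod_inst_cab_backup

# 1. Back up the table directory:
#      hdfs dfs -cp "$SRC" "$BAK"
# 2. Drop the metadata (Hive, then Carbon in the Spark shell):
#      hive -e "drop table xx.prod_inst_cab"
#      cc.sql("drop table xx.prod_inst_cab")
# 3. Remove the original directory and restore it from the backup:
#      hdfs dfs -rm -r -skipTrash "$SRC"
#      hdfs dfs -cp "$BAK" "$SRC"
# 4. Re-register the restored data:
#      cc.sql("refresh table xx.prod_inst_cab")
echo "$SRC -> $BAK"   # /user/xx/prod_inst_cab -> /user/xx/prod_inst_cab_backup
```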

Am I missing some details?


[hidden email]
孙而焓【FFCS研究院】
Re: refresh table [CARBON-1.3.1]

mohdshahidkhan
Hi  Sunerhan,

 If xx is the database, then the session creation in step 3 needs a minor change:
val cc = SparkSession.builder().config(sc.getConf).
getOrCreateCarbonSession("hdfs://ns1/user/")

Step 4:
The backup table data should be copied to the database location.
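A sketch of the layout this advice implies (store path, database, and table names are taken from the thread; REFRESH TABLE then expects the table data inside the database directory under the store path):

```shell
# Store path passed to getOrCreateCarbonSession: the PARENT of the
# database directory, not the database directory itself (per the advice above).
STORE=hdfs://ns1/user
DB=xx
TABLE=prod_inst_cab

# REFRESH TABLE expects the restored data here, inside the database directory:
TABLE_PATH="$STORE/$DB/$TABLE"
echo "$TABLE_PATH"   # hdfs://ns1/user/xx/prod_inst_cab
```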

For more details, please refer below.
https://github.com/apache/carbondata/blob/master/docs/data-management-on-carbondata.md#refresh-table

Regards,
Mohammad Shahid Khan

On Tue, Apr 3, 2018 at 5:56 PM, [hidden email] <[hidden email]> wrote:

> hello,
>         I have a table created and loaded under carbon1.3.0 and i'm
> upgrading to carbon1.3.1 using refresh table.
>      following is my step:
>     1. copy old table's hdfs location   to a new direcotry.
> /user/xx/prod_inst_cab-->/user/xx/prod_inst_cab_backup
>     2. hive -e "drop table xx.prod_inst_cab"
>     3. cc.sql("drop table xx.prod_inst_cab")
>     (ps: initial carbonsession using  '/user/xx' location;  like val cc =
> SparkSession.builder().config(sc.getConf).getOrCreateCarbonSession("hdfs://ns1/user/xx")
> )
>     4. hdfs dfs -rmr -skipTrash /user/xx/prod_inst_cab and copy
> /user/xx/prod_inst_cab_backup --> /user/xx/prod_inst_cab
>     5.cc.sql("refresh table xx.prod_inst_cab").show   as well as
> cc.sql("refresh table xx.prod_inst_cab_backup").show
>
> when  perform
>     1.cc.sql("select count(1) from xx.prod_inst_cab").show    result is 0
>     2.cc.sql("select count(1) from xx.prod_inst_cab_backup").show
> result is right,but all columns of all records is "null"
>
> Am i missing some details?
>
>
> [hidden email]
>

Re: refresh table [CARBON-1.3.1]

孙而焓
hello mohdshahidkhan,
   Sorry for the writing mistake:
 Step 3: xx is the database, and I did val cc = SparkSession.builder().config(sc.getConf).getOrCreateCarbonSession("hdfs://ns1/user")
 Step 4: the two tables' data directories are:
   /user/xx/prod_inst_cab_backup
   /user/xx/prod_inst_cab



-----
孙而焓【FFCS研究院】