http://apache-carbondata-dev-mailing-list-archive.168.s1.nabble.com/refresh-table-CARBON-1-3-1-tp44262p44294.html
The backup table data should be copied to the database location before running refresh table. Note that the CarbonSession is created with
val cc = SparkSession.builder().config(sc.getConf).getOrCreateCarbonSession("hdfs://ns1/user/xx"),
so the database location resolves under hdfs://ns1/user/xx. For more details, please refer below.
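
A minimal sketch of that sequence, as a non-authoritative illustration: DESCRIBE DATABASE is only used here to find the database directory, and the <database_location> placeholder must be replaced with whatever path it prints (whether it ends in xx or xx.db depends on how the database was created).

// 1. Find where the xx database actually lives under the carbon store path.
cc.sql("DESCRIBE DATABASE xx").show(false)

// 2. Copy the backed-up folder into that database directory, keeping the
//    original table name (shell step, path is illustrative):
//    hdfs dfs -cp /user/xx/prod_inst_cab_backup <database_location>/prod_inst_cab

// 3. Register the table from the existing carbon data and verify.
cc.sql("REFRESH TABLE xx.prod_inst_cab")
cc.sql("SELECT count(1) FROM xx.prod_inst_cab").show()
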
> Hello,
> I have a table that was created and loaded under carbon 1.3.0, and I'm
> upgrading to carbon 1.3.1 using refresh table.
> The following are my steps:
> 1. Copy the old table's HDFS location to a new directory:
> /user/xx/prod_inst_cab --> /user/xx/prod_inst_cab_backup
> 2. hive -e "drop table xx.prod_inst_cab"
> 3. cc.sql("drop table xx.prod_inst_cab")
> (PS: the CarbonSession is initialized with the '/user/xx' location, e.g. val cc =
> SparkSession.builder().config(sc.getConf).getOrCreateCarbonSession("hdfs://ns1/user/xx")
> )
> 4. hdfs dfs -rmr -skipTrash /user/xx/prod_inst_cab, then copy
> /user/xx/prod_inst_cab_backup --> /user/xx/prod_inst_cab
> 5. cc.sql("refresh table xx.prod_inst_cab").show as well as
> cc.sql("refresh table xx.prod_inst_cab_backup").show
>
> When I perform:
> 1. cc.sql("select count(1) from xx.prod_inst_cab").show, the result is 0
> 2. cc.sql("select count(1) from xx.prod_inst_cab_backup").show, the
> result is right, but all columns of all records are "null"
>
> Am I missing some details?
>