ritesh desai
PowerBuilder
Friday, 19 March 2021 04:29 AM UTC

Currently I synchronize about 26,000 rows from a local database to the cloud using the following method:

  • Create two transaction objects.
  • Connect the first to the local database file (io_to_source).
  • Connect the second to the cloud database (io_to_target, AWS MySQL).

 

  • Retrieve the data into a DataStore from the local database:

long ll_row

// Retrieve from the local (source) database.
ds_data.SetTransObject ( io_to_source )
ds_data.Retrieve ( )

// Switch the DataStore to the target database.
ds_data.SetTransObject ( io_to_target )

// Flag every row as NewModified! so Update() generates INSERTs on the target.
for ll_row = 1 to ds_data.RowCount ( )
    ds_data.SetItemStatus ( ll_row, 0, Primary!, NewModified! )
next

ds_data.Update ( )
COMMIT USING io_to_target ;

////////////

It works, but it takes a lot of time.
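Purely as an illustrative variant (not taken from this thread), the rows could be flagged and committed in chunks instead of one 26,000-row transaction. Whether this helps depends on the DBMS, since the per-row round trips to the AWS server remain; it mainly limits the size of each transaction. The chunk size of 1,000 is an arbitrary assumption:

// Hypothetical sketch: flag and commit the rows in chunks of 1,000.
// Update ( true, true ) only writes the rows currently flagged and then
// resets their status, so the next chunk starts clean.
long ll_row, ll_i, ll_total, ll_chunk_end
constant long CHUNK = 1000

ll_total = ds_data.RowCount ( )
ll_row = 1

do while ll_row <= ll_total
    ll_chunk_end = ll_row + CHUNK - 1
    if ll_chunk_end > ll_total then ll_chunk_end = ll_total

    for ll_i = ll_row to ll_chunk_end
        ds_data.SetItemStatus ( ll_i, 0, Primary!, NewModified! )
    next

    if ds_data.Update ( true, true ) = 1 then
        COMMIT USING io_to_target ;
    else
        ROLLBACK USING io_to_target ;
        exit
    end if

    ll_row = ll_chunk_end + 1
loop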

Would creating a REST API help reduce the time?

Also, is there a way to create a REST API service that supports gzip-compressed data?

Sivaprakash BKR
Friday, 26 March 2021 06:30 AM UTC
# 1

This could probably be achieved with a third-party tool such as SymmetricDS [https://www.symmetricds.org/], which also supports two-way synchronization. You could check whether it suits your requirements.

 

Chris Pollach @Appeon
Friday, 19 March 2021 18:22 UTC
# 2

Hi Ritesh;

  In my past experience as a DBA, I tried to use the DBMS vendor's Unload/Reload and/or Export/Import utilities for this. If both DBMSs are from the same vendor, they might even support an auto-sync feature (as ASE does). That would be even better, IMHO.

  However, for a PB shop that wants (or needs) to perform this through a PB app, I still like PB's great Pipeline object for this job. I have done this with the Pipeline object hundreds of times in the past, with pretty much any DBMS combination out there.  Just my $0.02.   ;-)
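For illustration, driving a Pipeline from PowerScript might look like the sketch below. The pipeline object name "p_sync_data" and the error DataWindow control "dw_pipe_errors" are assumptions; the pipeline itself (source/target SQL and column mapping) is painted in the PB IDE:

// Hypothetical sketch: run a painted Pipeline object between the two transactions.
pipeline lpl_sync
integer li_rc

lpl_sync = CREATE pipeline
lpl_sync.DataObject = "p_sync_data"   // Pipeline object painted in the IDE (assumed name)

// Start ( source transaction, destination transaction, error DataWindow control )
li_rc = lpl_sync.Start ( io_to_source, io_to_target, dw_pipe_errors )
IF li_rc <> 0 THEN
    MessageBox ( "Pipeline", "Sync failed, return code " + String ( li_rc ) )
END IF

DESTROY lpl_sync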

Regards ... Chris

Miguel Leeuwe
Friday, 19 March 2021 11:14 AM UTC
# 3

Hi,

I don't know whether things have changed for PB 2019 R3, but as of R2 this was the situation:

https://community.appeon.com/index.php/qna/q-a/httpclient-post-methods-and-gzip-compression

I think you might see a performance improvement (though I'm not sure) if you sent the data to an API that takes care of the update. Doing the update from the client, as you do now, will probably need more round trips to the AWS server. Just try it.
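As a hedged sketch of that idea (the endpoint URL is an assumption, and whether gzip is honored depends entirely on the server side), the retrieved rows could be exported as JSON and posted with PB's HttpClient:

// Hypothetical sketch: POST the DataStore rows as JSON to a REST endpoint.
HttpClient lnv_http
string ls_url, ls_json
integer li_rc

ls_url = "https://example.com/api/sync"   // assumed endpoint
ls_json = ds_data.ExportJson ( )          // export the retrieved rows as JSON

lnv_http = CREATE HttpClient
lnv_http.SetRequestHeader ( "Content-Type", "application/json" )
// Asks the server for a gzip-compressed *response*; compressing the request
// body would need explicit support on both the client and the service.
lnv_http.SetRequestHeader ( "Accept-Encoding", "gzip" )

li_rc = lnv_http.SendRequest ( "POST", ls_url, ls_json )
IF li_rc = 1 AND lnv_http.GetResponseStatusCode ( ) = 200 THEN
    // Sync accepted by the service.
END IF

DESTROY lnv_http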

regards

Comment — ritesh desai, Friday, 19 March 2021 17:42 UTC:
Thanks, I will try it.