We tried to transfer data from SQL Server to PostgreSQL using both a DataStore and JSON.
System used: Intel Core i7-7700, 8 GB RAM
Software: PowerBuilder 2022, SQL Server 2012 Express, PostgreSQL 15
Tables: 10 tables to transfer [7 master data tables and 3 transaction data tables]
We tried both the datastore and the JSON approach on one particular transaction data table, which had 23 columns and 97,821 rows.
To replace: the SQL Server identity column with a GUID in PostgreSQL, by looking up the corresponding master data table.
We used u_json, available in the community forum, to read the JSON string.
Steps [ JSON ]
1. Read data into a datawindow
2. Export to JSON
3. Call an NVO to do further processing, with the above JSON as a parameter
4. Retrieve the required master data into a datastore
5. Load the JSON: a very time-consuming step. We abandoned it after waiting more than 20 minutes without success
6. Read each row and find the corresponding GUID value for the identity column, using the Find function on the master data datastore
7. Insert a row into a new datastore with all the required values from the JSON
8. Update the datastore.
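Steps 1-3 above can be sketched in PowerScript roughly as follows (the datawindow, NVO, and method names are assumptions, not our actual object names):

```
// Steps 1-2: retrieve the source data and export it to JSON
// dw_source is a hypothetical datawindow on the transaction table
string ls_json
dw_source.SetTransObject(SQLCA)
dw_source.Retrieve()
ls_json = dw_source.ExportJson()   // DataWindow JSON export available in PB 2022

// Step 3: hand the JSON string to an NVO for further processing
n_transfer lnv_transfer            // hypothetical NVO
lnv_transfer.of_process(ls_json)
```

The NVO then parses the string back into rows (step 5), which is where the LoadString cost shows up.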
With this load, the process never got past the LoadString call, even after waiting 20-30 minutes across different runs.
So we replaced JSON with a datastore and tried the same process. The steps were the same as above, with a change in step 5: instead of reading from JSON, we read each row directly from the datastore and processed it. The transfer started immediately, as there was no LoadString call.
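The per-row GUID lookup of step 6 can be sketched like this (datastore and column names are assumptions; ds_master holds the id-to-GUID mapping from the master table):

```
// Walk the source rows, resolving each identity value to its GUID
long ll_row, ll_found
string ls_guid
for ll_row = 1 to ds_source.RowCount()
    // Find the master row whose id matches this row's identity value
    ll_found = ds_master.Find("id = " + &
        String(ds_source.GetItemNumber(ll_row, "master_id")), &
        1, ds_master.RowCount())
    if ll_found > 0 then
        ls_guid = ds_master.GetItemString(ll_found, "guid")
        // insert ls_guid plus the remaining columns into the target datastore
    end if
next
```

Note that Find scans the master datastore row by row from the given start position, so the cost of this loop grows with both the source and master row counts.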
The process slowed down once we reached around 5,000 records, and with every additional 1,000 records it slowed further. In 30 minutes, only 50,000 records or so were transferred.
So we stopped that process too. In the end we abandoned both approaches; instead we created a temp table in the database and carried out the transfer there. The transfer of that table took only 5 minutes.
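The temp-table approach is fast because the GUID substitution becomes one set-based statement instead of 97,821 client-side lookups. A minimal sketch from PowerScript, with hypothetical table and column names:

```
// Hypothetical: the source rows have been staged into temp_transfer;
// one INSERT..SELECT joins in the GUID from the master table
string ls_sql
ls_sql = "INSERT INTO target_table (guid_col, col1, col2) " + &
         "SELECT m.guid, t.col1, t.col2 " + &
         "FROM temp_transfer t " + &
         "JOIN master_table m ON m.id = t.master_id"
EXECUTE IMMEDIATE :ls_sql USING SQLCA;
if SQLCA.SQLCode <> 0 then
    MessageBox("Transfer failed", SQLCA.SQLErrText)
end if
```

All the row matching happens inside the database engine, which is why the same table moved in about 5 minutes.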
Happiness Always
BKR Sivaprakash