Samir Kothari | PowerBuilder | Wednesday, 28 September 2022 12:49 UTC

Hi all,

Stuck on a production issue; any suggestions or guidance are much appreciated.

Our environment is PB 2017 R3 with SQL Server 2016. We have about six users accessing a compiled application residing on a Windows server, and folks tend to be working from home more often, connected via VPN.

Manual Process - Insert into db1 and Update db2 

It seems users are not having issues making inserts/updates in their database (db1), but within the same transaction they are having issues when updates are performed in a different database (db2). No errors appear. This logic all takes place within PB, using datastores and updatable views, dealing with one record at a time.
This issue was very rare. Over the last 7-8 years it's happened a handful of times. We didn't spend much time investigating, thinking it must be some kind of network/connectivity issue and not worth putting much time into. Bad move on my part.

New Automatic Process (exists in addition to the Manual Process) - Insert into db1 and Update db2

Business has changed, and folks tend to spend more time working from home (not sure if it makes a difference). I've created a process that now reads a file, creates multiple records in db1, and updates db2 within the same transaction. This logic takes place within a stored procedure called by PB. Sometimes those updates in db2 don't take place, or only some records are successfully processed. The strange thing is that no errors are produced from SQL or PB (yeah, I do have my error handling :). So far six files have been loaded, and two files have not had any issues.

To make matters more challenging, I can't seem to replicate the issue in our dev or test environment. It only happens in production. 

As an example, file1 with a total of 170 records needs to be fed into db1. All 170 records are inserted into db1, but only 150 updates take place in db2.
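One way to catch this symptom early is to reconcile the two databases after each file load instead of trusting the absence of errors. Below is a minimal sketch using Python's sqlite3 module (standing in for SQL Server, with a second database attached to model db2); the table and column names are invented for illustration:

```python
import sqlite3

# One connection with a second database attached, so a cross-database
# reconciliation query can compare db1 inserts against db2 updates.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS db2")
conn.execute("CREATE TABLE main.loaded (rec_id INTEGER, batch_id INTEGER)")
conn.execute("CREATE TABLE db2.applied (rec_id INTEGER, batch_id INTEGER)")

# Simulate the reported symptom: 170 inserts in db1, only 150 updates in db2.
conn.executemany("INSERT INTO main.loaded VALUES (?, 1)",
                 [(i,) for i in range(170)])
conn.executemany("INSERT INTO db2.applied VALUES (?, 1)",
                 [(i,) for i in range(150)])

# Count db1 records for this batch that never made it into db2.
missing = conn.execute("""
    SELECT COUNT(*) FROM main.loaded l
    WHERE l.batch_id = 1
      AND NOT EXISTS (SELECT 1 FROM db2.applied a
                      WHERE a.rec_id = l.rec_id AND a.batch_id = 1)
""").fetchone()[0]
# missing == 20: the batch did not fully apply, even though nothing errored
```

Running a check like this at the end of every load would have surfaced the 170-vs-150 discrepancy immediately rather than weeks later.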

Again, any advice is welcome!




Just an update: I didn't realize that on the manual process the window has to be clicked twice for the save to occur. I have to investigate this; I'm wondering if it's related (unlikely, but you never know). I'll update this with my results.

Samir Kothari | PowerBuilder | Tuesday, 22 November 2022 14:37 UTC | #1

Looks like all is good now, thanks for all the responses.

Based on the feedback above, I did modify some of the logic related to my auto-commit and transactions, thank you for that. However, as for my original issue, it was related to some business logic (where I least expected it) with a negative return code preventing the object from updating.
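This failure mode is worth illustrating, because it produces exactly the observed symptom: no SQL error and no PB error, just records that quietly never update. The sketch below simulates the DataWindow-style convention where a negative return code from a validation hook vetoes a save; all names here are hypothetical, not from Samir's application:

```python
# Simulation of a buried business rule vetoing a save via a negative
# return code. No exception is raised and no database error occurs --
# the record is simply, silently, not written.

def pre_update_hook(record):
    # Hypothetical business rule hidden deep in the save pipeline.
    if record.get("status") == "legacy":
        return -1   # negative return code cancels the update
    return 0

def save(records, db):
    saved = 0
    for rec in records:
        if pre_update_hook(rec) < 0:
            continue            # skipped: no error is ever surfaced
        db.append(rec)
        saved += 1
    return saved                # can be < len(records) with no error raised

db = []
records = [{"id": 1, "status": "active"},
           {"id": 2, "status": "legacy"},
           {"id": 3, "status": "active"}]
saved = save(records, db)
# saved == 2: one record silently dropped, mirroring "170 in, 150 updated"
```

The lesson is that a nonzero return code in a save/update event behaves like a cancellation, not a failure, so normal SQL error handling never fires.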

John Fauss | PowerBuilder | Wednesday, 28 September 2022 14:40 UTC | #2

Hi, Samir -

Do the two databases reside in the same SQL Server database engine or in separate ones? It is not clear from your post.

If both reside in the same DB engine, are you accessing them in PB with a single Transaction object or do you use a separate Transaction object for each database?

If separate DB engines, are they connected to each other via SQL Server Linked Server?

Does the app encapsulate this "insert into one database and update another database" operation you have described within a database transaction (Begin Transaction/Commit or Rollback) so that SQL Server treats the whole operation as a single transaction?
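The single-transaction behavior John is asking about can be sketched in miniature with Python's sqlite3 module standing in for SQL Server: one connection (like one PB Transaction object) with a second database attached, so an insert into db1 and an update in db2 either commit together or roll back together. Table and column names are invented for illustration:

```python
import sqlite3

# One connection spanning two databases, analogous to a single Transaction
# object covering db1 and db2 on the same database engine.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS db2")
conn.execute("CREATE TABLE main.orders (id INTEGER PRIMARY KEY, amount REAL)")
conn.execute("CREATE TABLE db2.totals (id INTEGER PRIMARY KEY, total REAL)")
conn.execute("INSERT INTO db2.totals VALUES (1, 0.0)")
conn.commit()

def post_order(amount, fail=False):
    """Insert into db1 and update db2 as a single transaction."""
    try:
        with conn:  # BEGIN ... COMMIT, or ROLLBACK if an exception escapes
            conn.execute("INSERT INTO main.orders (amount) VALUES (?)",
                         (amount,))
            if fail:
                raise RuntimeError("simulated db2 update failure")
            conn.execute("UPDATE db2.totals SET total = total + ? WHERE id = 1",
                         (amount,))
        return True
    except RuntimeError:
        return False

post_order(10.0)             # both statements commit together
post_order(5.0, fail=True)   # both statements roll back together
rows = conn.execute("SELECT COUNT(*) FROM main.orders").fetchone()[0]
total = conn.execute("SELECT total FROM db2.totals WHERE id = 1").fetchone()[0]
# rows == 1 and total == 10.0: the failed call left no partial work behind
```

If the real application showed 170 inserts committed but only 150 updates applied with no error, then either the work was not actually inside one transaction, or something cancelled the updates without raising a database error.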

Samir Kothari | Wednesday, 28 September 2022 22:23 UTC

Yes, both databases reside in the same SQL Server database engine and they are accessed with a single Transaction object.

(We also use the PFC framework; I forgot to mention that.)

Yes, I believe we do have everything wrapped up using SQLCA.of_Begin() and SQLCA.of_Commit().

John Fauss | Thursday, 29 September 2022 02:43 UTC
Have you (1) VERIFIED that transaction management is being performed, and (2) that it is being performed correctly? I suggest you test with SQL trace (SQLCA.DBMS = "TRACE SNC", for example). Believing some behavior is working and knowing/verifying that behavior is working are two very different things.

If you can verify that transaction management is occurring and is correct, then as a test, code the update to fail (for example, don't issue the Update and simulate an update error) to see whether the Inserts in the first database are committed. Again, the SQL trace log can help you see what database activity is occurring and in what order. The Inserts should be rolled back. Additionally, you can/should test/code a rollback after (purportedly) successful Updates to verify that both the Inserts and the Updates in both databases are rolled back.

Also, look closely at Sivaprakash's code sample. Even though his example uses two Transaction objects, the logic and transaction management appear to be sound.
Sivaprakash BKR | PowerBuilder | Wednesday, 28 September 2022 13:30 UTC | #3


Possibilities to check.

1.  We faced a data-loss issue at one of our clients' sites while updating only one database. Randomly, a few records (during invoicing) would get lost. It happened only during peak hours, and we could not replicate it anywhere else. It turned out to be a hardware issue; replacing all the switches and connectors with higher-quality ones solved it.

2.  Your update code might give us a better picture. In one of our applications we also update two databases, with a variation from your scenario. Data is entered in a single screen and split into two on a condition while saving. Some records go to one db and the rest to the other. This has been working fine for the past decade. [Developed using PB 10.5 and SQL Server 2000/2012.]

2a.  This is the code that is working here (note the Else separating the commit and rollback paths):

// Call event / function for validation

ib_save    = True                     // Tracks db error in the first db
ib_another = True                     // Tracks db error in the second db

SQLCA.AutoCommit    = False           // First database
gt_sales.AutoCommit = False           // Second database

If lb_proceed = True Then
   // Call event to separate the data into two

   // Check whether the total rows match
   If dw_detail.RowCount() <> dw_main_detail.RowCount() + dw_another_detail.RowCount() Then
      MessageBox('Elifa', 'Given Items Are ' + String(dw_detail.RowCount()) + &
         '~r~nMain is ' + String(dw_main_detail.RowCount()) + &
         '~r~nOther is ' + String(dw_another_detail.RowCount()))
      SQLCA.AutoCommit    = True
      gt_sales.AutoCommit = True
      Return 0
   End If

   TriggerEvent('ue_save_main')                             // Update first db
   If ib_save = True Then TriggerEvent('ue_save_another')   // Update second db

   If ib_save = True And ib_another = True Then   // Both updates succeeded
      Commit Using gt_sales;                      // Commit second
      Commit Using SQLCA;                         // Commit first
   Else
      RollBack Using SQLCA;                       // Roll back first
      RollBack Using gt_sales;                    // Roll back second
      MessageBox('Elifa', 'Commit Failed')
   End If
End If

SQLCA.AutoCommit    = True
gt_sales.AutoCommit = True


Happiness Always
BKR Sivaprakash


Samir Kothari | Wednesday, 28 September 2022 22:25 UTC
I will provide my code, but I noticed, and recently read about, the use of AutoCommit. This is something we have had set to TRUE for years (yeah, this app has been around a long time). I never set it to FALSE prior to calling my stored procedures. I wonder if this is making a difference.
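AutoCommit could well matter here: when it is on, each statement commits the moment it executes, so a failure partway through leaves the earlier statements permanently committed and a later rollback has nothing to undo. A minimal sketch of that behavior, with Python's sqlite3 standing in for SQL Server (isolation_level=None is sqlite3's autocommit mode):

```python
import sqlite3

# In autocommit mode, every statement commits as soon as it executes.
conn = sqlite3.connect(":memory:", isolation_level=None)  # autocommit on
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, v TEXT)")

try:
    conn.execute("INSERT INTO t (v) VALUES ('row 1')")  # committed at once
    raise RuntimeError("failure before the second statement")
    conn.execute("INSERT INTO t (v) VALUES ('row 2')")  # never runs
except RuntimeError:
    conn.rollback()  # no-op: row 1 is already committed and cannot be undone

count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
# count == 1: partial work survived, and no database error was ever raised
```

This mirrors the symptom of inserts persisting while later updates are missing: with AutoCommit on, there is no enclosing transaction for a rollback to act on.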