Olaf Bönning · PowerBuilder · Friday, 19 January 2024 10:51 AM UTC

I would like to log the begin and the end of an SQL statement, because I want to measure the time the SELECT needed.

In DataWindows I have the RetrieveStart and RetrieveEnd events, where I can do exactly what I want.
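For illustration, this is roughly what I do in those two events today (il_start_ms and of_log are just placeholder names for an instance variable and our logging routine):

    // RetrieveStart event of the DataWindow control
    il_start_ms = Cpu()   // milliseconds since the application started

    // RetrieveEnd event of the DataWindow control
    long ll_elapsed_ms
    ll_elapsed_ms = Cpu() - il_start_ms
    of_log(this.DataObject + ": " + String(rowcount) + " rows in " + String(ll_elapsed_ms) + " ms")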

For embedded SQL I use SQLCA, which is defined by a transaction object (n_tr). There is no event like RetrieveStart/RetrieveEnd; the only event provided is SQLPreview. Instead of embedded SQL I could implement and call an of_execute(as_sqlstatement), but the problem is the huge amount of embedded SQL in our application: tens of thousands of statements.
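For completeness, the most I can get out of n_tr today is something along these lines in its SQLPreview event; it logs the statement and a begin timestamp, but nothing fires when the statement has finished (sqlsyntax is the event argument carrying the statement text, of_log is again a placeholder):

    // SQLPreview event of n_tr (descendant of the transaction object)
    of_log("SQL begin " + String(Now(), "hh:mm:ss") + ": " + sqlsyntax)
    return 0   // 0 = continue with normal processing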

 

Anybody out there with a breathtaking idea?

Thank you very much in advance!

Olaf

Olaf Bönning · Tuesday, 23 January 2024 09:03 AM UTC · #1

Our logging: example

Attachments (1)
Kevin Ridley · Monday, 22 January 2024 16:45 UTC · #2

You could try "rolling your own" by creating an NVO that executes your SQL dynamically. Create a logging function to go along with something like an of_execute(sql name, sql). The first line of the function can write the name of the SQL and a timestamp to the logger, then execute the SQL and log the timestamp when it finishes. Not perfect, but it could be helpful.
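Untested sketch just to show the idea (n_sqlspy, of_execute and of_log are made-up names; SELECTs that return data would need dynamic SQL Format 4 instead of EXECUTE IMMEDIATE):

    // n_sqlspy.of_execute(string as_name, string as_sql) returns long
    long ll_start
    ll_start = Cpu()
    of_log("begin " + as_name + ": " + as_sql)

    EXECUTE IMMEDIATE :as_sql USING SQLCA;

    of_log("end " + as_name + ": " + String(Cpu() - ll_start) + " ms, sqlcode " + String(SQLCA.SQLCode))
    return SQLCA.SQLCode

A call site would then be something like gnv_sqlspy.of_execute("upd_order_status", ls_sql) instead of the embedded statement.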

Olaf Bönning · Tuesday, 23 January 2024 08:42 AM UTC
Kevin: thanks for the suggestion. The problem is that we have a huge application with tens of thousands of embedded SQL statements, and I do not want to change the source code in this manner. That would just be too "dangerous".
Chris Pollach @Appeon · Saturday, 20 January 2024 19:47 UTC · #3

Hi Mike;

  While DBMS monitoring is a good suggestion, it only lets you see one side of the DML picture. It does not account for the client-side handling of the result set and the effect that has on the PB app's packet handling, memory, CPU, etc. ;-)

Regards ... Chris 

mike S · Monday, 22 January 2024 19:43 UTC
I prefer to rule out either the database or the application (including drivers) by capturing and seeing exactly what the database is running.



Most of the time the issue is either bad SQL entered by the user (or by C#!), or that a new index is needed.
mike S · Saturday, 20 January 2024 15:13 UTC · #4

Why not use the database's own tools for monitoring performance?

SQL Server has Profiler, and I know Oracle has something similar. Most databases either have one built in or can work with a third-party tool.

Miguel Leeuwe · Saturday, 20 January 2024 19:15 UTC
Or put "TRACE " in front of what's in your SQLCA.DBMS.
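For example, right before connecting (the rest of the connection properties stay as they are):

    // turns on the PB database trace for this connection
    SQLCA.DBMS = "TRACE " + SQLCA.DBMS   // e.g. "TRACE ODBC", "TRACE ORA"
    CONNECT USING SQLCA;

The statements then go to the PB trace log (PBTRACE.LOG by default, if I remember correctly).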
Olaf Bönning · Tuesday, 23 January 2024 08:52 AM UTC
@Mike: in general I would say: yes! Monitoring the database server is the right thing to do.

But there is a big BUT: first of all, we have a few hundred customers using our application, which means I would need access to all of their servers, and that is not the case. Furthermore, there are three different database systems involved: Informix, MySQL and Oracle, each in different versions. Their logging facilities work differently, and we do not have the knowledge to handle them all.

The use case for client logging is the following example: a customer calls: "Please help: doing xxx takes very long today. Can you check?" Alright, my turn now: I access the terminal server on the client side, or a TeamViewer session or whatever, and open my client logging. I see hundreds of log entries including the SQL statements (except embedded SQL), I see the execution times, and long transactions show up in red. I use Microsoft DebugView for this. An excellent small tool.
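The DebugView part is just the Windows OutputDebugString API called from PowerScript; roughly like this (the Unicode declaration below is for PB 10 and later, of_log is our own wrapper):

    // global external function declaration
    SUBROUTINE OutputDebugString (string as_msg) LIBRARY "kernel32.dll" ALIAS FOR "OutputDebugStringW"

    // inside of_log(string as_text): DebugView (DbgView.exe) shows these lines live
    OutputDebugString(String(Now(), "hh:mm:ss") + "  " + as_text)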
mike S · Tuesday, 23 January 2024 13:59 UTC
Gotcha, makes sense. Perhaps add these details to your question next time, since they are very relevant.



ISV and corporate development have very different requirements!
Chris Pollach @Appeon · Friday, 19 January 2024 16:02 UTC · #5

Hi Olaf;

  I have had this in my STD framework for about a decade now. The framework tracks all sorts of aspects of any DC, DS and their associated DWOs at runtime. For example ...

  At the end of the App's execution (or at any time while it is running) you can run other Framework DWOs that tell you how the App is performing (or did perform) ...

  BTW: The STD Framework is free and open source. Feel free to check it out via the above link and either use the framework or copy the code you find interesting over to your App(s) as required. HTH

Regards ... Chris

Roland Smith · Monday, 22 January 2024 01:49 AM UTC
The Transaction object does have SQLPreview and DBError events, but not one that fires afterwards. It might be better to do the tracking on the database server; that way it is all in one place, not on individual users' machines. If your server doesn't have the proper tools, I'm sure you can find a third-party package to do it.
Miguel Leeuwe · Monday, 22 January 2024 05:13 AM UTC
Hi @Chris,

Just to repeat the info: the OP wants to trace not only the DW and DS execution times, but also the timing of embedded SQL, as in "How long did this SQL take to execute?".

Quote: "because I want to measure the time the SELECT needed."

Regards.
Chris Pollach @Appeon · Monday, 22 January 2024 14:19 UTC
Ideally you would have both client and server monitoring implementations, but in reality you do what you can do. ;-)