Hi.
We have a small application (PB2017 R3) which reads and interprets small text files, adds rows to 3-4 datastores and then commits to 4 tables in the database. Sometimes it can be as many as 1500 rows, but mostly it's between 700 and 1000.
There is a trigger on one of the tables, defined as "FOR EACH ROW", which writes some additional logging to another table (it's an AFTER trigger for update/insert/delete).
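For context, the trigger is roughly along these lines (just a sketch, not our real DDL: I'm assuming SQL Anywhere-style syntax, showing only the INSERT event, and trg_orders_log, orders, orders_log and order_id are made-up names):

    -- Illustrative row-level AFTER trigger: runs once per inserted row
    -- and writes one logging row each time. All names are hypothetical.
    CREATE TRIGGER trg_orders_log
    AFTER INSERT ON orders
    REFERENCING NEW AS new_row
    FOR EACH ROW
    BEGIN
        INSERT INTO orders_log (order_id, logged_at)
        VALUES (new_row.order_id, CURRENT TIMESTAMP);
    END;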
It looks like the trigger fires once for each row inserted by the datastore, which is of course correct when it's defined like that.
Would it perhaps be an idea to alter the trigger to "FOR EACH STATEMENT", so that all the rows in the main table are committed before the trigger does the logging?
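For comparison, I imagine the statement-level variant would look something like this (again just a sketch with the same invented names; in SQL Anywhere-style syntax, REFERENCING NEW in a statement-level trigger names a temporary table holding all the affected rows):

    -- Hypothetical statement-level variant: fires once per INSERT statement
    -- and logs all affected rows with a single set-based insert.
    CREATE TRIGGER trg_orders_log_stmt
    AFTER INSERT ON orders
    REFERENCING NEW AS new_rows
    FOR EACH STATEMENT
    BEGIN
        INSERT INTO orders_log (order_id, logged_at)
        SELECT order_id, CURRENT TIMESTAMP
        FROM new_rows;
    END;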
Any pros and/or cons?
Regards,
Bjarne Anker
I guess we'll try to refine the code a bit to avoid too many transactions against the production tables.
Thanks!