A daily CSV file is imported into the Oracle database via DataWindow ImportFile
and then saved to the database via a DW Update. (PB 2017 R2) Simple, but not fast.
The file has about 105K rows, and in the office the import takes around 15 minutes,
which the users have accepted and plan their day around.
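For reference, the whole thing boils down to something like this (a minimal sketch; control, file, and column names are placeholders):

    // sketch of the current flow: import the CSV, then save via DW Update
    long ll_imported

    ll_imported = dw_daily.ImportFile("c:\data\daily.csv")   // ~105K rows into the primary buffer

    IF dw_daily.Update() = 1 THEN
        COMMIT USING SQLCA;
    ELSE
        ROLLBACK USING SQLCA;
    END IF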
Working from home, which we are all doing, the import takes 90 minutes.
That is obviously due to the now-extended network path, and the users are not happy.
So I am looking for a solution that does not require completely rewriting the process
(not that there is much to the process).
I do not have permission to write stored procedures in the database (oh my, too dangerous),
and I do not have an Oracle bulk-copy (BCP) tool or access to Oracle APEX, so I am looking
for what I can do in PB using DWs.
The table being loaded has a four-column primary key (don't get me started), and the table
is emptied by an inline DELETE before the DW is saved; the COMMIT is not issued until after
the DW Update.
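In other words, roughly this (a sketch only; the table and control names are made up):

    // truncate-by-delete and reload in one transaction
    DELETE FROM daily_load_table USING SQLCA;   // embedded SQL, no COMMIT yet

    IF dw_daily.Update() = 1 THEN
        COMMIT USING SQLCA;      // the delete and the ~105K inserts commit together
    ELSE
        ROLLBACK USING SQLCA;    // on failure, yesterday's rows come back
    END IF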
I am pretty sure the PB DW Update performs data-verification overhead before sending
changes to the database, looking for issues such as "row already changed", etc.
All rows in the DW are new, but I am not sure whether inserts trigger these checks too.
If so, would committing the table truncation before the Update speed up the process?
Of course, I would then lose the ability to fall back on failure. Worth the risk if it works.
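The variant I am asking about would look roughly like this (again only a sketch):

    // commit the truncation first, then update
    DELETE FROM daily_load_table USING SQLCA;
    COMMIT USING SQLCA;          // the table is now empty even if the load below fails

    IF dw_daily.Update() = 1 THEN
        COMMIT USING SQLCA;
    ELSE
        ROLLBACK USING SQLCA;    // rolls back the inserts only; the old rows are already gone
    END IF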
Also, I have never had a "duplicate key" error from the incoming data, so for this DW,
if I add a computed dummy key, tell the DW it is the primary key, and mark it as not
updateable in the update properties (so it is not saved to the database), would that
eliminate the extra overhead in PB?
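What I have in mind, very roughly and untested, is flipping the key flags with dot notation
on just this DW (column names are placeholders, and I have not verified that a dummy column
can legally carry the key flag):

    // untested sketch: make a dummy column the sole "key" and keep it out of the generated SQL
    dw_daily.Modify("dummy_key.Key=Yes")
    dw_daily.Modify("dummy_key.Update=No")    // never sent to the database

    // the real four key columns would no longer be treated as the key by the DW
    dw_daily.Modify("col1.Key=No")
    dw_daily.Modify("col2.Key=No")
    dw_daily.Modify("col3.Key=No")
    dw_daily.Modify("col4.Key=No")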
Thoughts, Ideas, Suggestions?