I have an Excel workbook with 24 columns by 25,203 rows, arranged in a tree structure.
I'm using PB 2019 to open the Excel workbook and read each row. Right now I'm just inserting each row into the Oracle table and then reading the next row in a loop. I've noticed that as each row is read, the memory consumed by my application (or by the IDE, if I run it from there) keeps growing. Once it reaches around 1.7 GB, the IDE or the app shuts down with no error message. I have not had an issue like this before when reading large workbooks.
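To make the problem concrete, this is roughly the loop pattern in use, a minimal sketch only: the file path, sheet access, and table/column names are hypothetical, and the real code reads all 24 columns.

```powerscript
// Sketch of the read-and-insert loop (names hypothetical).
OLEObject ole_excel
long ll_row
string ls_value

ole_excel = CREATE OLEObject
IF ole_excel.ConnectToNewObject("Excel.Application") < 0 THEN RETURN

ole_excel.Workbooks.Open("C:\data\tree.xlsx")

FOR ll_row = 2 TO 25203
    // Each Cells() call goes through COM/OLE to fetch one value.
    ls_value = String(ole_excel.ActiveSheet.Cells(ll_row, 1).Value)

    INSERT INTO my_table (col1) VALUES (:ls_value);
NEXT
COMMIT;

ole_excel.Application.Quit()
ole_excel.DisconnectObject()
DESTROY ole_excel
```

It's during that FOR loop that memory climbs steadily until the crash.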
Any ideas on what I should look at? Would a GarbageCollect() call every so many rows help?
Thank you in advance,
Ron