I have an Excel workbook with 24 columns and 25,203 rows arranged in a tree structure.
I'm using PB 2019 to open the Excel workbook and read each row. Right now I'm just inserting each row into the Oracle table and then reading the next row in a loop. I've noticed that as I read each row, the memory consumed by my application (or by the IDE, if I run it there) keeps growing. Once it reaches around 1.7GB, the IDE or the app shuts down with no error message. I have not had an issue like this before when reading large workbooks.
Any ideas on what to look at? Would a GarbageCollect() call every so many rows processed help?
Thank you in advance,
Ron
That is great news!
All the best ... Chris
I added a GarbageCollect() call in the loop that reads the rows from Excel using the OLE commands, and that solved the memory problem with no loss in speed when compiled as a 32-bit app. The application stays at 27.1MB in Task Manager and never grows. The 64-bit compiled version is much slower when it runs.
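For anyone finding this thread later, here is a minimal PowerScript sketch of the pattern Ron describes: reading rows over OLE and calling GarbageCollect() periodically inside the loop. The object names, file path, and the batch size of 500 are illustrative assumptions, not Ron's actual code.

```
// Sketch only: lole_excel, lole_sheet, the path, and the batch
// size are placeholders, not the original implementation.
OLEObject lole_excel, lole_sheet
long ll_row, ll_rows

lole_excel = CREATE OLEObject
IF lole_excel.ConnectToNewObject("Excel.Application") < 0 THEN RETURN

lole_excel.Workbooks.Open("C:\data\workbook.xlsx")
lole_sheet = lole_excel.ActiveWorkbook.Worksheets[1]
ll_rows = 25203

FOR ll_row = 1 TO ll_rows
    // Read the 24 cell values for this row via lole_sheet.Cells[...]
    // and INSERT them into the Oracle table here.

    // Periodic GarbageCollect() frees the OLE proxy objects created
    // by the cell reads, keeping the working set flat.
    IF Mod(ll_row, 500) = 0 THEN GarbageCollect()
NEXT

lole_excel.ActiveWorkbook.Close(FALSE)
lole_excel.Quit()
lole_excel.DisconnectObject()
DESTROY lole_excel
```

Each OLE property access creates a small proxy object on the PB side, which is why memory climbs steadily until the runtime forces a collection.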
Cheers, Ron ;-)