Ron Calder · PowerBuilder · Wednesday, 16 October 2019 17:20 UTC

I have an Excel workbook with 24 columns by 25,203 rows, arranged in a tree structure.

I'm using PB 2019 to open the Excel workbook and read each row. Right now I'm just inserting each row into the Oracle table and then reading the next row in a loop. I have noticed that as I read each row, the amount of memory consumed by my application (whether run standalone or in the IDE) keeps growing. Once it reaches around 1.7 GB, the IDE or the app shuts down with no error message. I have not had an issue like this before when reading large workbooks.

Any ideas for me to look at? Would a GarbageCollect() call help every so many rows processed?


Thank you in advance,


Géza Bartha · PowerBuilder · Friday, 18 October 2019 14:32 UTC · #1



To read Excel this way, you need the Excel ODBC driver:

trans.DBMS = "ODBC"
trans.AutoCommit = False
trans.DBParm = "ConnectString='DSN=ODBCTOEXCELFILE;UID=;PWD='"

// or, using the default "Excel Files" DSN with an explicit file path:
trans.DBParm = "ConnectString='DSN=Excel Files;DBQ=x:\path\excelfile.xlsx'"

Then you can use the classic embedded-SQL cursor approach:


// The sheet name must end with a $ sign;
// the first Excel row contains the column headers.
DECLARE cursorname CURSOR FOR
    SELECT "excel first col header name", "excel second col header name"
    FROM "pagename$" USING trans;

// Declare destination variables for the fields
string ls_field1, ls_field2

// Execute the SELECT statement
OPEN cursorname;

// Fetch the first row from the result set
FETCH cursorname INTO :ls_field1, :ls_field2;

// Loop through the result set until exhausted
DO WHILE trans.sqlcode = 0
    // do anything with the variables, e.g. display field1
    MessageBox("Found a field1!", ls_field1)

    // Fetch the next row from the result set
    FETCH cursorname INTO :ls_field1, :ls_field2;
LOOP

// All done; close the cursor
CLOSE cursorname;







Luiz Ribeiro · PowerBuilder · Thursday, 17 October 2019 16:45 UTC · #2

Hi Ron.


Here is an example of how to load Excel data into the database:


uf_exceltodatastore (
OleObject aoo_sheet,
Datastore ads_data,
String as_range,
Integer ai_totlines,
Integer ai_totcolumns) -> None

aoo_sheet: active sheet
ads_data: datastore used to update database
as_range: initial position
ai_totlines: number of lines
ai_totcolumns: number of columns


STRING ls_ClipBoard

// store the current clipboard contents
ls_ClipBoard = Clipboard()

// copy the sheet data to the clipboard (Excel OLE call;
// one way to select the block is to resize from the starting cell)
aoo_sheet.Range(as_range).Resize(ai_totlines, ai_totcolumns).Copy()

// paste the clipboard into the datastore
ads_data.ImportClipboard()

// update the database
ads_data.Update()

// restore the original clipboard contents
Clipboard(ls_ClipBoard)
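A hypothetical call might look like the sketch below. The workbook path is taken from the other answer; the DataWindow object name `d_my_table` and the starting range `"A2"` are illustrative assumptions, and `uf_exceltodatastore` is the function described above:

```powerscript
OLEObject lole_excel
Datastore lds_data

lole_excel = CREATE OLEObject
IF lole_excel.ConnectToNewObject("excel.application") = 0 THEN
    lole_excel.Workbooks.Open("x:\path\excelfile.xlsx")

    lds_data = CREATE Datastore
    lds_data.DataObject = "d_my_table"   // hypothetical DataWindow object mapped to the target table
    lds_data.SetTransObject(SQLCA)

    // 25,203 data rows by 24 columns, starting at A2 (row 1 holds the headers)
    uf_exceltodatastore(lole_excel.ActiveSheet, lds_data, "A2", 25203, 24)

    lole_excel.Quit()
    lole_excel.DisconnectObject()
END IF
DESTROY lole_excel
DESTROY lds_data
```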




Chris Pollach @Appeon · PowerBuilder · Wednesday, 16 October 2019 17:23 UTC · #3

Hi Ron;

  Is this interaction with Excel via a direct OLE mechanism?

Regards ... Chris

Chris Pollach @Appeon · Wednesday, 16 October 2019 18:22 UTC
Hi Ron;

That is great news!

All the best ... Chris
Ron Calder · Thursday, 17 October 2019 11:12 AM UTC

I added the GarbageCollect() function in the loop that reads the rows from Excel using the OLE commands, and that solved the memory problem with no loss in speed when compiled as a 32-bit app. The application stays at 27.1 MB in Task Manager and never grows. The 64-bit compiled version runs much slower, though.

Cheers, Ron ;-)
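The fix Ron describes can be sketched roughly like this. `GarbageCollect()` and `Mod()` are real PowerScript calls; the OLE cell read, the `lole_sheet` variable, and the every-1,000-rows threshold are illustrative assumptions:

```powerscript
string ls_value
long ll_row
// lole_sheet is assumed to be an OLEObject already connected to the worksheet

FOR ll_row = 2 TO 25204   // row 1 holds the headers; 25,203 data rows
    // read a cell via OLE (illustrative; the real loop reads all 24 columns)
    ls_value = String(lole_sheet.Cells(ll_row, 1).Value)

    // ... INSERT the row into the Oracle table here ...

    // periodically free the OLE proxy objects accumulated so far
    IF Mod(ll_row, 1000) = 0 THEN GarbageCollect()
NEXT
```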
Chris Pollach @Appeon · Thursday, 17 October 2019 16:08 UTC
Hi Ron .. Thank you for that valuable feedback. I am really surprised that the 64-bit EXE ran that much slower ... weird! Glad that the GarbageCollect() command did the trick memory-wise. :-)