Solution: Use the RowsDiscard() function.
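For the record, a minimal sketch of the fix (the chunk bounds ll_row_start / ll_row_end are the same ones used in the code further down):

    LONG ll_rc
    // RowsDiscard() removes the rows outright: unlike DeleteRow() or a
    // RowsMove() to Delete!, the discarded rows are NOT placed in the
    // Delete! buffer, so they are truly gone from the DataWindow
    ll_rc = dw_source.RowsDiscard (ll_row_start, ll_row_end, Primary!)
    IF ll_rc = -1 THEN
        // the discard failed; the rows are still in the Primary! buffer
    END IF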
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
PB2019R3, build 2703
Windows 10, 64-bit platform
Small utility, non-PFC
We found a way to read the data in large chunks, 20,000 rows at a time.
STRING ls_NULL[], ls_SOURCE_DATA[], ls_status
LONG ll_row_start, ll_row_end, ll_target_start, ll_target_end

// Expand the target DW_INPUT's storage capacity
ls_status = dw_input.Modify ("datawindow.storagepagesize='LARGE'")

DO WHILE ll_row_start <= dw_source.RowCount ( )
    // Init the array for the next group of rows
    ls_SOURCE_DATA = ls_NULL
    // Grab the next group of rows from dw_source's Primary! buffer
    ls_SOURCE_DATA = dw_source.Object.SOURCE_DATA [ll_row_start, ll_row_end]
    << process data in the string array >>
    // Save the analyzed and possibly updated data into DW_INPUT
    dw_input.Object.SOURCE_DATA [ll_target_start, ll_target_end] = ls_SOURCE_DATA
    // advance ll_row_start, ll_row_end and the target bounds by 20,000
LOOP
This works very well, but with REALLY big files we apparently run out of memory, and the entire app just vanishes.
My thought was to try a block DELETE, but I want the data actually gone from the DW, not simply moved into the Delete! buffer.
The plan (sketched after this list):
1. Read 20,000 rows from dw_source.
2. Process the data in the string array.
3. Write the data to dw_input.
4. Delete that block of rows from dw_source:
   ll_rc = dw_source.RowsMove (ll_row_start, ll_row_end, Primary!, dw_source, 1, Delete!)
5. Every 100,000 rows, call GarbageCollect().
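Steps 4 and 5 as coded (a sketch; the running counter ll_rows_done is hypothetical, added for illustration):

    LONG ll_rc, ll_rows_done
    // Step 4: the "block delete" - RowsMove() only MOVES the block from
    // the Primary! buffer to the Delete! buffer of the same DataWindow
    ll_rc = dw_source.RowsMove (ll_row_start, ll_row_end, Primary!, dw_source, 1, Delete!)
    // Step 5: periodically ask the PB VM to reclaim memory
    ll_rows_done += (ll_row_end - ll_row_start + 1)
    IF Mod (ll_rows_done, 100000) = 0 THEN GarbageCollect ( )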
This plan failed. In fact, the code died well before the point at which it was abending previously. So it seems the RowsMove() technique did NOT actually remove the rows from the DW; it only shifted them into the Delete! buffer.
Is there a way to do a block DELETE?
dw_source is an external DataWindow and thus is not updateable.
When I delete the rows one at a time using
ll_rc = dw_source.DeleteRow (ll_row)
deleting 20,000 rows takes FOREVER!
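For completeness, the usual mitigation for a row-at-a-time delete loop (this typically helps, but it is not the real answer; see the solution at the top) is to suppress redraw and delete from the bottom up. A sketch:

    LONG ll_row
    dw_source.SetRedraw (FALSE)
    // deleting from the last row backwards means no following rows
    // need to be renumbered after each DeleteRow() call
    FOR ll_row = ll_row_end TO ll_row_start STEP -1
        dw_source.DeleteRow (ll_row)
    NEXT
    dw_source.SetRedraw (TRUE)
    // note: the deleted rows still land in the Delete! buffer, so this
    // does not release memory the way RowsDiscard() does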
Thanks again, John!