Roland Smith | PowerBuilder | Tuesday, 27 July 2021 20:41 UTC

My Environment: PB 2019 Build 2170.

I have a small app that reads a file into a blob, compresses the blob using zlib and then sends it to a web service.

I am having all sorts of problems when the file is very large.

The smaller file I am testing with is 80,665,088 bytes.

I was using ReadFileEx to read it into a blob in one call. It gave me an 'object reference' error. I've never seen that with a system function.

I changed the code to use Windows API functions to read the file. Now it is able to read in and compress the file.

The larger file is 167,749,120 bytes.

PowerBuilder just disappears in the read function. This is how I allocate the read buffer:

Blob lblb_filedata
Byte lby_buffer[]
ULong lul_length

lul_length = FileLength(as_filename)

// touching the last element forces the dynamic Byte array out to the full file length
lby_buffer[lul_length] = 0
lblb_filedata = Blob(lby_buffer)
SetNull(lby_buffer)

 

Does anyone have any ideas on a different approach?

Olan Knight | PowerBuilder | Wednesday, 28 July 2021 19:29 UTC | #1

Roland, we found a way to read data in large chunks. We selected 20,000 rows at a time. Here the source is a DataWindow (dw_source), but the same principle should work with FileReadEx(), reading a fixed amount per call.


STRING   ls_NULL[], ls_SOURCE_DATA[], ls_status
LONG     ll_row_start, ll_row_end, ll_total_rows

// Expand the target DW_INPUT's storage capacity
ls_status = dw_input.Modify("datawindow.storagepagesize='LARGE'")

ll_total_rows = dw_source.RowCount()
ll_row_start  = 1

DO WHILE ll_row_start <= ll_total_rows
	// Next group of (up to) 20,000 rows
	ll_row_end = ll_row_start + 19999
	IF ll_row_end > ll_total_rows THEN ll_row_end = ll_total_rows

	// Init the data for the next group of rows
	ls_SOURCE_DATA = ls_NULL

	// Grab the next group of rows into the string array
	ls_SOURCE_DATA = dw_source.Object.SOURCE_DATA[ll_row_start, ll_row_end]

	// << process the data in the string array >>

	// Save the analyzed and possibly updated data into DW_INPUT (same row range here)
	dw_input.Object.SOURCE_DATA[ll_row_start, ll_row_end] = ls_SOURCE_DATA

	ll_row_start = ll_row_end + 1
LOOP


This works very well, but with REALLY big files we apparently run out of space, and the entire app just vanishes.


Good Luck,

Olan

Miguel Leeuwe | PowerBuilder | Wednesday, 28 July 2021 01:41 UTC | #2

Hi Roland,

If it's any help: all I know is that PB has its own "intermediate" layer of memory allocation. The way memory gets assigned is non-transparent and arbitrary. Even if you have 1 GB of available RAM, it might still blow up on reading only 400 MB.

FileReadEx() (you say "ReadFileEx()"? I guess that's the Windows API) has a parameter that lets you set how much data you want to read per call. So maybe a solution would be to read your file in a loop of 100 MB chunks. It still doesn't guarantee PB will allocate enough memory for the blob you assign the data to, but it's worth a try.
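Something along these lines (just an untested sketch; the 100 MB chunk size and as_filename are placeholders):

// Read the file in chunks with FileReadEx() and append each chunk to one blob
blob lblb_chunk, lblb_filedata
long ll_file, ll_bytes

ll_file = FileOpen(as_filename, StreamMode!, Read!, LockRead!)
IF ll_file = -1 THEN RETURN

DO
	// the optional third argument is the maximum number of bytes to read per call (100 MB here)
	ll_bytes = FileReadEx(ll_file, lblb_chunk, 104857600)
	IF ll_bytes <= 0 THEN EXIT   // -100 = end of file, -1 = error
	lblb_filedata = lblb_filedata + BlobMid(lblb_chunk, 1, ll_bytes)
LOOP WHILE TRUE

FileClose(ll_file)

Of course the chunks still end up concatenated into one big blob, so this may only move the point where memory runs out, but at least the individual reads are smaller.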

regards.

Comment from Miguel Leeuwe, Wednesday, 28 July 2021 01:43 UTC:
(note: if your file is bigger than 2GB, you'd have to use FileLength64() to get the length)
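For example (a small sketch, using the same as_filename variable from the original post):

longlong lll_length
// FileLength() returns a signed 32-bit Long, so it cannot report sizes over 2 GB
lll_length = FileLength64(as_filename)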