Olan Knight
  PowerBuilder
  Tuesday, 7 July 2020 17:03 UTC

PB2019R2
Windows 10, 64 bit platform


Summary:
   The dw.ImportFile() command is returning -15 on files less than 100K in size.


Details:
   This command is part of the simple, one-PBL utility we've been testing as the first code converted from PBv12.1.

   We have text files that can range in size from 1KB up, with any number of rows from 1 to over a million.

   In PBv12.1, this works in the executable, but not in debug.
   In PB2019, this does not work in either the EXE or in debug.
      We've tried:
         ll_rc = dw_input.ImportFile (Text!, ls_file)
         ll_rc = dw_input.ImportFile (Text!, ls_file, 1, 1000)
         ll_rc = dw_input.ImportFile (Text!, ls_file, 1, 10, 1, 1)
         ll_rc = dw_input.ImportFile (Text!, ls_file, 1, 10, 1, 1, 1)
  

   Meanwhile, other files that are MUCH larger occasionally seem to work.

Q1. Am I doing something incorrectly?

Q2. Is there a way to make large files importable in PB2019 using dw.ImportFile()?

Q3. Is there a replacement or substitute command/call/function for dw.ImportFile() in PB2019 that will handle large text files?


NOTE:  We are currently experimenting with FileReadEx().


Thank You,

Olan

Miguel Leeuwe
  Wednesday, 8 July 2020 11:26 UTC
  # 1

If everything else fails, what about this?

- use fileReadEx

- convert the blob to string

- assign the string to an Any variable

- assign the any variable to dw_1.object.Data

I've never tried it, but it might work? Something like this rough sketch:
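(A quick, untested PowerScript illustration of those four steps. It assumes a single-column string DataWindow, as in Olan's 210-byte-per-row file, Windows "~r~n" line endings and an ANSI text file; the path and the control name dw_1 are placeholders.)

integer li_file
long    ll_start = 1, ll_pos, ll_row = 0
blob    lblb_data
string  ls_data, ls_file = "C:\temp\big_input.txt"   // hypothetical path
string  ls_lines[]
any     la_any

li_file = FileOpen(ls_file, StreamMode!, Read!, LockRead!)
IF li_file < 0 THEN RETURN

FileReadEx(li_file, lblb_data)               // no length argument = read the whole file
FileClose(li_file)

ls_data = String(lblb_data, EncodingANSI!)   // blob -> string

// Object.Data needs one entry per row, so split the text into an array first
ll_pos = Pos(ls_data, "~r~n", ll_start)
DO WHILE ll_pos > 0
   ll_row ++
   ls_lines[ll_row] = Mid(ls_data, ll_start, ll_pos - ll_start)
   ll_start = ll_pos + 2
   ll_pos = Pos(ls_data, "~r~n", ll_start)
LOOP
IF ll_start <= Len(ls_data) THEN
   ll_row ++
   ls_lines[ll_row] = Mid(ls_data, ll_start)
END IF

la_any = ls_lines
dw_1.Object.Data = la_any   // only makes sense for a single-column DWO; untested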

 

regards

  Olan Knight
  Wednesday, 8 July 2020 13:40 UTC
Absolutely worth a try. Thanks for the idea!
  Miguel Leeuwe
  Wednesday, 8 July 2020 14:29 UTC
YW, it's just something that might work; maybe there are better suggestions made by others.

You'd have to somehow assign all of the obtained data to an array first, I guess.

See here for more information on how to work with the Data property: http://www.angelfire.com/home/jasonvogel/pb_technique_data_and_dot_notation.html

cheers
Olan Knight
  Tuesday, 7 July 2020 18:36 UTC
  # 2

OK....

1. Import into a DataStore instead of a DataWindow Control (DWC) to see if the visual component is causing the issue.
Did that, no difference.

2. Ensure the DataWindow (object) does not contain a sort specification. If sorting is needed, apply the sort format and sort the data after import.
Will look.

3. Try an external source DataWindow (object) to see if the database connection/component is causing the issue.
Will try that.

4. Verify that the column specifications in the DataWindow (object), namely data type and (string) length are correct and that there is no chance for truncation to occur during the import.
Did that.

5a. How are the fields delimited? Commas? Tabs?
This is a simple text file with a single 210-byte column in it.

5b. Can a file that fails to import be imported into Excel, as an example?
Will try that.

6. Have you searched the PB bug reports for any known problems of a similar nature?
Yes, and I've looked through The Purveyor Of All Knowledge: Google. The best suggestion I could find was the FileReadEx one.


Thank you, John!

Chris Pollach @Appeon
  Tuesday, 7 July 2020 18:06 UTC
  # 3

Hi Olan;

  That is not an uncommon issue for PB Apps when you are processing 1+M rows. Yes, the -15 indicates that you ran out of memory to continue (aka File Size is too big).

  When I have had this challenge in the past, I "chunked" the import by using FileRead (and not the FileReadEx command). With FileRead, I can bring in 32,765 bytes of data, update a DWO within a DS/DC, post the updates to the DBMS (if required), commit, reset the DWO buffers, and then loop back for the next 32,765 bytes. Repeat this process until the 1+M rows of data have been processed. For restarting, I keep the last file position in an INI file for a restart/continue (if required).
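A rough PowerScript sketch of that chunk-and-import loop, just to illustrate the idea. The DataWindow object name d_staging, the file path, the ANSI encoding and the use of SQLCA are assumptions, and real code needs fuller error and restart handling:

integer    li_file
long       ll_bytes, ll_break
blob       lblb_chunk
string     ls_file = "C:\temp\big_input.txt", ls_chunk, ls_carry = ""
datastore  lds

lds = CREATE datastore
lds.DataObject = "d_staging"        // hypothetical single-column DWO
lds.SetTransObject(SQLCA)

li_file = FileOpen(ls_file, StreamMode!, Read!, LockRead!)
IF li_file < 0 THEN RETURN

DO WHILE TRUE
   ll_bytes = FileRead(li_file, lblb_chunk)     // at most 32,765 bytes per call
   IF ll_bytes <= 0 THEN EXIT                   // 0 = nothing read, -100 = end of file

   ls_chunk = ls_carry + String(lblb_chunk, EncodingANSI!)

   // Only import whole lines; carry any trailing partial line into the next pass
   ll_break = LastPos(ls_chunk, "~n")
   IF ll_break > 0 THEN
      ls_carry = Mid(ls_chunk, ll_break + 1)
      ls_chunk = Left(ls_chunk, ll_break)
   ELSE
      ls_carry = ls_chunk
      CONTINUE
   END IF

   lds.ImportString(Text!, ls_chunk)

   // Post this chunk to the DBMS (if required), then free the DWO buffers
   IF lds.Update() = 1 THEN
      COMMIT USING SQLCA;
   END IF
   lds.Reset()
LOOP

// Any remaining partial last line
IF Len(ls_carry) > 0 THEN
   lds.ImportString(Text!, ls_carry)
   IF lds.Update() = 1 THEN
      COMMIT USING SQLCA;
   END IF
END IF

FileClose(li_file)
DESTROY lds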

Hey .. aren't you using the Oracle DBMS?

Regards ... Chris

  Olan Knight
  Tuesday, 7 July 2020 18:15 UTC
Ah, instead of line by line, get chunks of data at a time. Thanks for the tip!

And yes, we are an Oracle shop until we complete the migration to PostgreSQL.
  Chris Pollach @Appeon
  Tuesday, 7 July 2020 18:27 UTC
FWIW: I would investigate the Oracle "External Table" approach to importing external files. I have used this before and it's super, super fast, without any PB App interaction required.

FYI: https://docs.oracle.com/cd/B19306_01/server.102/b14215/et_concepts.htm

Food for thought #2.
John Fauss
  Tuesday, 7 July 2020 17:53 UTC
  # 4

Hi, Olan - 

Here are some ideas I would try if I were in your situation, just to try and ferret out the cause(s):

1. Import into a DataStore instead of a DataWindow Control (DWC) to see if the visual component is causing the issue (see the sketch after this list).
2. Ensure the DataWindow (object) does not contain a sort specification. If sorting is needed, apply the sort format and sort the data after import.
3. Try an external source DataWindow (object) to see if the database connection/component is causing the issue.
4. Verify that the column specifications in the DataWindow (object), namely data type and (string) length are correct and that there is no chance for truncation to occur during the import.
5. How are the fields delimited? Commas? Tabs? Can a file that fails to import be imported into Excel, as an example?
6. Have you searched the PB bug reports for any known problems of a similar nature?
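For suggestion #1, a minimal sketch could look like this (the DataWindow object name d_staging and the file path are placeholders, not from the post):

datastore lds_test
long      ll_rc
string    ls_file = "C:\temp\big_input.txt"   // hypothetical path

lds_test = CREATE datastore
lds_test.DataObject = "d_staging"             // the same DWO the control uses
ll_rc = lds_test.ImportFile(Text!, ls_file)

IF ll_rc < 0 THEN
   MessageBox("ImportFile", "DataStore import also failed: " + String(ll_rc))
ELSE
   MessageBox("ImportFile", String(ll_rc) + " rows imported into the DataStore")
END IF

DESTROY lds_test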

These are not, obviously, suggested permanent solutions, but they may help provide some insight.

HTH, John
