Dhanya K - PowerBuilder - Wednesday, 29 June 2022 12:38 PM UTC

Hello Team,


We use PowerBuilder 2019 R2 for our application, which generates reports that can also be saved as CSV. Initially this was done with DataWindow.SaveAs. When we received a requirement to change the delimiter in the CSV, we switched from SaveAs to SaveAsFormattedText. The function works for small result sets, but for huge result sets the column order in the CSV changes. Could you please tell us how we can solve this issue?
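For reference, the switch described above boils down to something like the following hedged PowerScript sketch. The control name `dw_report`, the file path, and the `|` delimiter are placeholder assumptions, and it assumes the filename/encoding/delimiter argument form of SaveAsFormattedText:

```powerscript
// Export the DataWindow with a custom delimiter instead of SaveAs(CSV!).
// dw_report and the path are placeholders; check the return code,
// which by PowerBuilder convention is 1 on success.
integer li_rc

li_rc = dw_report.SaveAsFormattedText("C:\temp\report.csv", EncodingUTF8!, "|")
IF li_rc <> 1 THEN
	MessageBox("Export", "SaveAsFormattedText failed (rc = " + String(li_rc) + ")")
END IF
```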

Accepted Answer
John Fauss - Wednesday, 29 June 2022 13:13 UTC

Greetings, Dhanya -

That sounds to me like a bug. I suggest you report it.

Including a reproducible test case/example will be very helpful.

Regards, John

Dhanya K - Friday, 1 July 2022 09:09 AM UTC

Thank you, John, for the info.

We reported it as a bug, and the response we received is that the issue also exists in PB 12.6. It seems there is currently no workaround for it.

Miguel Leeuwe - Saturday, 2 July 2022 04:26 AM UTC

Hi Chris, yes, that was my initial thought too, but how do you distinguish the ',' separators from any commas within the field contents before replacing them?

As for 32 kb chunks, can't you just use a FileReadEx() and manipulate the blob? (of course there's still a limit).
Chris Pollach @Appeon - Saturday, 2 July 2022 13:30 UTC
Yes, you would need to have a little smarts in the utility app to replace only the commas where appropriate.
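Those "little smarts" can be as simple as a quote-state scan over each record: a comma only counts as a field separator when the scanner is not inside a double-quoted value. A hedged PowerScript sketch (`f_replace_delims` is a hypothetical helper name, and it does not handle fields containing embedded line breaks):

```powerscript
// Hypothetical helper: replace separator commas in one CSV record with a
// new delimiter, leaving commas inside double-quoted field values alone.
global function string f_replace_delims (string as_line, string as_new_delim)
string  ls_out = ""
string  ls_ch
boolean lb_in_quotes = FALSE
long    ll_i, ll_len

ll_len = Len(as_line)
FOR ll_i = 1 TO ll_len
	ls_ch = Mid(as_line, ll_i, 1)
	IF ls_ch = '"' THEN
		// An escaped quote ("") toggles the state twice, so it stays correct
		lb_in_quotes = NOT lb_in_quotes
		ls_out = ls_out + ls_ch
	ELSEIF ls_ch = "," AND NOT lb_in_quotes THEN
		ls_out = ls_out + as_new_delim   // a real field separator
	ELSE
		ls_out = ls_out + ls_ch          // field content, copied as-is
	END IF
NEXT
RETURN ls_out
end function
```

Note that character-by-character string concatenation gets slow in PowerScript on very long records, so a production version might build the output in larger pieces.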

That's a good question about the FileReadEx() command. It would work well on small files, but would blow the PB app out of the water memory-wise on very large files. The FileRead() and FileWrite() commands, on the other hand, let you process super large files in 32K chunks. I wrote such a utility for a PB customer to process Bell Canada files > 2 GB (even 10 GB+), and this technique worked flawlessly. ;-)
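For what it's worth, the 32K streaming pattern Chris describes looks roughly like this hedged sketch. The file names are placeholders, and any per-chunk transform has to carry state across chunk boundaries, since a quoted field can be split between two reads:

```powerscript
// Copy (or transform) a very large file in 32 KB chunks so the whole
// file never has to be held in memory at once.
long ll_src, ll_dst, ll_bytes
blob lb_chunk

ll_src = FileOpen("C:\temp\huge_in.csv",  StreamMode!, Read!,  LockRead!)
ll_dst = FileOpen("C:\temp\huge_out.csv", StreamMode!, Write!, LockWrite!, Replace!)

IF ll_src > 0 AND ll_dst > 0 THEN
	DO
		// In stream mode, FileRead returns up to 32,765 bytes per call
		// and -100 once the end of the file is reached
		ll_bytes = FileRead(ll_src, lb_chunk)
		IF ll_bytes > 0 THEN
			// ...transform lb_chunk here (e.g. swap delimiters)...
			FileWrite(ll_dst, lb_chunk)
		END IF
	LOOP WHILE ll_bytes > 0
END IF

IF ll_src > 0 THEN FileClose(ll_src)
IF ll_dst > 0 THEN FileClose(ll_dst)
```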
Miguel Leeuwe - Sunday, 3 July 2022 00:05 UTC
Yes, it truly is a shame that there's no guarantee PowerBuilder will be able to address the memory, even if there is enough free memory on the PC.

As I understand it, PB is pretty random in assigning its own heap, so indeed, with files bigger than maybe 200 or 400 MB things start blowing up. Once again, I think a C# DLL might help out enormously with doing stuff like this (without the memory-allocation problem).

32 KB blocks might be solid, but they're just so much slower!

