- David Motty
- PowerBuilder
- Tuesday, 3 September 2019 06:58 PM UTC
Hi!
Is there any way to get FileReadEx to read a file larger than 300 MB in size? Currently, any attempt to do this just results in an error (my test file is 450 MB). I know that creates a very big object in memory, but the function is supposed to be able to take a blob as an argument, and blobs can be very large. So in practice, this function cannot really handle a blob (at least not a big one).
Our application has to read files from the user's computer and store them in the database for other users to access. Most of the files are small, but some are 500 MB+ video files. I understand that this issue can be worked around by reading the file in chunks and writing part of it to the database on each pass through a loop (which is what I have to do in my current version, 2017 R3). But I was wondering: will this limitation be fixed in a future version, or will the workaround of breaking the file into chunks and updating the database in a loop still be necessary? We try to avoid manipulating the files in any way, as they may be needed as evidence and are not supposed to be altered (even though this is not really manipulation, an objection could be raised)...
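For reference, this is roughly the kind of chunked loop I mean (a minimal, untested sketch; the file path and 1 MB chunk size are arbitrary, and of_append_chunk_to_db is just a placeholder since the actual append SQL depends on the DBMS):

// Rough sketch of the chunked workaround in PowerScript.
// FileOpen/FileReadEx/FileClose are standard functions; the chunk size
// and of_append_chunk_to_db() are placeholders for illustration only.
integer li_file
long    ll_bytes
long    ll_chunk = 1048576        // read 1 MB per pass (arbitrary size)
blob    lb_chunk

li_file = FileOpen("C:\evidence\video.mp4", StreamMode!, Read!)
IF li_file = -1 THEN RETURN -1

DO
    // The optional third argument of FileReadEx limits each read to
    // ll_chunk bytes, so the whole file is never held in memory at once
    ll_bytes = FileReadEx(li_file, lb_chunk, ll_chunk)
    IF ll_bytes > 0 THEN
        // Append this chunk to the blob column in the database; the exact
        // SQL (UPDATEBLOB, DBMS_LOB, UPDATE ... .WRITE, etc.) depends on
        // the database, so this helper is hypothetical
        of_append_chunk_to_db(lb_chunk, ll_bytes)
    END IF
LOOP WHILE ll_bytes > 0

FileClose(li_file)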
Also, will the exact size limitation of FileReadEx be documented anywhere?
Thanks in advance for your time.
Dave...