Rick Domogalik | PowerBuilder | Thursday, 26 May 2022 14:53 UTC

I am trying to copy files such as Word documents, images, Excel spreadsheets, etc. into a table, store them temporarily, and then have a routine that extracts them back to files when the user requests. I have defined a table with a column called file_data (type varbinary(max)) to put the blob in.

Here is my file-to-blob function. It seems to work for putting the file into the blob column in the table, but I think only the first 32 KB is being stored, even though my tests at runtime show that ll_filelen is the proper length of the file I select to import.

// Local declarations (inferred from usage; ablb_data is the function's blob argument,
// is_current_id is an instance variable)
integer li_value, li_FileNo, li_Reads, li_i
longlong ll_FileLen
string ls_docname, ls_named
blob lblb_temp

li_value = GetFileOpenName("Select File", ls_docname, ls_named, " ", "All Files (*.*), *.*")
ll_FileLen = FileLength64(ls_docname)

If li_value <> 1 Then
   Return lblb_temp
End If

li_FileNo = FileOpen(ls_docname, StreamMode!, Read!)
If li_FileNo < 0 Then Return lblb_temp

// FileRead returns at most 32,765 bytes per call in StreamMode!,
// so work out how many reads are needed to cover the whole file
If ll_FileLen > 32765 Then
   If Mod(ll_FileLen, 32765) = 0 Then
      li_Reads = ll_FileLen / 32765
   Else
      li_Reads = (ll_FileLen / 32765) + 1
   End If
Else
   li_Reads = 1
End If

For li_i = 1 To li_Reads
   If FileRead(li_FileNo, lblb_temp) = -1 Then Return lblb_temp
   ablb_data = ablb_data + lblb_temp
Next

FileClose(li_FileNo)

// Insert the metadata row; the blob itself is written afterwards with UpdateBlob
string ls_id
long ll_row
dw_2.settrans(sqlca)
ll_row = dw_2.insertrow(0)
ls_id = lf_sp_get_id_num()
is_current_id = ls_id
dw_2.setitem(ll_row, 1, ls_id)
dw_2.setitem(ll_row, 2, ls_named)
dw_2.setitem(ll_row, 3, ls_docname)
dw_2.update()

messagebox('blob len', string(len(ablb_data)))

connect using sqlca;
UpdateBlob EAG_FS_FILE_TRANSFER set file_data = :ablb_data where file_id = :is_current_id using sqlca;

if sqlca.sqlcode <> 0 then
   messagebox("updateblob error", sqlca.sqlerrtext)
end if

commit;


I then use this function to read the file back out, and it always comes out as just the first 32 KB of the file. So I am not sure whether this is an issue with how I am putting the blob into the table or with how I am reading it back out. Anyone see where I messed up?

// Local declarations (inferred from usage)
integer li_selected, li_File
long ll_FileLen, ll_Loops, ll_Step
string ls_file, ls_id, ls_open_path, ls_new_file
blob lb_file

if dw_2.getrow() > 0 then
   li_selected = dw_2.getrow()
   ls_file = dw_2.getitemstring(li_selected, 'file_name')
   ls_id = dw_2.getitemstring(li_selected, 'file_id')

   connect using sqlca;
   SELECTBLOB file_data INTO :lb_file FROM EAG_FS_FILE_TRANSFER WHERE file_id = :ls_id;
   messagebox('button len', string(len(lb_file)))

   if not isnull(lb_file) then
      ls_open_path = 'c:\temp\blob\'
      if not fileexists(ls_open_path) then
         createdirectory(ls_open_path)
      end if
      ls_new_file = ls_open_path + ls_file

      ll_FileLen = Len(lb_file)
      messagebox('len', string(ll_filelen))
      li_File = FileOpen(ls_new_file, StreamMode!, Write!, LockWrite!, Append!)
      If li_File > 0 Then
         // Determine how many times to call FileWrite
         If ll_FileLen > 32765 Then
            If Mod(ll_FileLen, 32765) = 0 Then
               ll_Loops = ll_FileLen / 32765
            Else
               ll_Loops = (ll_FileLen / 32765) + 1
            End If
         Else
            ll_Loops = 1
         End If

         For ll_Step = 1 To ll_Loops
            FileWrite(li_File, BlobMid(lb_file, ((ll_Step - 1) * 32765) + 1, 32765))
         Next
         // Close the output file
         FileClose(li_File)
      else
         messagebox('not found', '')
      end if
   end if
else
   messagebox('Invalid Row', 'select a valid attachment to view')
end if


John Fauss | # 1 | Thursday, 26 May 2022 15:01 UTC

What DBMS and PB version/release/build are you using, Rick?

Maybe try using FileReadEx & FileWriteEx PowerScript functions to read/write the entire file with one call instead of dealing with 32K segments?
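
For reference, a minimal sketch of that approach (the file and blob variable names here are illustrative, not from Rick's code):

    // Read an entire file into a blob with one FileReadEx call
    integer li_file
    long ll_bytes
    blob lblb_data

    li_file = FileOpen(ls_docname, StreamMode!, Read!)
    If li_file > 0 Then
       // FileReadEx is not subject to FileRead's 32,765-byte-per-call limit
       ll_bytes = FileReadEx(li_file, lblb_data)
       FileClose(li_file)
    End If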

Roland Smith | Thursday, 26 May 2022 16:33 UTC
I think this is the likely issue. The PBMaxBlobSize default is 1 MB, not 32 KB.
Tomas Beran | # 2 | Thursday, 26 May 2022 16:40 UTC

Hi Rick

You should check the size of the file before uploading it to the DB. FileLength64() returns a LongLong, not a Long, so be careful.

Use FileReadEx() / FileWriteEx() instead of splitting the blob.

Why do you save the file in Append! mode? This row in your code:

    li_File = FileOpen(ls_new_file, StreamMode!, Write!, LockWrite!, Append!)

Isn't this the issue? Not because of the 32 KB limit, but because your file is broken: by appending, you may have mixed in a few previously downloaded files.
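
A minimal sketch of that suggestion (assuming the blob has already been fetched into lb_file; Replace! instead of Append! so every extract starts from an empty file):

    // Write the whole blob back to disk with one FileWriteEx call
    integer li_file

    li_file = FileOpen(ls_new_file, StreamMode!, Write!, LockWrite!, Replace!)
    If li_file > 0 Then
       FileWriteEx(li_file, lb_file)
       FileClose(li_file)
    End If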

Chris Pollach @Appeon | # 3 | Thursday, 26 May 2022 15:15 UTC

Hi Rick;

   This could be your problem ...

https://community.appeon.com/index.php/qna/q-a/blobs-32k

Regards ... Chris

Chris Pollach @Appeon | Thursday, 26 May 2022 16:59 UTC
This ODBC setting has been used for decades in all PB releases. ;-)

You can also change this in the PBODB.ini file... You even documented it, Roland: https://community.appeon.com/index.php/qna/q-a/use-of-pbmaxblobsize-dbparm
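
As a hedged sketch of that DBParm (DSN and credentials are placeholders; per that Q&A, PBMaxBlobSize is specified as its own DBParm parameter rather than inside the ConnectString):

    SQLCA.DBMS = "ODBC"
    SQLCA.DBParm = "ConnectString='DSN=MyDSN;UID=user;PWD=secret',PBMaxBlobSize=2000000"
    CONNECT USING SQLCA;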
Rick Domogalik | Thursday, 26 May 2022 18:54 UTC
OK, so I got it to go past the 32 KB limit. It was a combo of updating the INI file and adding the correct parameter to the connect string.

SQLCA.DBParm = "ConnectString='DSN=DEMOdata;UID=dev;PWD=<******>;PBMaxBlobSize=1024000, PBNewSPInvocation='Yes', CallEscape='No', StripParmNames='Yes'"

Any file under 1 MB works fine. Any file larger than 1 MB but less than 10 MB causes SQLSTATE = S1010 (Function Sequence Error) and the blob is not put in the database. If I go larger than 10 MB, it crashes the IDE at runtime.

So I upped the max size (just put another 0 behind it). Same SQLSTATE error.

At least I am getting closer... :)
Chris Pollach @Appeon | Thursday, 26 May 2022 19:20 UTC
Pretty simple ... PBMaxBlobSize=1000000000 or PBMaxBlobSize=2000000000, i.e. up to 1 or 2 gigs! ;-)
Rick Domogalik | # 4 | Thursday, 26 May 2022 15:20 UTC

Using SQL Server 2017 and PB 2021. I tried using the Ex flavors and did not see a difference. I replaced all the reads and writes and still only see 32 KB being put in the table when I do the extract. It still gives me the correct length (299,669 for the file I am testing with) prior to the first save to the table. Then when I call the extract, it says the blob length is 32 KB.

Still not sure if it is saving properly into the table.

Rick Domogalik | # 5 | Thursday, 26 May 2022 15:47 UTC

After a few more attempts, I am still seeing the 32 KB limit. Here is my connect string. Could one of the other parameters be messing this up?


SQLCA.DBParm = "ConnectString='DSN=DemoDB;UID=dev;PWD=<******>;PBMaxBlobSize=2000000,PBNewSPInvocation='Yes',CallEscape='No', StripParmNames='Yes'"

John Fauss | # 6 | Thursday, 26 May 2022 19:42 UTC

Rick, what version of the Microsoft ODBC Driver for SQL Server are you using (18.0.1.1 is the newest/latest)?

Have you considered testing a native interface (MSOLEDBSQL or SQL Server Native Client) instead of using the ODBC "middleware"... if only to see if it works for you?
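
For what that test might look like, a minimal sketch of a native MSOLEDBSQL connection (server, database, and credentials are placeholders):

    SQLCA.DBMS = "MSOLEDBSQL"
    SQLCA.ServerName = "MyServer"
    SQLCA.Database = "MyDatabase"
    SQLCA.LogId = "dev"
    SQLCA.LogPass = "secret"
    SQLCA.AutoCommit = False
    CONNECT USING SQLCA;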

Steen Jakobsen | # 7 | Wednesday, 5 October 2022 05:28 UTC

I had the same problem... and nothing really worked!


So I upgraded to v18 of the MS ODBC driver (released August 8, 2022) and that worked GREAT!

Download ODBC Driver for SQL Server - ODBC Driver for SQL Server | Microsoft Learn


Here is my connect string:

This.DBParm = "ConnectString='DSN=" + ls_dsn_name + ";UID=" + user_name + ";PWD=" + password + ";PBMaxBlobSize=2147483647, PBMaxTextSize=2147483647',Disablebind=1,DelimitIdentifier='No',CallEscape='NO',ConnectOption='SQL_DRIVER_CONNECT,SQL_DRIVER_NOPROMPT'"


.. Just my 2 cents of feedback :-)

//Steen

Steen Jakobsen | Wednesday, 5 October 2022 10:15 UTC
Just verified at a client site that has huge blobs. Now 20 MB in one blob is no problem :-)