1. Roland Smith
  2. PowerBuilder
  3. Thursday, 16 November 2017 16:39 UTC

I have an app that works with different databases (SQL Server, ASE, SQL Anywhere, Oracle), and one of the tables has a blob column.

I have a need to duplicate a bunch of rows in the table. What would the best way of duplicating the blob data be?

Do I need to loop through all the rows individually like this:

  1. select the normal columns
  2. selectblob the blob column
  3. insert the normal columns
  4. updateblob the blob column

Or is there a simpler way? It must be generic enough to work on different database types.
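
In PowerScript, the per-row approach would look something like this (the table and column names — mytable, row_key, name, blobdata — are just examples):

long ll_old_key, ll_new_key
string ls_name
blob lb_data

// read one source row plus its blob
select name into :ls_name from mytable where row_key = :ll_old_key using sqlca;
selectblob blobdata into :lb_data from mytable where row_key = :ll_old_key using sqlca;

// write the copy, then attach the blob
insert into mytable (row_key, name) values (:ll_new_key, :ls_name) using sqlca;
updateblob mytable set blobdata = :lb_data where row_key = :ll_new_key using sqlca;
commit using sqlca;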

 

Brad Mettee Accepted Answer Pending Moderation
  1. Thursday, 16 November 2017 19:31 UTC
  2. PowerBuilder
  3. # 1

Not sure if this works in all databases, but I just tried it in ASA8 and it works fine. It should work everywhere, since it's just a slightly different way of inserting rows.

insert into tblname (col1, col2, row_key, blobdata)
  select col1, col2, :new_row_key, blobdata
  from tblname
  where row_key = :old_row_key
using sqlca;
// add code to check sqlca for success here
commit using sqlca;

Basically, let the database engine do all the work of duplicating the actual data, and not bring it back to PB at all.
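
For example, the check before the commit could be as simple as this sketch:

// check the INSERT ... SELECT before committing
if sqlca.sqlcode <> 0 then
    rollback using sqlca;
    MessageBox("Duplicate row", "Copy failed: " + sqlca.sqlerrtext)
else
    commit using sqlca;
end if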

 

Roland Smith Accepted Answer Pending Moderation
  1. Thursday, 16 November 2017 19:23 UTC
  2. PowerBuilder
  3. # 2

Thanks for the ideas.

I thought about it some more and I think I will probably be changing the primary key so that I don't need to have multiple copies. Poor database design by 2005 me.

 

Comment
  1. Brad Mettee
  2. Thursday, 16 November 2017 19:35 UTC
Congrats on a clean solution, Roland. Removing the need for duplicate data is always a good thing.

Mike S Accepted Answer Pending Moderation
  1. Thursday, 16 November 2017 18:01 UTC
  2. PowerBuilder
  3. # 3

I believe that is the fastest way, other than writing multiple stored procedures.

The size of the blob matters too, so you may want to check the size before retrieval if you have any larger blobs, and then retrieve/update them in chunks.

The select and update for blobs are the same across all 4 databases, but getting the size is different in Oracle (datalength vs dbms_lob.getlength).

If you use ODBC, you may have size limitations even smaller (8K) than other connection types (at least on SQL Server).
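
For example, getting the size needs DBMS-specific SQL. A sketch (table/column names and the Oracle check are illustrative; you'd derive lb_oracle from sqlca.dbms):

long ll_len
// lb_oracle would be set from sqlca.dbms (Oracle codes like "ORA", "O90";
// note "ODB" means ODBC, not Oracle)
boolean lb_oracle
if lb_oracle then
    select dbms_lob.getlength(blobdata) into :ll_len
      from mytable where row_key = :ll_key using sqlca;
else // SQL Server, ASE, SQL Anywhere
    select datalength(blobdata) into :ll_len
      from mytable where row_key = :ll_key using sqlca;
end if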

Miguel Leeuwe Accepted Answer Pending Moderation
  1. Thursday, 16 November 2017 17:20 UTC
  2. PowerBuilder
  3. # 4

Hi Roland,

I suppose you wouldn't want to use a stored procedure, which would be the fastest way?

I'm afraid that for the blob fields you would then have to do a selectblob + updateblob for every row (unless the blobs hold little data, I think less than 32 KB; then you could just use the datawindow, depending also on your database driver).

For duplicating the rest of each row's data (before the blob stuff), I'd retrieve the data with a datastore/DW, copy the "Data" property of that retrieved DW to another, and set the item status to NewModified! (make sure the primary keys are OK, or have a trigger that takes care of it).
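
A rough sketch of that copy (the DataWindow object name d_mytable is made up):

datastore lds_src, lds_dst
lds_src = create datastore
lds_dst = create datastore
lds_src.DataObject = "d_mytable"
lds_dst.DataObject = "d_mytable"
lds_src.SetTransObject(sqlca)
lds_dst.SetTransObject(sqlca)
lds_src.Retrieve()

// copy all primary-buffer rows in one shot
lds_dst.Object.Data = lds_src.Object.Data

// mark every copied row as new so Update() generates INSERTs
long ll_row
for ll_row = 1 to lds_dst.RowCount()
    lds_dst.SetItemStatus(ll_row, 0, Primary!, NewModified!)
next
// fix the primary keys here, then call lds_dst.Update()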

Hope it gives you an idea.


