Matt Kluzak
PowerBuilder
Tuesday, 30 August 2022 13:59 UTC

I support a system where inspectors use a PB2019R3 app to enter data into a Postgres 12 database; the app then generates SQL statements for the data and saves them to a text file. The inspectors use a second PB2019R3 app to populate a Sybase 15.7 database with the generated SQL statements. They then update their Postgres database with all inspectors' data via a third PB2019R3 app that pipes the data from Sybase to Postgres.

I recently updated two comment columns in two tables, on both the Sybase and Postgres sides, from varchar(255) to varchar(1000). I can successfully input data with up to 1000 characters into the updated columns in Postgres and then insert the generated SQL statements into Sybase. However, when trying to pipe it back to Postgres I get the error "Bind Parameter for value :9 is Too Big (2000)." The comment column is the 9th column.

I checked the data and verified it isn't over the 1000-character limit. I tried deleting 13 of the 1000 characters, and the error this time was "Bind Parameter for value :9 is Too Big (1976)."

Could someone indicate what's wrong and a possible solution?

Thanks

 

Accepted Answer
Matt Kluzak — Wednesday, 14 September 2022 13:17 UTC

I finally found the issue. In the Postgres entry in the ODBC Data Source Administrator, if I configure the data source and go to the "Datasource" options, "Max Varchar" was set to 255 by default. There is also a "Max LongVarChar" value; I assume the driver treats a value as a LongVarChar once it exceeds "Max Varchar".

Changing the "Max Varchar" from 255 to 1000 in ODBC Data Source Administrator solved my issue.
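For reference, the same settings can be changed outside the Windows GUI, either in an odbc.ini file or via connection-string keywords. A hedged sketch of the equivalent DSN entry (keyword names as used by the psqlODBC driver; the DSN name and connection details here are placeholders, not Matt's actual configuration):

```ini
; Hypothetical psqlODBC DSN entry in odbc.ini
[WM_LT_User]
Driver=PostgreSQL Unicode
Database=mydb
Servername=localhost
Port=5432
MaxVarcharSize=1000      ; GUI label "Max Varchar" (driver default is 255)
MaxLongVarcharSize=8190  ; GUI label "Max LongVarChar"
```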

Thanks to everyone for their input, and my apologies for all the trouble for such a trivial thing.

Comment
John Fauss — Wednesday, 14 September 2022 13:37 UTC
That's great news, Matt! Thank you for letting everyone know the solution to this vexing issue.
Arnd Schmidt — Wednesday, 31 August 2022 18:36 UTC

Hi Matt,

Did you already check your pipeline source code after the changes?

Via Edit Source you can quickly check if the new(!) column definitions are included.

regards

Arnd

Comment
Matt Kluzak — Wednesday, 31 August 2022 18:44 UTC
Hi Arnd,

When I made the changes, I modified the pipeline via the source code. In both the standard view and the source-code view I can see that the column reflects my change.
Matt Kluzak — Tuesday, 30 August 2022 20:59 UTC

Chris,

While debugging, I expanded the Postgres transaction object under the Global scope. This is what it shows:

[screenshot of the transaction object not preserved]

Comment
Matt Kluzak — Tuesday, 6 September 2022 12:58 UTC
Hi Chris,

I went to the PG community. Per developer Tom Lane, the error string I received doesn't even exist in the Postgres source code, so it isn't coming from Postgres.

Also, someone had me check the character set of the databases. The Sybase side is using:

Character Set = 1, iso_1
ISO 8859-1 (Latin-1) - Western European 8-bit character set.
Sort Order = 50, bin_iso_1
Binary ordering for the ISO 8859-1 (Latin-1) character set (iso_1).

The Postgres side is using UTF8.

They thought this might explain the doubling from 1000 to 2000 in the initial error message. Any thoughts on that possibility, or would that kick the ball over to Sybase/SAP?
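The doubling is at least consistent with a per-character byte-size difference between encodings: a string that fits varchar(1000) by character count occupies exactly twice as many bytes if some layer buffers it as UCS-2/UTF-16 (2 bytes per character) instead of a single-byte encoding like ISO 8859-1. A minimal illustration in plain Python (the 1000-character comment is hypothetical, not the actual inspection data):

```python
# A comment that fits varchar(1000) by character count can still
# occupy 2000 bytes if handled in a 2-byte-per-character encoding.
comment = "x" * 1000  # hypothetical 1000-character comment

print(len(comment.encode("latin-1")))    # 1000 bytes in ISO 8859-1
print(len(comment.encode("utf-16-le")))  # 2000 bytes in UTF-16 (2 bytes/char)
```

Whether the ODBC driver actually buffers parameters that way here is an assumption; it just matches the 1000-to-2000 arithmetic in the error message.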

Chris Pollach @Appeon — Tuesday, 6 September 2022 13:40 UTC
Hi Matt;

I am still perplexed that you got the pipeline working in the IDE vs. an EXE. There must be a difference between your development environment and production (or have I got the wrong impression?).

Please open an Appeon Support Ticket for this issue so that we can help you further to resolve this issue.

Regards... Chris
Matt Kluzak — Tuesday, 6 September 2022 13:50 UTC
I should add that somehow my pipeline got misconfigured to use the Postgres database for both the source and the destination. After correcting it, it no longer works in the IDE.

I'll submit a ticket.
Chris Pollach @Appeon — Tuesday, 30 August 2022 14:18 UTC

Hi Matt;

  Just a thought ... Can you try ...

SQLCA.DBParm = "DisableBind=1"

Regards ... Chris

Comment
Chris Pollach @Appeon — Tuesday, 30 August 2022 20:29 UTC
Yes, if your PB app uses an INI file in production to connect to Postgres, then just add "DisableBind=1" to the PG DBParm entry in the INI file.
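In practice the production INI entry might look something like the fragment below (the section and key names are assumptions about how the app reads its connection profile; the DSN and credentials are placeholders):

```ini
; Hypothetical [Database] section of the app's production INI
[Database]
DBMS=ODBC
DBParm=ConnectString='DSN=WM_LT_User;UID=user;PWD=password',DelimitIdentifier='Yes',DisableBind=1
```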
Matt Kluzak — Tuesday, 30 August 2022 20:40 UTC
It doesn't seem to be working. The string we currently use is (actual credentials edited out):

DBParm=ConnectString='DSN=WM_LT_User;UID=user;PWD=password',DelimitIdentifier='Yes'

I tried:

DBParm=ConnectString='DSN=WM_LT_User;UID=user;PWD=password',DelimitIdentifier='Yes',DisableBind=1

But that doesn't work either.

Chris Pollach @Appeon — Tuesday, 30 August 2022 20:49 UTC
That looks correct to me.

You might try this setting in your INI on your development PC, where you can use the IDE's debugger to double-check that everything is OK in the DBParm field of your PG Transaction Object.