1. Sivaprakash BKR
  2. PowerBuilder
  3. Tuesday, 1 November 2022 10:49 AM UTC

Hello,

Using PowerBuilder 2019 R3, SQLite, and the ODBC driver from http://www.ch-werner.de/sqliteodbc/.

Need to cast a computed column as date, but PB interprets it as a char column only. I tried casting it in the SQL, but in the DataWindow's column specification all those computed fields are interpreted as char(255). Any idea why?

The same issue occurs for numeric columns.

Is there anything I can do to make PB interpret them correctly?

Happiness Always
BKR Sivaprakash

 

Armeen Mazda @Appeon
  1. Thursday, 3 November 2022 13:22 PM UTC
  2. PowerBuilder
  3. # 1

If you're looking for a free database, why not use PostgreSQL, which is officially supported by PB?

Comment
  1. Sivaprakash BKR
  2. Friday, 4 November 2022 05:08 AM UTC
Thanks Armeen,

We use PostgreSQL now. In addition to that, we plan to support SQL Server and SQLite.
Roland Smith
  1. Wednesday, 2 November 2022 17:33 PM UTC
  2. PowerBuilder
  3. # 2

I did some testing where I added a view to a SQLite database that used the date and time functions to return parts from a datetime column. I used the driver from http://www.ch-werner.de/sqliteodbc/ which was last updated June 20, 2020.

I then ran my ODBC API example against it.

The column type id is reported as -9 (SQL_WVARCHAR).

The column type name is reported as an empty string.
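For reference, here is a minimal sketch of the kind of view described above (the table and column names are hypothetical, not from the actual test). Since SQLite's date and time functions return TEXT, the computed columns of such a view coming back as SQL_WVARCHAR is consistent with that:

// Sketch only: a SQLite view that extracts date/time parts from a datetime
// column using SQLite's built-in date and time functions.
// "orders", "order_id" and "order_ts" are hypothetical names.
String ls_sql

ls_sql = "CREATE VIEW v_order_parts AS " + &
         "SELECT order_id, " + &
         "       date(order_ts)           AS order_date, " + &
         "       time(order_ts)           AS order_time, " + &
         "       strftime('%Y', order_ts) AS order_year " + &
         "  FROM orders"

// Dynamic SQL Format 1: execute a statement that returns no result set.
EXECUTE IMMEDIATE :ls_sql USING SQLCA;

If SQLCA.SQLCode <> 0 Then
   MessageBox("Create View Failed", SQLCA.SQLErrText)
End If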

 

Chris Pollach @Appeon
  1. Wednesday, 2 November 2022 16:46 PM UTC
  2. PowerBuilder
  3. # 3

Hi Miguel;

  You are correct ... SQLite has never been a supported DBMS in any of the PS/Sybase/SAP/Appeon versions of PB, regardless of the driver (ODBC or JDBC). Thus, while you can try these access methods (JDBC is coming back in PB 2022 MR#1), they are very problematic in my varied experience. Basically, "use at your own risk" - especially for production applications with heavy DML. IMHO.

   If you look in the PBODB.INI file, you will not see any reference to SQLite either. The same goes for JDBC. In these cases, note that ODBC and JDBC are "middleware", so DBMS interaction is at "arm's length" from PB and any compiled PB app. That also means the PB DB interfaces generate ODBC or JDBC SQL - not the actual DML of the target DBMS. The middleware then converts this SQL to the DBMS-specific SQL (in this case, SQLite). On the return trip, the SQLite result set is converted back into an ODBC or JDBC result set, and that is what the PB ODBC/JDBC interface sees for both outbound and inbound DML operations. In these back-and-forth DML conversions, the middleware layer can easily misinterpret things at the expense of the PB app's execution. A native SQLite DB driver would be different, as it would deal with the SQLite DBMS directly and avoid these DML conversions / misinterpretations.

    The JDBC and ODBC DB drivers themselves are often the culprit, too, because they are frequently supplied by third-party or open-source software vendors. So again, relying on a non-vendor-supplied / supported DB client driver is often the cause of many headaches. I have seen this countless times before.

    Anyway, that's just my take on SQLite from my many failed experiences over a few decades trying to make this DBMS work in a more complex, schema-heavy application. However, it may be OK for very small "simplex" DBMS use. Personally, of course, I would go with SQL Server Express even for those simplex DBMS applications. Just my $0.02.

HTH

Regards ... Chris

 

Comment
  1. Miguel Leeuwe
  2. Thursday, 3 November 2022 02:20 AM UTC
Thanks for that Chris,

Maybe we are just lucky with JDBC and Tibero DB, but ODBC did not solve anything (like you say, only very basic stuff works). JDBC, however, works absolutely great.

Sivaprakash BKR
  1. Wednesday, 2 November 2022 06:23 AM UTC
  2. PowerBuilder
  3. # 4

Thanks John Fauss,

I slightly modified the query and added two more fields:
***********************************************
The query is

ls_sqlstatement = "Select 'Test Value' as field1, " + &
"cast('2022-11-01' as date) as field2, " + &
"cast('2022-11-01 09:36:00' as datetime) as field3, " + &
"cast(null as time) as field4, " + &
"cast(1.2345 as float) as field5, " + &
"'2022-11-01 09:35:00' as field6, " + &
"3.444 as field7 " + &
"From Departments "
Here is the result:
Field#01: Type = String   Value: Test Value
Field#02: Type = Long     Value: 2022
Field#03: Type = Long     Value: 2022
Field#04: Type = String   Value: (null)
Field#05: Type = Double   Value: 1.2345
Field#06: Type = String   Value: 2022-11-01 09:35:00
Field#07: Type = Double   Value: 3.444

Btw, my reported issue is NOT there when we select a field from a database table; it occurs only for computed column(s). Every computed column's datatype comes through as char(255), irrespective of whatever we try to cast it to.

Without the actual datatype(s) for the computed columns, it might be hard to validate the input(s) for those columns.
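Not a fix for the type description itself, but one possible stopgap (a sketch only; the DataWindow control and column names below are hypothetical) is to accept the char(255) value the driver reports and convert/validate it client-side:

// Sketch: the driver describes the computed column as char(255), so fetch it
// as a string and convert it in PowerScript before validating.
// "dw_report" (a DataWindow control) and "field2" are hypothetical names.
String ls_raw
Date   ld_value

ls_raw = dw_report.GetItemString(1, "field2")

If IsNull(ls_raw) Then
   SetNull(ld_value)
ElseIf IsDate(Left(ls_raw, 10)) Then
   ld_value = Date(Left(ls_raw, 10))   // "YYYY-MM-DD ..." -> Date
Else
   MessageBox("Validation", "field2 does not contain a valid date: " + ls_raw)
End If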

 

Roland Smith
  1. Tuesday, 1 November 2022 21:01 PM UTC
  2. PowerBuilder
  3. # 5

I wonder if adding a section to pbodb.ini would help.

John Fauss
  1. Tuesday, 1 November 2022 15:59 PM UTC
  2. PowerBuilder
  3. # 6

Here's some code you can use to determine what PB is "seeing" from the ODBC (or any other) DB driver.

Replace the supplied SELECT statement assignment at the beginning with a SELECT statement that is valid for your DBMS:

Integer li_response
Long    ll_rc, ll_row, ll_index
String  ls_sqlstatement, ls_datatype, ls_value, ls_name, ls_msg

// Define the SQL SELECT statement.

// Note: The supplied SELECT statement is valid for SQL Server... 
//       Code a SELECT statement that is appropriate for your DBMS.
ls_sqlstatement = "Select 'Test Value' as field1, " + &
   "cast('2022-11-01' as date) as field2, " + &
   "cast('2022-11-01 09:36:00' as datetime) as field3, " + &
   "cast(null as time) as field4, " + &
   "cast(1.2345 as float) as field5 " + &
   "From (Select a=1) t"

// Declare a cursor that will examine each row.
DECLARE c_values_cursor	DYNAMIC CURSOR FOR SQLSA;

// Obtain information about the dynamic SQL statement.
PREPARE SQLSA FROM :ls_sqlstatement USING SQLCA;
DESCRIBE SQLSA INTO SQLDA;

// Execute the SELECT statement by opening the cursor.
OPEN DYNAMIC c_values_cursor USING DESCRIPTOR SQLDA;

// Fetch the first row of the result set.
FETCH c_values_cursor USING DESCRIPTOR SQLDA;
ll_rc  = SQLCA.SQLCode
ll_row = 0

// Obtain the descriptor and value for each column in the result set.
Do While SQLCA.SQLCode = 0
   ll_row ++
   ls_msg = ""
   
   For ll_index = 1 To SQLDA.NumOutputs
      Choose Case SQLDA.OutParmType[ll_index]
         Case TypeBoolean!
            ls_datatype = "Boolean"
            ls_value    = String(GetDynamicNumber(SQLDA,ll_index))
            If IsNull(ls_value) Then
               ls_value = "(null)"
            ElseIf ls_value = "0" Then
               ls_value = "False"
            Else
               ls_value = "True"
            End If
         Case TypeByte!
            ls_datatype = "Byte"
            ls_value    = String(GetDynamicNumber(SQLDA,ll_index))
            If IsNull(ls_value) Then ls_value = "(null)"
         Case TypeDate!
            ls_datatype = "Date"
            ls_value    = String(GetDynamicDate(SQLDA,ll_index))
            If IsNull(ls_value) Then ls_value = "(null)"
         Case TypeDateTime!
            ls_datatype = "Datetime"
            ls_value    = String(GetDynamicDateTime(SQLDA,ll_index))
            If IsNull(ls_value) Then ls_value = "(null)"
         Case TypeDecimal!
            ls_datatype = "Decimal"
            ls_value    = String(GetDynamicNumber(SQLDA,ll_index))
            If IsNull(ls_value) Then ls_value = "(null)"
         Case TypeDouble!
            ls_datatype = "Double"
            ls_value    = String(GetDynamicNumber(SQLDA,ll_index))
            If IsNull(ls_value) Then ls_value = "(null)"
         Case TypeInteger!
            ls_datatype = "Integer"
            ls_value    = String(GetDynamicNumber(SQLDA,ll_index))
            If IsNull(ls_value) Then ls_value = "(null)"
         Case TypeLong!
            ls_datatype = "Long"
            ls_value    = String(GetDynamicNumber(SQLDA,ll_index))
            If IsNull(ls_value) Then ls_value = "(null)"
         Case TypeLongLong!
            ls_datatype = "LongLong"
            ls_value    = String(GetDynamicNumber(SQLDA,ll_index))
            If IsNull(ls_value) Then ls_value = "(null)"
         Case TypeReal!
            ls_datatype = "Real"
            ls_value    = String(GetDynamicNumber(SQLDA,ll_index))
            If IsNull(ls_value) Then ls_value = "(null)"
         Case TypeString!
            ls_datatype = "String"
            ls_value    = GetDynamicString(SQLDA,ll_index)
            If IsNull(ls_value) Then ls_value = "(null)"
         Case TypeTime!
            ls_datatype = "Time"
            ls_value    = String(GetDynamicTime(SQLDA,ll_index))
            If IsNull(ls_value) Then ls_value = "(null)"
         Case TypeUInt!
            ls_datatype = "UInt"
            ls_value    = String(GetDynamicNumber(SQLDA,ll_index))
            If IsNull(ls_value) Then ls_value = "(null)"
         Case TypeULong!
            ls_datatype = "ULong"
            ls_value    = String(GetDynamicNumber(SQLDA,ll_index))
            If IsNull(ls_value) Then ls_value = "(null)"
         Case Else
            ls_datatype = "Unknown"
            ls_value    = "????"
      End Choose
      
      If ls_msg <> "" Then ls_msg += "~r~n"
      ls_msg += "Field #" + String(ll_index,"00") + ":  Type = " + &
                ls_datatype + "~tValue: " + ls_value
   Next
   
   ls_msg += "~r~n~r~nContinue with the next row?"
   
   li_response = MessageBox("Row Number " + String(ll_row), &
                            ls_msg,Question!,YesNoCancel!,1)
   
   If li_response = 1 Then
      FETCH c_values_cursor USING DESCRIPTOR SQLDA;
      ll_rc = SQLCA.SQLCode
   Else
      // Exit the Fetch loop.
      SQLCA.SQLCode = 999
   End If
Loop

// Close the cursor before leaving.
CLOSE c_values_cursor;

Return ll_rc
Comment
  1. Miguel Leeuwe
  2. Tuesday, 1 November 2022 16:10 PM UTC
Wow!

:)

Miguel Leeuwe
  1. Tuesday, 1 November 2022 15:40 PM UTC
  2. PowerBuilder
  3. # 7

Appeon does not support SQLite over ODBC. For this very reason you'll find lots and lots of problems.

If you're not using PB2022, you might have better luck using a JDBC driver for SQLite.

I think you are on PB2019, so it's worth giving it a try.

(I've been told that Appeon will bring back JDBC connectivity somewhere around the end of next year, but for now PB2022 no longer supports it.)

regards.

Comment
  1. Armeen Mazda @Appeon
  2. Wednesday, 2 November 2022 21:53 PM UTC
Hi Miguel, R2 will have it as well. But the team has decided to roll it out now so customers can try it and we can make any adjustments for R2 if needed. So please try it, and if there are issues, open a support ticket. Thanks!
  1. Miguel Leeuwe
  2. Thursday, 3 November 2022 02:28 AM UTC
Thanks Armeen,

Meanwhile I found the email that we got from Julie and now remember why I concluded it would be back in R2:

"

Point 1 is the immediate action and can be implemented in the coming MR. But in this MR, it won’t be possible to build JDBC database connection in the IDE > Database Profile, so we suggest that if you run the application from IDE, you have to use the ODBC database connection as a workaround. And at the same time, you can set up JDBC database connection in scripts for runtime.



Point 2 is the action for the 2nd phase. If you want to have JDBC database connection in the IDE for the development stage, you will need to wait till the 2nd phase (next year, 2022 R2).

"



The reason I concluded that it wouldn't be available until R2 is that, if what Julie says is true ("you cannot connect when running from the IDE"), that would mean we cannot do any debugging or normal runs from the IDE. For every change we'd have to build an executable to be able to test it, and for debugging we'd have to introduce message boxes. Appeon's suggestion of using ODBC for that is a very bad one, since nothing works well with ODBC for us.

So I guess we're going to wait.

I WILL download and install 2022 on a VM though and see if things at least seem to be working well when running an executable.

regards.
  1. Miguel Leeuwe
  2. Thursday, 3 November 2022 02:34 AM UTC
Regarding point 1: if the limitation is only not being able to create a database profile, then that's NOT a problem, but it states: "... if you RUN the application from the IDE ..."
Andreas Mykonios
  1. Tuesday, 1 November 2022 14:37 PM UTC
  2. PowerBuilder
  3. # 8

Hi.

I also don't have much experience with SQLite, but I want to add to John's answer the following info from the SQLite documentation (Datatypes In SQLite):

2.2. Date and Time Datatype

SQLite does not have a storage class set aside for storing dates and/or times. Instead, the built-in Date And Time Functions of SQLite are capable of storing dates and times as TEXT, REAL, or INTEGER values:

  • TEXT as ISO8601 strings ("YYYY-MM-DD HH:MM:SS.SSS").
  • REAL as Julian day numbers, the number of days since noon in Greenwich on November 24, 4714 B.C. according to the proleptic Gregorian calendar.
  • INTEGER as Unix Time, the number of seconds since 1970-01-01 00:00:00 UTC.

Applications can choose to store dates and times in any of these formats and freely convert between formats using the built-in date and time functions.

As for computed columns, I see the following (Generated Columns (sqlite.org)):

1. Introduction

Generated columns (also sometimes called "computed columns") are columns of a table whose values are a function of other columns in the same row. Generated columns can be read, but their values can not be directly written. The only way to change the value of a generated column is to modify the values of the other columns used to calculate the generated column.

I guess that PowerBuilder isn't aware of the way SQLite internally handles computed columns. But I may be wrong.
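As a small illustration of those three storage classes (an example constructed for this post, not taken from the SQLite docs), the statement below produces the same instant as TEXT, REAL, and INTEGER using only built-in functions; feeding it to John's diagnostic routine above would show how the ODBC driver describes each form:

// Sketch: the same timestamp expressed in SQLite's three date/time storage
// forms, using only built-in SQLite functions (no table is required).
String ls_sql

ls_sql = "SELECT datetime('2022-11-01 09:36:00') AS as_text, " + &
         "julianday('2022-11-01 09:36:00') AS as_real, " + &
         "CAST(strftime('%s','2022-11-01 09:36:00') AS INTEGER) AS as_integer"

// Pass ls_sql to John's dynamic-SQL diagnostic routine (answer # 6) to see
// how the driver reports the type of each of the three representations.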

Andreas.

John Fauss
  1. Tuesday, 1 November 2022 13:42 PM UTC
  2. PowerBuilder
  3. # 9

When PB asks the database "under the covers" for a description of the query's result set, it appears that the ODBC driver is telling PB that the column(s) in question are strings.

This is purely a guess on my part, since I don't use SQLite or ODBC, but I suspect it may be due to the antiquated ODBC driver. A quick look at the "online documentation" provided at the URL you supplied turned up this (emphasis mine):

Restrictions of data type mapping:

- Integer and floating point columns in the database are reported
  as NULLs when no digit seen in the column, otherwise all digits
  up to end of string or non-digit are interpreted as the value,
  i.e. '10blurk' is ten, '0blurk' is zero, but 'blurk' is NULL.
- Format for SQL_DATE is YYYY-MM-DD or YYYYMMDD
- Format for SQL_TIME is hh:mm:ss or hhmmss
- Format for SQL_TIMESTAMP is
      YYYYMMDDhhmmss[fraction]
  or  YYYY-MM-DD hh:mm:ss[.fraction]
  or  hh:mm:ss[.fraction] YYYY-MM-DD
  The fractional part is expressed as 1E-09 seconds
- The driver puts the ODBC string representations for date/time,
  (eg for "{ts '2001-10-10 12:58:00'}" the substring within the
  single quotes) directly into the SQLite column

Also, your question states you are trying to obtain a date, yet the SQL snippet shows the CAST datatype as datetime.

The documentation appears to be from around 2001, based solely on my very quick glance, so I cannot help but wonder if you should try a newer, alternative ODBC driver for SQLite. Perhaps, however, it is only the documentation and web site that are so greatly out of date.

Best regards, John
