1. Daryl Foster
  2. PowerBuilder
  3. Thursday, 17 October 2019 07:12 AM UTC

Hi Everyone,

I've got a question about how people are creating complex JSON for real-world applications. I'm updating an application to use a third party's REST API, and some of the API calls require reasonably complex JSON to be sent. The JSON may have embedded objects and/or arrays. An example looks something like this:


{
  "yourReference": "123456",
  "amendments": {
    "drawings": [
      {
        "identifier": "86073b67-60b1-4495-9001-7ea650b7e63e",
        "scope": "REPLACE"
      }
    ],
    "tableOfContents": [
      {
        "identifier": "86073b67-60b1-4495-9001-7ea650b7e63e",
        "scope": "CLEAN"
      }
    ],
    "index": [
      {
        "identifier": "86073b67-60b1-4495-9001-7ea650b7e63e",
        "scope": "REPLACE"
      }
    ],
    "description": [
      {
        "identifier": "86073b67-60b1-4495-9001-7ea650b7e63e",
        "scope": "MARKED_UP"
      },
      {
        "identifier": "86073b67-60b1-4495-9001-7ea650b7e63e",
        "scope": "CLEAN"
      }
    ],
    "abstract": [
      {
        "identifier": "86073b67-60b1-4495-9001-7ea650b7e63e",
        "scope": "MARKED_UP"
      }
    ]
  },
  "summaryDocument": {
    "identifier": "86073b67-60b1-4495-9001-7ea650b7e63e"
  },
  "submittedWithPayment": false
}


For the calls that require simple JSON I've been using either the export function from a DataStore or the MSSQL JSON functions to generate it. For something more complex I'm looking at using JSONGenerator. A simple static example to generate the JSON above is:


JSONGenerator lnv_JsonGenerator
integer li_root
integer li_amendments
integer li_summary
integer li_array
integer li_document
string ls_json

lnv_JsonGenerator = create JSONGenerator

li_root = lnv_JsonGenerator.CreateJsonObject()

lnv_JsonGenerator.AddItemString(li_root, "yourReference", "123456")
li_amendments = lnv_JsonGenerator.AddItemObject(li_root, "amendments")

li_array = lnv_JsonGenerator.AddItemArray(li_amendments, "drawings")
li_document = lnv_JsonGenerator.AddItemObject(li_array)
lnv_JsonGenerator.AddItemString(li_document, "identifier", "86073b67-60b1-4495-9001-7ea650b7e63e")
lnv_JsonGenerator.AddItemString(li_document, "scope", "REPLACE")

li_array = lnv_JsonGenerator.AddItemArray(li_amendments, "tableOfContents")
li_document = lnv_JsonGenerator.AddItemObject(li_array)
lnv_JsonGenerator.AddItemString(li_document, "identifier", "86073b67-60b1-4495-9001-7ea650b7e63e")
lnv_JsonGenerator.AddItemString(li_document, "scope", "CLEAN")

li_array = lnv_JsonGenerator.AddItemArray(li_amendments, "index")
li_document = lnv_JsonGenerator.AddItemObject(li_array)
lnv_JsonGenerator.AddItemString(li_document, "identifier", "86073b67-60b1-4495-9001-7ea650b7e63e")
lnv_JsonGenerator.AddItemString(li_document, "scope", "REPLACE")

li_array = lnv_JsonGenerator.AddItemArray(li_amendments, "description")
li_document = lnv_JsonGenerator.AddItemObject(li_array)
lnv_JsonGenerator.AddItemString(li_document, "identifier", "86073b67-60b1-4495-9001-7ea650b7e63e")
lnv_JsonGenerator.AddItemString(li_document, "scope", "MARKED_UP")
li_document = lnv_JsonGenerator.AddItemObject(li_array)
lnv_JsonGenerator.AddItemString(li_document, "identifier", "86073b67-60b1-4495-9001-7ea650b7e63e")
lnv_JsonGenerator.AddItemString(li_document, "scope", "CLEAN")

li_array = lnv_JsonGenerator.AddItemArray(li_amendments, "abstract")
li_document = lnv_JsonGenerator.AddItemObject(li_array)
lnv_JsonGenerator.AddItemString(li_document, "identifier", "86073b67-60b1-4495-9001-7ea650b7e63e")
lnv_JsonGenerator.AddItemString(li_document, "scope", "MARKED_UP")

li_summary = lnv_JsonGenerator.AddItemObject(li_root, "summaryDocument")
lnv_JsonGenerator.AddItemString(li_summary, "identifier", "86073b67-60b1-4495-9001-7ea650b7e63e")
lnv_JsonGenerator.AddItemBoolean(li_root, "submittedWithPayment", false)

ls_json = lnv_JsonGenerator.GetJsonString()

destroy lnv_JsonGenerator


I know I can tidy that code up and maybe use loops or functions to create the document arrays, but is this the general method for creating complex JSON? Does anyone have any simpler suggestions? There will be quite a few different API calls I need to make, and many of them require complex JSON to be generated, so I want to make sure I'm on the right track before progressing too far. Obviously the example above just uses static data; the application will be getting the required data from a database, probably via one or more datastores.
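For example, I could wrap the repeated lines in a small helper, something along these lines (of_add_document is just a made-up name, and passing the generator in as an argument is only one way to structure it):

// Hypothetical helper: adds one {identifier, scope} object to an
// existing JSON array item of the passed-in generator.
public subroutine of_add_document (JSONGenerator anv_gen, long al_array, string as_identifier, string as_scope)
	long ll_doc
	ll_doc = anv_gen.AddItemObject(al_array)   // object element inside the array
	anv_gen.AddItemString(ll_doc, "identifier", as_identifier)
	anv_gen.AddItemString(ll_doc, "scope", as_scope)
end subroutine

// The "description" block above would then collapse to:
li_array = lnv_JsonGenerator.AddItemArray(li_amendments, "description")
of_add_document(lnv_JsonGenerator, li_array, "86073b67-60b1-4495-9001-7ea650b7e63e", "MARKED_UP")
of_add_document(lnv_JsonGenerator, li_array, "86073b67-60b1-4495-9001-7ea650b7e63e", "CLEAN")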


Any suggestions would be welcome.

Ken Guo @Appeon
  1. Friday, 18 October 2019 05:29 AM UTC
  2. PowerBuilder
  3. # 1

Hi Daryl,


Currently, we can only use code like yours to generate complex JSON.

As far as I know, a PB customer is writing a JSON NVO that encapsulates some of the JSON parsing functionality. It will make JSON more convenient to use and is more powerful.

Once he finishes it, I will share it with you.

Regards,

Ken


  1. Daryl Foster
  2. Sunday, 20 October 2019 23:26 UTC
Thanks Ken
  1. Ken Guo @Appeon
  2. Tuesday, 22 October 2019 02:19 AM UTC
Hi Daryl,



A PB customer has finished developing the JSON NVO and has uploaded it to GitHub. Please download it and give it a try.

If you run into any issues with this JSON NVO, you can report them on GitHub.

Code:

https://github.com/informaticon/inc.win.base.pb-json



Description:

https://github.com/informaticon/inc.win.base.pb-json/wiki



Regards,

Ken

Daryl Foster
  1. Friday, 18 October 2019 05:06 AM UTC
  2. PowerBuilder
  3. # 2

Thanks Michael,


The import of the JSON is fine. Most of the JSON returned from these APIs is pretty straightforward and can be imported directly into a DataStore. My issue is generating complex JSON with nested objects and arrays to send to the APIs.
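For those simple cases it is basically a one-liner, something like this (lds_data is just a placeholder DataStore whose column names line up with the JSON keys, and ls_response holds the returned JSON):

// Straightforward array-of-objects responses import directly.
long ll_rows
ll_rows = lds_data.ImportJson(ls_response)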

I agree we had similar issues with XML, but the way DataWindows generate XML seems to be much more sophisticated and flexible than the way they generate JSON. The XML templates let you configure the output to include or exclude columns, add some logic to the XML generation, and export the XML of nested reports to build quite complex XML documents. I see a reference in the ExportJson documentation to exporting the data from DataWindow children, but I can't seem to get that to work. Another issue is that ExportJson always seems to export an array. And I can't see a way to create boolean values in the exported JSON.

I could create multiple DataStores for the various parts of my data, export individual pieces of JSON for the various objects and arrays, and try to stitch them together, but that seems like more work than just using the JSONGenerator directly.
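To show what I mean by stitching, it would be something like the sketch below. lds_drawings and lds_toc are hypothetical DataStores that have already retrieved their rows, and since ExportJson() returns a JSON array of row objects, each fragment has to be spliced in under its key by plain string concatenation:

// Stitch individually exported fragments into one JSON body.
string ls_json
ls_json = '{"amendments": {"drawings": ' + lds_drawings.ExportJson() + &
          ', "tableOfContents": ' + lds_toc.ExportJson() + '}}'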

I'm curious whether anyone else is generating complex JSON, or whether most real-world uses are either simple or specific to the DataWindow JSON exchange.

Michael Kramer
  1. Thursday, 17 October 2019 11:20 AM UTC
  2. PowerBuilder
  3. # 3

Sorry, but it really depends!

I would prefer to import/export JSON via DataStores (read: DataWindow technology) to reduce the amount of explicit coding. However ==> DataWindow is based on table-structured result sets, i.e. arrays of similarly structured objects. Sometimes you need to preprocess the arriving JSON before the format fits DataWindow's ImportJSON.

If the inner data structure of the JSON varies, then DataWindow technology doesn't really fit. You will need significant pre/post processing, or you will need to code explicit interpretation of the received JSON.
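For the explicit-interpretation case, the JSONParser object can walk the structure by hand. A rough sketch only (d_documents and its two columns are invented for the example, and the incoming JSON is assumed to have the document array at its root):

// Walk a "drawings" array in received JSON by hand and load it into a DataStore.
JSONParser lnv_parser
datastore lds_documents
long ll_root, ll_array, ll_item, ll_row, ll_i

lnv_parser = create JSONParser
lnv_parser.LoadString(ls_incoming_json)    // the JSON received from the API

ll_root  = lnv_parser.GetRootItem()
ll_array = lnv_parser.GetItemArray(ll_root, "drawings")

lds_documents = create datastore
lds_documents.DataObject = "d_documents"   // hypothetical DataWindow object

for ll_i = 1 to lnv_parser.GetChildCount(ll_array)
	ll_item = lnv_parser.GetChildItem(ll_array, ll_i)
	ll_row  = lds_documents.InsertRow(0)
	lds_documents.SetItem(ll_row, "identifier", lnv_parser.GetItemString(ll_item, "identifier"))
	lds_documents.SetItem(ll_row, "scope", lnv_parser.GetItemString(ll_item, "scope"))
next

destroy lnv_parser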

Back in PB 9 the new features were exception handling and XML processing. We had EXACTLY the same types of challenges with XML then as you have now with JSON. JSON is simply a different format; the challenges are the same.

What I did back in PB 9 - PB 11.5, when I did A LOT of complex XML import/export ==>

  1. Analyze the import/export formats. Typically each package in a complex data stream has multiple data levels nested into each other.
  2. When the nested data structure fits an existing DataWindow import ==> simply use dw.Import.
  3. When each level in the nested data structure fits a separate table layout ==> simply import each level into a separate DataStore.
  4. At post-processing, regard the set of DataStores as one combined imported dataset.
  5. Similar but in reverse order for Export (see the sketch after this list).
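And for step 5, export is the same loop in reverse: fill the JSONGenerator arrays from DataStore rows instead of hard-coding each document. Reusing the variable names from Daryl's example (lds_documents is again invented):

// Build the "drawings" array from DataStore rows.
long ll_row, ll_array, ll_doc

ll_array = lnv_JsonGenerator.AddItemArray(li_amendments, "drawings")
for ll_row = 1 to lds_documents.RowCount()
	ll_doc = lnv_JsonGenerator.AddItemObject(ll_array)
	lnv_JsonGenerator.AddItemString(ll_doc, "identifier", lds_documents.GetItemString(ll_row, "identifier"))
	lnv_JsonGenerator.AddItemString(ll_doc, "scope", lds_documents.GetItemString(ll_row, "scope"))
next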

I often experienced (like in your example) that the nested data structure is the same for multiple upper-level groups. That just means you can reuse the DataWindow object and its related NVOs, so total code size reduces significantly.

Best advice from years of experience: Massage your import/export streams to leverage standard JSON/XML functionality as much as practically possible. That takes less time to develop, and it will run faster than explicitly coding all of the import/export. Look at the massaging code as in/out filters in audio/video, or adapters in OO design patterns.

HTH /Michael
