1. Glenn Scamman
  2. PowerBuilder
  3. Tuesday, 8 August 2023 17:00 UTC

Hello,

I'm using PB 2021 Build 1509, and we use ASP.NET Core 3.1 REST APIs for all data access. Those were built with SnapDevelop and .NET DataStores. This has been working relatively well until recently, when several users started trying to query larger datasets from the database. I am aware of the 20 GB and 100,000-record limits on the RestClient methods, but I'm having issues with result sets far smaller than that.

When I use the Retrieve method, such as...

     ll_rc1 = gnv_app.inv_RestClient.Retrieve( dw_1, ls_Url, ls_Json )

I can usually get 20,000 to 30,000 records, but anything higher and the PB client application usually crashes silently, with no error messages. Running under the debugger crashes as well.

If I change to using the SendPostRequest() method, such as...

     ll_rc1 = gnv_app.inv_RestClient.SendPostRequest(ls_url, ls_json, ls_response)

I can only get about 3,000 records; anything higher and it returns a -14 "Code Conversion Failed" error. I am setting the request's accept-encoding header to 'gzip', and only gzip, so I don't think compression/decompression is the issue.
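
For reference, the header is set roughly like this before the call (a simplified sketch using RestClient's SetRequestHeader):

     gnv_app.inv_RestClient.SetRequestHeader("Accept-Encoding", "gzip")
     ll_rc1 = gnv_app.inv_RestClient.SendPostRequest(ls_url, ls_json, ls_response)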

Now, I could be pushing the 20 GB size limit, but I highly doubt it, especially with SendPostRequest. I only started using that method to see whether I could manage large datasets by breaking the return into smaller batches, using some sort of stateful design and multiple requests to the server. But getting only 3,000 records at a time is not what I was hoping for.

Does anyone have suggestions on how to get large datasets back with a single request (which would be ideal), or at least a fairly simple way to turn one retrieve into multiple smaller requests and merge the data into a DataWindow seamlessly?

Regards,

Glenn

Glenn Scamman
  1. Tuesday, 8 August 2023 21:07 UTC
  2. PowerBuilder
  3. # 1

Note: I also just tried sending the results back in several smaller chunks, using three separate JSON blocks, but it still threw a -14 after trying to return about 6,000 or 7,000 records. It is starting to look like multiple requests might be the only way to get back a large number of records, unless something sneaky is causing this "code conversion failed" error rather than just the number of records or the size of the data.

    public IDataPacker MyRetrieve(string newDwSql)
    {
        // Requires "using System;" and "using System.Text;" in addition to the
        // SnapObjects.Data namespaces for DataPacker/DataStore.
        var packer = new DataPacker();
        var dataStore = new DataStore<PropertyReviewGrid>(_dataContext);
        var dataStore2 = new DataStore<PropertyReviewGrid>(_dataContext);
        var dataStore3 = new DataStore<PropertyReviewGrid>(_dataContext);
        long totalRows;

        // 05/25/22 GOES - I've started base64 encoding the sql string to get past
        // the Azure WAF sql-injection rules and make the APIs more secure.
        byte[] data = Convert.FromBase64String(newDwSql);
        string decodedSql = Encoding.UTF8.GetString(data);

        dataStore.SetSqlSelect(decodedSql);
        dataStore.Retrieve(new object[] { });

        totalRows = dataStore.RowCount;
        packer.AddValue("Total Count", totalRows);

        if (totalRows <= 2500)
        {
            packer.AddDataStore("First 2500 Batch", dataStore);
        }

        if (totalRows > 2500)
        {
            // Move the second block of 2500 rows to the (empty) dataStore2.
            // Moved rows are removed from the source, so the rows remaining
            // in dataStore renumber after this call.
            dataStore.RowsMove(2500, 4999, DwBuffer.Primary, dataStore2, 999, DwBuffer.Primary);
            packer.AddDataStore("First 2500 Batch", dataStore);
            packer.AddDataStore("Second 2500 Batch", dataStore2);
        }

        if (totalRows > 5000)
        {
            // Because the move above renumbered the source, the third block now
            // starts at row 2500 again (not 5000).
            dataStore.RowsMove(2500, 4999, DwBuffer.Primary, dataStore3, 999, DwBuffer.Primary);
            packer.AddDataStore("Third 2500 Batch", dataStore3);
        }
        // Note: rows beyond the third block are never moved out of dataStore.

        packer.AddValue("Error Message", "Ok");

        return packer;
    }
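
For completeness, the client side unpacks those keys and merges them into the DataWindow roughly like this (a sketch, assuming JSONPackage's LoadString/GetValue/ContainsKey calls and that DataWindow ImportJson appends to the existing rows):

     // Sketch: unpack the DataPacker keys from the response and merge into dw_1.
     JSONPackage lnv_package
     lnv_package = create JSONPackage

     lnv_package.LoadString(ls_response)

     // ImportJson appends rows, so the batches accumulate in dw_1.
     dw_1.ImportJson(lnv_package.GetValue("First 2500 Batch"))

     if lnv_package.ContainsKey("Second 2500 Batch") then
        dw_1.ImportJson(lnv_package.GetValue("Second 2500 Batch"))
     end if

     if lnv_package.ContainsKey("Third 2500 Batch") then
        dw_1.ImportJson(lnv_package.GetValue("Third 2500 Batch"))
     end if

     destroy lnv_package
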
Logan Liu @Appeon
  1. Sunday, 13 August 2023 18:02 UTC
  2. PowerBuilder
  3. # 2

Hi Glenn,

Your issue may be caused by a limitation of HttpClient/RestClient when processing large JSON data, so the key is to reduce the size of the JSON in each response.

I suggest you consider making several calls, page by page.

E.g., use RetrieveByPage on the server side: IDataStoreBases.RetrieveByPageAsync Method (appeon.com)

Then extend your custom Retrieve method around RestClient to get all the data into a temporary DataStore page by page, and import the data from the temporary DataStore into the real DataWindow. A rough sketch follows.
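
A PowerScript sketch of that client-side loop, assuming a hypothetical paged endpoint that reads pageIndex/pageSize from the request JSON and returns each page as DataWindow JSON under a "data" key (those names are illustrative, not from your code):

     // Sketch only: ls_url points to a hypothetical paged endpoint built on
     // RetrieveByPage; the "data" key and the stop condition are illustrative.
     long ll_page = 1, ll_pagesize = 2500, ll_imported
     string ls_request, ls_response
     JSONPackage lnv_package
     lnv_package = create JSONPackage

     do while true
        ls_request = '{"pageIndex": ' + string(ll_page) + ', "pageSize": ' + string(ll_pagesize) + '}'
        if gnv_app.inv_RestClient.SendPostRequest(ls_url, ls_request, ls_response) < 0 then exit
        lnv_package.LoadString(ls_response)
        ll_imported = dw_1.ImportJson(lnv_package.GetValue("data")) // appends rows
        if ll_imported < ll_pagesize then exit // last (partial) page reached
        ll_page = ll_page + 1
     loop

     destroy lnv_package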

Regards, Logan
