Hello,
I'm using PB2021 Build 1509, and we use ASP.NET Core 3.1 REST APIs for all data access; they were built with SnapDevelop and .NET DataStores. This has been working relatively well until recently, when several users started querying larger datasets from the database. I am aware of the RESTClient limits of 20GB and 100,000 records, but I'm running into problems on result sets far smaller than that.
When I use the Retrieve method, such as...
ll_rc1 = gnv_app.inv_RestClient.Retrieve( dw_1, ls_Url, ls_Json )
I can usually get 20,000 to 30,000 records, but anything higher and the PB client application usually crashes silently, with no error message. It crashes under the debugger as well.
If I change to using the SendPostRequest() method, such as...
ll_rc1 = gnv_app.inv_RestClient.SendPostRequest(ls_url, ls_json, ls_response)
I can only get about 3,000 records; anything higher and it returns a -14 "Code Conversion Failed" error. I am setting the accept-encoding request header to 'gzip' (and only gzip), so I don't think compression/decompression is the issue.
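For reference, here is roughly how the header is being set before the call (a sketch; inv_RestClient is assumed to be a wrapper around PowerBuilder's standard RestClient object):

```
// Assumption: inv_RestClient exposes the underlying RestClient object's methods
// Request gzip, and only gzip, so the server's compression choice is unambiguous
gnv_app.inv_RestClient.SetRequestHeader("Accept-Encoding", "gzip")
gnv_app.inv_RestClient.SetRequestHeader("Content-Type", "application/json")

ll_rc1 = gnv_app.inv_RestClient.SendPostRequest(ls_url, ls_json, ls_response)
```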
Now, I could be hitting the 20GB size limit, but I highly doubt it, especially with SendPostRequest. I only started using it to see whether I could manage large result sets by breaking them into smaller batches, using some sort of stateful design and multiple requests to the server. But only getting 3,000 records at a time is not what I was hoping for.
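In case it helps frame the question, the batching approach I have in mind would look roughly like this (a sketch only; the pageIndex/pageSize URL parameters and the dw_batch DataWindow are hypothetical and assume the service supports paged queries):

```
// Hypothetical paged retrieve: pull a batch per request, append into dw_1
// Assumes the REST service accepts pageIndex/pageSize query parameters
long ll_rc
long ll_page = 1
long ll_pagesize = 3000  // staying under the size that currently fails

dw_1.Reset()
do
    ls_Url = "https://myserver/api/orders/retrieve?pageIndex=" + String(ll_page) + &
             "&pageSize=" + String(ll_pagesize)
    ll_rc = gnv_app.inv_RestClient.Retrieve(dw_batch, ls_Url, ls_Json)
    if ll_rc < 0 then exit

    // Append this batch's rows to the main DataWindow
    dw_batch.RowsCopy(1, dw_batch.RowCount(), Primary!, dw_1, dw_1.RowCount() + 1, Primary!)
    ll_page = ll_page + 1
loop while dw_batch.RowCount() = ll_pagesize
```

This would need the server-side DataStore method to honor the paging arguments, and a sort on a stable key so pages don't overlap between requests.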
Does anyone have suggestions on how to get large datasets with a single request (which would be ideal), or at least a fairly simple way to turn one retrieve into multiple smaller requests and merge the data into a DataWindow seamlessly?
Regards,
Glenn