1. Mark Jones
  2. PowerBuilder
  3. Tuesday, 2 March 2021 22:36 PM UTC

I have been playing around with the Apache ECharts implementation provided with PB 2019 R3. I have found it to be very slow when dealing with large amounts of data. A key factor in this is the use of GetItem calls to extract data from datawindows to build the ECharts dataset.

I am developing an alternate solution where I grab a copy of the datawindow that is the data source, make all objects on the datawindow invisible, then dynamically build a visible computed field at x=1, y=1, height=64 that evaluates to a string containing one dataset row for the chart I am making. I also force all bands to 0 height other than the detail band, which I force to height 100. I then save the datawindow as ASCII (the only data that is saved is the value of my computed field) and I now have a file that contains a formatted dataset. I am actually developing all this in PB 12.5 right now, so it would work in any version. I can create the dataset for 3 columns and 18,500 rows in less than a second using this technique.

The nastiest aspect of this is dynamically building the computed field expression and handling different data types etc. 

This function will make the objects invisible and build the computed field:

of_setup_dw (datawindow adw_ref, string as_data_columns[]) returns integer

string syntax
syntax=adw_ref.object.datawindow.syntax

string ls_tag
string ls_object_names[],ls_object_types[],ls_object_type_filters[],ls_coltypes[]
long i

//Get all the objects and make them invisible
gd.inv_dw_util.of_get_dwobject_array(adw_ref,ls_object_names,ls_object_types,ls_object_type_filters)


string ls_export_dataset_expression
for i=1 to upperbound(ls_object_names)
adw_ref.modify(ls_object_names[i]+'.visible="0"')
next

//Create the computed field for the data columns
string rtns
string ls_compute
ls_compute='create compute(band=detail alignment="0" expression="~~"x~~"" border="0" color="0" x="1" y="1" height="64" width="174" format="[GENERAL]" html.valueishtml="0" name=c_export_dataset visible="1" font.face="Arial" font.height="-10" font.weight="400" font.family="2" font.pitch="2" font.charset="0" background.mode="2" background.color="16777215" background.transparency="0" background.gradient.color="8421504" background.gradient.transparency="0" background.gradient.angle="0" background.brushmode="0" background.gradient.repetition.mode="0" background.gradient.repetition.count="0" background.gradient.repetition.length="100" background.gradient.focus="0" background.gradient.scale="100" background.gradient.spread="100" tooltip.backcolor="134217752" tooltip.delay.initial="0" tooltip.delay.visible="32000" tooltip.enabled="0" tooltip.hasclosebutton="0" tooltip.icon="0" tooltip.isbubble="0" tooltip.maxwidth="0" tooltip.textcolor="134217751" tooltip.transparency="0" transparency="0" )'
adw_ref.modify(ls_compute)

adw_ref.modify("detail.height=100")
adw_ref.modify("header.height=0")
adw_ref.modify("summary.height=0")
adw_ref.modify("footer.height=0")

string ls_expr
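//~~ collapses to ~ in the PowerScript literal, and ~" is then an escaped quote inside the Modify string, so the expression begins with the string literal "["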
ls_expr='~~"[~~"'
//Now build the expression
for i=1 to upperbound(as_data_columns)
ls_expr+='+'

string ls_coltype,ls_object_type
string ls_val_expr

ls_object_type=adw_ref.describe(as_data_columns[i]+".type")
ls_coltype=adw_ref.describe(as_data_columns[i]+".coltype")
ls_val_expr=""

if ls_object_type="compute" then
long ll_coltype
//See if we have a hint on how to handle the data type of the computed field

//We may have a datetime we want to force to date.
ls_tag=lower(adw_ref.describe(as_data_columns[i]+".tag"))

ll_coltype=pos(ls_tag,"coltype(")

if ll_coltype > 0 then
ls_coltype=trim(gd.inv_str.of_parse_string(mid(ls_tag,ll_coltype+8),1,")"))
end if
end if

if ls_coltype='date' then
ls_val_expr=' ~~"~'~~"+string('+as_data_columns[i]+',~'yyyymmdd~')+~~"~'~~" '

elseif ls_coltype='time' then
ls_val_expr=' ~~"~'~~"+string('+as_data_columns[i]+',~'hh:mm:ss~')+~~"~'~~" '

elseif ls_coltype='datetime' or ls_coltype='timestamp' then
ls_val_expr=' ~~"~'~~"+string('+as_data_columns[i]+',~'yyyymmdd hh:mm:ss~')+~~"~'~~" '

elseif gd.inv_dw_util.of_is_number_datatype(ls_coltype) then
ls_val_expr='string('+as_data_columns[i]+')'

else
ls_val_expr=' ~~"~'~~"+gf_fix_html_chars('+as_data_columns[i]+')+~~"~'~~" '
end if

if i <> upperbound(as_data_columns) then ls_val_expr += "+', '"	//append the ', ' separator except after the last column
ls_expr+=ls_val_expr
next
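//Close the row with "]" and append a "," after every row except the last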
ls_expr+='+~~"]~~"+ if(getrow()=rowcount(),~~"~~",~~",~~")'

rtns=adw_ref.modify('c_export_dataset.expression="'+ls_expr+'"')

return 1
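Roughly, the function above would be driven something like this (the control names dw_report and dw_work, the column names and the file path are just placeholders, not part of my real code):

string ls_syntax, ls_errors, ls_cols[]
long ll_rc

//Build a throwaway copy of the report so the original layout is untouched
ls_syntax = dw_report.Describe("DataWindow.Syntax")
dw_work.Create(ls_syntax, ls_errors)
dw_report.ShareData(dw_work)   //reuse the rows already retrieved, no second trip to the database

//Placeholder column list for a 3 column chart
ls_cols[1] = "order_date"
ls_cols[2] = "region"
ls_cols[3] = "amount"
of_setup_dw(dw_work, ls_cols)

//Only the computed field is visible, so the ascii export is the finished dataset
ll_rc = dw_work.SaveAsAscii("c:\temp\echart_dataset.txt")

With those columns each detail row then evaluates to something like ['20210302', 'East', 42], and the If(GetRow()=RowCount(),...) piece drops the trailing comma on the last row, so the file ends up as one JSON-style array per row.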

 

mike S
  1. Wednesday, 3 March 2021 15:08 PM UTC
  2. PowerBuilder
  3. # 1

Mark, we do the same. As part of our application, we built a user graphing capability that works with any datawindow report; it uses a wizard to find out what/how the user wants graphed, and they get an ugly datawindow graph out of it that we stick on the bottom of the report. At some point we will review our graph options - use JavaScript-based tools, or the PB Ultimate Suite graphs/gauges.

I'd be interested in what you end up doing. 

 

mike S
  1. Tuesday, 2 March 2021 23:00 PM UTC
  2. PowerBuilder
  3. # 2

You should do all that in a DataStore; it will be faster.
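Roughly, the DataStore route could look something like this (names are just placeholders, and of_setup_dw would need an overload that accepts a datastore, since a DataStore cannot be passed where a datawindow argument is expected):

datastore lds_work
string ls_cols[]

lds_work = create datastore
lds_work.Create(dw_report.Describe("DataWindow.Syntax"))   //placeholder source report
dw_report.ShareData(lds_work)       //no visual control to repaint while the layout is modified

ls_cols[1] = "order_date"           //placeholder column list
ls_cols[2] = "region"
ls_cols[3] = "amount"
of_setup_dw(lds_work, ls_cols)      //assumes a datastore overload of of_setup_dw

lds_work.SaveAsAscii("c:\temp\echart_dataset.txt")
destroy lds_work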

Also, strip out the stuff you don't need in your compute's create string, for example

background.gradient

 

  1. Mark Jones
  2. Tuesday, 2 March 2021 23:47 PM UTC
Mike, thanks for the response. I agree about removing the extra attributes, but to be honest I only add one object so the cost is negligible. Regarding the database: this is going to be used by our clients to dynamically build graphs from existing datawindows that they have created. It is easier for us to give them a tool to assign a graph to the existing SQL than having them manipulate the SQL itself. However, adding code to build the dataset in the db might be a little faster. Now that I have made dataset creation faster, the slowest part of the operation is the ECharts graph rendering.
  1. mike S
  2. Wednesday, 3 March 2021 00:10 AM UTC
Gotcha - yeah, I thought you were adding a bunch of computes.



Can you summarize your data and then send it to your ECharts graph? You might be able to build something in PowerScript that takes the report data and summarizes it to a lot less than 18,000 rows...



  1. Mark Jones
  2. Wednesday, 3 March 2021 01:35 AM UTC
Mike, to be clear here, our company offers a software product to customers. We are planning on building in better graphing capabilities than are currently available in PB. Our customers may throw any data they want at the graph, and I just want to make sure we can handle large amounts of data without performance issues. The code provided with PB 2019 R3 is too slow with large amounts of data, so I am addressing this myself. Thanks for all your ideas!



Mark

