1. Narayana Bhat P
  2. PowerServer
  3. Saturday, 22 June 2024 04:37 AM UTC

Hi Team,

 

What is the best method to import a high-volume text or Excel file? For example, one with 20 columns and 50,000 lines.

 

We have tried using DataWindow import, reading 100 lines per loop until the DataWindow import returns an error. This works perfectly in PowerBuilder, but it hangs in PowerServer and the application terminates once the session is over.

 

Narayana Bhat

Miguel Leeuwe Accepted Answer Pending Moderation
  1. Sunday, 23 June 2024 11:25 AM UTC
  2. PowerServer
  3. # 1

Does your server have enough RAM?

Comment
  1. Narayana Bhat P
  2. Monday, 24 June 2024 03:40 AM UTC
Dear Miguel,



Yes, the server is set up with Ubuntu Linux 22. Currently the server has only 5 concurrent user sessions.



CPU: 2

Memory: 4 GB



Narayana
  1. Helpful 1
  1. Miguel Leeuwe
  2. Monday, 24 June 2024 05:08 AM UTC
Hi,

Yes, I think that should be enough.
  1. Helpful
  1. Armeen Mazda @Appeon
  2. Tuesday, 25 June 2024 23:07 UTC
The import happens on the client side into the DW buffer so server-side memory doesn't matter.
  1. Helpful 1
Narayana Bhat P Accepted Answer Pending Moderation
  1. Sunday, 23 June 2024 03:55 AM UTC
  2. PowerServer
  3. # 2

Here is the code sample

 

	long ll_start_record, il_batc_record, ll_cnt, ll_end_record, il_column_count
	long ll_error, ll_batches   // loop status returned by ImportFile and count of completed batches
	// ib_stop is assumed to be an instance boolean set elsewhere (e.g. by a Cancel button)

	il_batc_record = 100
	il_column_count = long(dw_import.Object.DataWindow.Column.Count)

	DO WHILE ll_error <> -1
		ll_cnt++

		IF ib_stop = TRUE THEN
			ll_batches = ll_cnt - 1
			EXIT
		END IF

		/* Getting the start record of the batch */
		ll_start_record = (il_batc_record * (ll_cnt - 1)) + 1

		/* Getting the end record of the batch */
		ll_end_record = il_batc_record * ll_cnt

		/* Resetting the DataWindow before importing the next batch */
		dw_import.Reset()

		/* Importing the records into the DataWindow batch by batch */
		ll_error = dw_import.ImportFile('import.txt', ll_start_record, ll_end_record, 1, il_column_count, 1)

		/* If the end of the file has been reached, exit the loop */
		IF ll_error < 0 THEN
			IF ll_error = -2 THEN
				MessageBox("Oops", "Empty File Found")
				EXIT
			ELSE
				ll_batches = ll_cnt - 1
				EXIT
			END IF
		END IF

		/* Some validation and data setting on the DataWindow goes here */
		/* Update the data to the database */
	LOOP
Comment
  1. David Peace (Powersoft)
  2. Wednesday, 26 June 2024 14:54 UTC
You have not shown the update code and commit points. Do you commit within each loop?
  1. Helpful
  1. Narayana Bhat P
  2. Thursday, 27 June 2024 03:25 AM UTC
Dear David



We are updating the data for each batch, i.e. in the above example after every 100 records are completed. After that, the next set is imported into the DataWindow, and so on until we reach the end of the records in the file.
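For reference, a minimal sketch of that per-batch update and commit, assuming the standard SQLCA transaction object and an updatable DataObject on dw_import, would be something like:

	/* Run at the end of each batch: update the DataWindow rows and commit,
	   so the work is not held open until the whole file is processed */
	IF dw_import.Update() = 1 THEN
		COMMIT USING SQLCA;
	ELSE
		ROLLBACK USING SQLCA;
		MessageBox("Import", "Update failed on batch " + string(ll_cnt))
	END IF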
  1. Helpful
mike S Accepted Answer Pending Moderation
  1. Saturday, 22 June 2024 15:07 UTC
  2. PowerServer
  3. # 3

Check the PS timeout settings.

If you want to run it in PS, make sure the process commits while processing. So if you are importing 50,000 items, each of which is a separate thing, then do a commit for each of them rather than loading all 50,000 and committing at the end. If you wait to commit until the entire file is processed, it will often time out.

Also, if you are doing a lot of validation of the data by retrieving data from the server, that will slow things down; look into caching that data. Review the processing in general; you should be able to get it to run faster by changing up some of these things.
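As an illustration of the caching idea, a sketch along these lines could load the lookup values into a DataStore once before the loop and validate each imported row locally; d_valid_codes and the code column are hypothetical names used only for the example:

	/* Load the lookup data once, before the import loop starts */
	datastore lds_codes
	lds_codes = CREATE datastore
	lds_codes.DataObject = "d_valid_codes"   // hypothetical lookup DataWindow object
	lds_codes.SetTransObject(SQLCA)
	lds_codes.Retrieve()

	/* Inside the loop: validate against the cached rows instead of
	   issuing a SELECT against the server for every imported row */
	long ll_row, ll_found
	FOR ll_row = 1 TO dw_import.RowCount()
		ll_found = lds_codes.Find("code = '" + dw_import.GetItemString(ll_row, "code") + "'", &
			1, lds_codes.RowCount())
		IF ll_found <= 0 THEN
			/* handle the invalid row here (flag it, skip it, etc.) */
		END IF
	NEXT

	DESTROY lds_codes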

 

Ultimately, running it on a server as Chris suggested is the fastest way to process. You are doing a large volume of transactions, so uploading the file to a machine local to the database will always be fastest. You can write your own, along with a console application, or you can look into using Topwiz Software - TopwizWeb (topwizprogramming.com) to do it all in PB.

Comment
  1. Narayana Bhat P
  2. Sunday, 23 June 2024 03:56 AM UTC
Hi Mike,

Thanks for the suggestion. As you explained, we are doing the DB update in batches of 100. I have attached a sample script as well.



The same script works very well in PowerBuilder but fails in PowerServer. I don't know what the issue is.
  1. Helpful
  1. John Fauss
  2. Tuesday, 25 June 2024 14:27 UTC
Hi, Narayana - PowerBuilder and PowerServer work very differently under the covers. Your statement that "I don't know what is the issue" leads me to believe you may not fully understand and appreciate these differences.

I suggest you familiarize yourself with them:

https://docs.appeon.com/ps2022r3/Database_connection_concepts.html

https://docs.appeon.com/ps2022r3/performance_guide.html
  1. Helpful
Chris Pollach @Appeon Accepted Answer Pending Moderation
  1. Saturday, 22 June 2024 11:59 AM UTC
  2. PowerServer
  3. # 4

Hi Narayana;

  Suggestion:  I would change the system design to FTP these external files over to your server. Then run your PB import App as a C/S application on the DB server connecting directly to the DBMS as before. That way, the high volume interaction could use native DB client connectivity for exponentially better performance. 

Food for thought. 

Regards ... Chris 

Comment
  1. Narayana Bhat P
  2. Sunday, 23 June 2024 03:58 AM UTC
Hi Mike and Chris,



Thanks, can I get a sample application for the above method, so that I can try it this way? This looks promising.



Narayana
  1. Helpful
  1. Chris Pollach @Appeon
  2. Sunday, 23 June 2024 15:39 UTC
FYI...

Example of Paid product: https://www.example-code.com/powerbuilder/ftp.asp

Example of Free code: https://www.topwizprogramming.com/freecode_ftpclient.html

HTH

Regards ... Chris

  1. Helpful
  1. Narayana Bhat P
  2. Tuesday, 25 June 2024 04:35 AM UTC
Dear Chris,



Thanks, I will go through both the FTP and API methods.
  1. Helpful
Miguel Leeuwe Accepted Answer Pending Moderation
  1. Saturday, 22 June 2024 10:56 AM UTC
  2. PowerServer
  3. # 5

Try FileReadEx()?

Try creating a DataWindow and using ImportFile()? (See the sketch after these suggestions.)

Try creating a macro in Excel that inserts the data into a database?
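For illustration, here is a rough sketch along those lines, reading the whole file into memory with FileReadEx() and importing it into the DataWindow with ImportString(). It assumes an ANSI-encoded text file that fits in memory and the dw_import control from the other answer; batching and per-batch commits would still apply for very large files.

	integer li_file
	long    ll_bytes, ll_rows
	blob    lblb_data
	string  ls_data

	/* Open the file in stream mode and read it into a blob in one call */
	li_file = FileOpen("import.txt", StreamMode!, Read!, LockRead!)
	IF li_file = -1 THEN RETURN

	ll_bytes = FileReadEx(li_file, lblb_data)
	FileClose(li_file)

	/* Convert the blob to a string (assumes an ANSI-encoded file) and import it */
	ls_data = String(lblb_data, EncodingANSI!)
	ll_rows = dw_import.ImportString(Text!, ls_data)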

Comment
There are no comments made yet.