CGI script timeout after running Natural subprogram

Hey guys, I'm facing this problem.

I have researched the net and found that you can change the CGI timeout setting in IIS. I have already done that, in fact I changed it to 3600 seconds, but it still times out after 5 minutes.

Can anyone help? Eric? BTW, FYI, I'm using IIS 5.1.

Thanks.

And on top of that, can I make the Natural Web Interface more interactive and real-time? Meaning I could output strings to the HTML, e.g. using PERFORM W3TEXTLINE, while my subprogram is running in the background. If this works, then the CGI timeout won't happen. I hope…

Is the timeout because of:
retrieving the data, or
processing the data, or
the amount of data?

How often is this “big” generation needed?
How many parameters do you need to generate this request?
Is it necessary to have an up-to-date page, or is an “older” page adequate?

Maybe you can choose a different solution.

OK Eric, here is a summary of my subprogram. When I run it, it updates, or so to speak “syncs”, data from mainframe Adabas to Windows Adabas. FYI, there is a huge amount of data being transferred between the two environments.

Just to let you know, although there is a timeout, the subprogram is still running in the background until it completes its tasks. The question is: how can I overcome this timeout problem? I read something about the “server push” technique on the net; could it be applicable to Natural?

Thanks Eric.

If you only want to start up a long-running job, it is not a good idea to let your browser just wait.

A better way is to start up a second Natural process with CALL 'SHCMD' or USR1052N, with

OPTIONS TQMARK=OFF
#NATSTARTPATH := '"c:\....\nderun.exe"'
COMPRESS 'PARM=' *PARM-USER INTO #NATPARM LEAVING NO
COMPRESS #NATSTARTPATH #NATPARM ': stack=(logon mylib;' #MYPROG #FILE #DATA ';fin) SCREENIO ASYNCH' INTO USR1052L.OS-COMMAND

You can pass parameters (#DATA; mine is URL-encoded to avoid blanks), and if you create a file with a unique name

MOVE EDITED *TIMESTMP (EM=HHHHHHHH) TO #FILE                            /* machine-internal timestamp as hex
COMPRESS "C-" *HOSTNAME '_' *PID '_' #FILE '.tmp' INTO #FILE LEAVING NO /* result: C-<host>_<pid>_<timestamp>.tmp

you can control the process flow.

Create an empty file with the unique name (W3CHECK-RESOURCE, see its creation parameter) and pass this file name as a parameter to the called program.
Now you can check whether your program has finished:
If you can access the file and the file is not empty (W3READ-RESOURCE, error parameter), you can return the generated data or just display a ready message.
Otherwise your web program returns a page containing, in its HEAD section, a

<META HTTP-EQUIV="Refresh" CONTENT="30; URL=' #URL '">

tag.

Now your browser is in charge of automatically reloading your page after 30 seconds (the URL above should contain the unique file name as a parameter).

If the web program is called again with the name of the unique file, just check whether your other process is still running (file not accessible or not yet filled, see above). If it is, you can return the “wait” page again; otherwise you can return a success message and delete the unique file (W3DELETE-RESOURCE), and you are finished without any timeout problems from the browser, TCP/IP or whatever else is in between…
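
To make that concrete, here is a minimal sketch of the branch inside the web program. The field names, the URL path and the “file” parameter name are invented for the example; the actual resource check (W3READ-RESOURCE), the delete (W3DELETE-RESOURCE) and the page output (e.g. W3TEXTLINE) are only indicated as comments, because their parameter lists come from the Web Interface copycodes.

DEFINE DATA LOCAL
1 #FILE     (A64)    /* unique control file name, passed back as request parameter
1 #JOB-DONE (L)      /* to be derived from the W3READ-RESOURCE check (assumption)
1 #URL-BASE (A64)
1 #URL      (A128)
1 #HTML     (A253)
END-DEFINE
*
OPTIONS TQMARK=OFF   /* keep the " inside the META tag untouched, as above
*
#URL-BASE := '/cgi-bin/mywebprog'      /* hypothetical path of this web program
COMPRESS #URL-BASE '?file=' #FILE INTO #URL LEAVING NO
*
* set #JOB-DONE here via W3READ-RESOURCE: file accessible and not empty
IF #JOB-DONE
* return the generated data or a success message, then W3DELETE-RESOURCE the file
  IGNORE
ELSE
* still running: wait page with the refresh line in its HEAD section
  COMPRESS '<META HTTP-EQUIV="Refresh" CONTENT="30; URL=' #URL '">' INTO #HTML LEAVING NO
* hand #HTML to the page output, e.g. PERFORM W3TEXTLINE (parameters per its copycode)
END-IF
END

The very first call (without a file name) would build the unique name, start the background job as shown earlier and then fall into the same ELSE branch.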

You can also think of a cancel button: let your background program check from time to time whether the unique file still exists, and if it has been deleted, just finish.
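
For the cancel check on the background side, a rough sketch could look like this. It assumes the job receives the control file name through the stacked INPUT, probes the file as a work file, and treats a failing read as “cancelled”; the work file handling and the error number are my assumptions, not something from this thread.

* hypothetical background program MYBATCH in library mylib
DEFINE DATA LOCAL
1 #FILE  (A64)                 /* unique control file name, stacked by the start command
1 #DATA  (A250)                /* URL-encoded job parameters, stacked as well
1 #DUMMY (A1)
1 #DONE  (L)
END-DEFINE
*
INPUT #FILE #DATA              /* consumes the stacked parameters from stack=(...)
DEFINE WORK FILE 1 #FILE
*
REPEAT UNTIL #DONE             /* stands for the long-running sync loop
* ... process the next portion of data; set #DONE := TRUE when the sync is complete ...
* from time to time, see if the web side has cancelled us:
  READ WORK FILE 1 ONCE #DUMMY /* fails once the web program has deleted the file
  CLOSE WORK FILE 1            /* close again so the web side is able to delete it
END-REPEAT
*
* when done: write a result into the control file so the web program sees 'finished'
*
ON ERROR
  IF *ERROR-NR = 1500          /* assumption: open error because the file is gone
    STOP                       /* cancelled - just finish
  END-IF
END-ERROR
END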

Now you have implemented a basic ‘web batch’. Maybe you need a queue mechanism to avoid too many ‘batch’ programs running at once, or a console using
W3LIST-RESOURCE to display all ‘running’ and ‘finished’ ‘web batch’ programs…

There is an error in the previous example; it has to be:

OPTIONS TQMARK=OFF                      /* do not translate " to '
*
#NATSTARTPATH := '"c:\....\nderun.exe"' /* use only runtime 
#PARM := "natparm"                      /* not the natparm used for the server start up
*
COMPRESS 'PARM=' #PARM INTO #NATPARM LEAVING NO 
COMPRESS #NATSTARTPATH #NATPARM ': stack=(logon mylib;' #MYPROG #FILE #DATA ';fin) SCREENIO ASYNCH' INTO USR1052L.OS-COMMAND
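
Pulled together, starting the background job could look roughly like this. Only the OS-COMMAND assembly and the unique file name are taken from the snippets above; the DEFINE DATA block, the field lengths, the program name MYBATCH and the way USR1052N is CALLNATed are my assumptions, so compare them with the USR1052N example delivered in library SYSEXT.

DEFINE DATA
LOCAL USING USR1052L                     /* PDA of USR1052N, contains OS-COMMAND
LOCAL
1 #NATSTARTPATH (A253)                   /* names and lengths below are assumptions
1 #PARM         (A32)
1 #NATPARM      (A40)
1 #MYPROG       (A8)
1 #FILE         (A64)
1 #DATA         (A250)                   /* URL-encoded job parameters
END-DEFINE
*
OPTIONS TQMARK=OFF                       /* do not translate " to '
*
#NATSTARTPATH := '"c:\....\nderun.exe"'  /* use only runtime
#PARM         := "natparm"               /* not the natparm used for the server start up
#MYPROG       := 'MYBATCH'               /* hypothetical background program in mylib
*
MOVE EDITED *TIMESTMP (EM=HHHHHHHH) TO #FILE
COMPRESS "C-" *HOSTNAME '_' *PID '_' #FILE '.tmp' INTO #FILE LEAVING NO
*
COMPRESS 'PARM=' #PARM INTO #NATPARM LEAVING NO
COMPRESS #NATSTARTPATH #NATPARM ': stack=(logon mylib;' #MYPROG #FILE #DATA ';fin) SCREENIO ASYNCH' INTO USR1052L.OS-COMMAND
*
CALLNAT 'USR1052N' USR1052L              /* assumption: the PDA is passed as one block - check the SYSEXT example
END

After the CALLNAT has returned, the web request can end right away; the browser keeps polling with the unique file name as described above.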

Thanks man! I haven't been exposed to these Natural system commands before. It's very interesting…