Scheduler sometimes not running, and sometimes creating duplicates in the target system

Hi all,

This is the problem I am facing.

This is how the flow works for this interface: the Trading Partner places files on the FTP server, from where they are picked up and put onto the webMethods adapter IS host XXXXXX. Some scripts rename and decrypt the files, after which they are pushed to the EDI box. Schedulers configured on the IS process these files and put them into the backend.

This is supposed to happen automatically via a cronjob on the same adapter IS host. But for some reason or another, it doesn't happen as it should, leading to the problems below.

1. Scheduled job is not processing files on time.

2. Sometimes, even when the job processes files on time, data gets duplicated at the target.

We have checked the cronjob script and found no flaw in it. Kindly help us find a resolution.

Thanks and Regards,
Yadhu.

When you say the schedulers are not running fine, what behaviour are you experiencing? Are the schedulers not running at all, or running with failures? Do you see any related logs on the IS? You might increase the IS logging level to capture useful information.

Do you have a file deletion/archiving procedure in place after inserting to the backend, so that the same files don't get processed again? If yes, double-check that it is working correctly.

Are you using the file polling port?

The behaviour we see is that the scheduler is posting the files twice to the target system, which should not be happening, since the service we are invoking should push each file only once.

Yes, a file archiving procedure is in place after inserting to the backend. Still, the duplication keeps occurring and the data is pushed twice at the target end.

Does the file get deleted/moved from the source once it's picked up? Is it possible to share the flow logic?

How often does the scheduled task run? Is the “Do not overlap” flag turned on? How long does one run typically take?

I suspect that the processing of the file is taking longer than the time between runs, and that the invoked service is picking up the file twice because it is being run twice.

If this is the case, there are a couple of approaches:

  1. Enable the “Do not overlap” flag on the scheduled task.
  2. Run the scheduled task less often to give a single run plenty of time to complete.
  3. When the service starts, have it look for a file to process and, when found, move it to a work directory. That way, if the service is run again (by accident or intentionally), it will not even see the file a second time. (See the sketch at the end of this post.)

I have a fairly high degree of certainty that it is not the scheduler that is the issue.
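
For what it's worth, here is a minimal sketch of approach 3 in plain Java. The directory paths and the process() placeholder are assumptions, not your actual setup; inside IS this logic would typically live in a Java service, and the atomic move assumes the inbound and work directories sit on the same filesystem.

```java
import java.io.IOException;
import java.nio.file.*;

public class ClaimFiles {
    // Hypothetical directories -- substitute the real EDI box paths.
    private static final Path INBOUND = Paths.get("/edi/inbound");
    private static final Path WORK    = Paths.get("/edi/work");

    public static void main(String[] args) throws IOException {
        Files.createDirectories(WORK);
        try (DirectoryStream<Path> files = Files.newDirectoryStream(INBOUND)) {
            for (Path file : files) {
                Path claimed = WORK.resolve(file.getFileName());
                try {
                    // The atomic move either succeeds or throws, so a second
                    // run that wakes up while this one is still busy will no
                    // longer see the file in the inbound directory.
                    Files.move(file, claimed, StandardCopyOption.ATOMIC_MOVE);
                } catch (NoSuchFileException alreadyClaimed) {
                    continue; // another run won the claim first
                }
                process(claimed); // the existing decrypt/translate/push logic
            }
        }
    }

    private static void process(Path file) {
        // Placeholder for the real processing.
        System.out.println("processing " + file);
    }
}
```

The atomic move is the important part: only one run can win the claim on a given file, which is what prevents the double push.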

Hi Reamon,
The scheduler runs every 1800 seconds (30 minutes) to check for the file. Yes, the “Repeat after completion” flag is enabled. The files take nearly 45 minutes to process. :confused:

Hi Shahid,

The files are deleted only after they get processed.

This is how the flow is.

The Trading Partner puts the files on the FTP server, and a cronjob picks the files up from FTP, places them on the Adapter IS, and then moves them to the Main IS. Once the files are on the Main IS, three schedulers run, pick up the files, process them, and place them in the backend. Those three schedulers run once a day to do this job (since the TP posts the files once a day).

Thanks

What leads you to state that the “Scheduled job is not processing files on time”? If it is scheduled to run every 30 minutes, and a file takes 45 minutes to process, then the scheduled task will always be “behind.”

In what ways might the service fail? And when it fails, how do subsequent runs avoid (or fail to avoid) processing a file a second time? Is there any logging to track the service's progress?

The key to avoiding processing a file twice is to move it out of the polled directory immediately when the service starts. That way you won't need complex exception-handling code to move the file when an error occurs (a rough sketch follows). The file polling port, which you should consider using if you're not already, does this for you.
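
To illustrate the point (the paths, the logger name, and pushToBackend are hypothetical, not your actual service), once the file has already been moved out of the polled directory, the failure handling stays simple: at worst a failed file is parked in an error folder for manual attention, and it can never be polled and pushed a second time.

```java
import java.nio.file.*;
import java.util.logging.Logger;

public class ProcessClaimed {
    private static final Logger LOG = Logger.getLogger("edi.processing");
    // Hypothetical directories -- adjust to the real layout.
    private static final Path ARCHIVE = Paths.get("/edi/archive");
    private static final Path ERROR   = Paths.get("/edi/error");

    static void processClaimed(Path file) throws Exception {
        try {
            LOG.info("start " + file.getFileName());
            pushToBackend(file); // the existing push-to-backend logic
            Files.move(file, ARCHIVE.resolve(file.getFileName()),
                       StandardCopyOption.REPLACE_EXISTING);
            LOG.info("done " + file.getFileName());
        } catch (Exception e) {
            // The file is already out of the polled directory, so a failure
            // here cannot cause it to be picked up and pushed again.
            Files.move(file, ERROR.resolve(file.getFileName()),
                       StandardCopyOption.REPLACE_EXISTING);
            LOG.severe("failed " + file.getFileName() + ": " + e);
            throw e;
        }
    }

    private static void pushToBackend(Path file) { /* placeholder */ }
}
```

Logging the start and end of each file also gives you the progress trail to answer the questions above.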

The thing here is that the TP posts the files once a day. Even though the scheduler runs every 30 minutes, it will pick up the file only after the TP puts it on the FTP server, and then it will process it. The strange thing is that this is a daily task, yet the duplicates happen only once every week or two (not daily). Also, this is in Production, and it is becoming really difficult to check where the problem lies…

Please find the attachment for the flow.
FLOW.jpg

We need to know specifically how the files are being processed by Integration Server.

  • Are you using a file polling port?
  • When your service is launched, how does it determine which file(s) to process? If it processes every file present in a particular directory, that may be contributing to the problem.

You state that the file is moved after it is successfully processed. What happens if the service encounters an error of some sort before it is able to archive the file?

Sorry for the late response, Reamon.

  • We are not using a file polling port here.
  • The scheduler checks the EDI box and picks up all the files that are available.

Since this issue does not happen at a regular interval, our suspicion is that there is no issue with webMethods itself. To make sure, we have written a cronjob script on the EDI box to check the files coming from the TP. The cronjob checks the files in the EDI box, and if any duplicate occurs, it moves it to a separate working folder and alerts the support group. In this way we are trying to eliminate the error nodes. We are waiting for the issue to occur again so we can pinpoint the exact source of the problem. Will keep you updated on this issue. :o
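
Roughly, the idea of the duplicate check is something like the sketch below. (The real job is a shell cronjob; this is just a Java illustration, and the paths and the in-memory seen set are placeholders -- in practice the checksums would be persisted between runs.)

```java
import java.nio.file.*;
import java.security.MessageDigest;
import java.util.*;

public class DuplicateCheck {
    // Hypothetical paths on the EDI box.
    private static final Path INBOUND    = Paths.get("/edi/inbound");
    private static final Path QUARANTINE = Paths.get("/edi/quarantine");

    public static void main(String[] args) throws Exception {
        Files.createDirectories(QUARANTINE);
        Set<String> seen = new HashSet<>(); // persist between runs in practice
        try (DirectoryStream<Path> files = Files.newDirectoryStream(INBOUND)) {
            for (Path file : files) {
                if (!seen.add(sha256(file))) {
                    // Same content seen before: park it and alert support.
                    Files.move(file, QUARANTINE.resolve(file.getFileName()),
                               StandardCopyOption.REPLACE_EXISTING);
                    System.err.println("DUPLICATE quarantined: " + file);
                }
            }
        }
    }

    private static String sha256(Path file) throws Exception {
        byte[] hash = MessageDigest.getInstance("SHA-256")
                                   .digest(Files.readAllBytes(file));
        StringBuilder sb = new StringBuilder();
        for (byte b : hash) sb.append(String.format("%02x", b));
        return sb.toString();
    }
}
```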

Thanks for your help.

Thanks,
Yadhu
