Best approach to follow while using file polling in webMethods

What is the best approach to follow while using file polling in webMethods?

Do we need to use the Broker to publish the file name and later subscribe to the file name and continue the flow service?

I tried to get the file from the Completion_dir in the processing service.
But the service fails, and I think this is because the service tries to check for the file even before it has reached the directory.
Hence I am planning to use the Broker to publish the file name and subscribe to it later.
Is this the best way to work along with file polling?
Please suggest.

File polling is different and pub-sub is different…

In file polling, the polling service will monitor the working directory periodically for files… it will not throw any error even if a file does not exist… Please post the error details.
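For readers new to file polling ports, the behaviour described above can be sketched as a loop in plain Python (directory names and the `process` callback are placeholders, not webMethods API; an empty directory simply yields no work, not an error):

```python
import os
import shutil

def poll_once(monitoring_dir, working_dir, process):
    """One polling pass: move each file into the working directory,
    then invoke the processing callback on it.
    Returns the number of files handled; an empty directory is not an error."""
    handled = 0
    for name in sorted(os.listdir(monitoring_dir)):
        src = os.path.join(monitoring_dir, name)
        if not os.path.isfile(src):
            continue
        work = os.path.join(working_dir, name)
        shutil.move(src, work)   # claim the file into the working dir
        process(work)            # hand it to the processing service
        handled += 1
    return handled
```

The real port repeats such a pass on the configured polling interval and moves successfully processed files on to the completion directory.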

There is no error that I receive.

I can see files moved from Working_dir to Completion_dir.
Actually, my processing service is as follows:

Read file from Completion_dir
Insert file into DB
Copy file to Archive_dir
Delete file from Completion_dir
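For illustration, the four steps above can be sketched in plain Python (the DB insert is stubbed as a callback and the directory names are placeholders; this is not webMethods Flow code):

```python
import os
import shutil

def process_completion_file(path, archive_dir, insert_into_db):
    """Read a file from the completion dir, insert its content into the DB,
    copy it to the archive dir, then delete the original."""
    with open(path, "rb") as f:
        data = f.read()
    insert_into_db(data)             # DB insert (stubbed callback here)
    shutil.copy2(path, archive_dir)  # copy to Archive_dir
    os.remove(path)                  # delete from Completion_dir
```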

Here the files are neither moved to Archive_dir nor deleted from Completion_dir after processing.

Hence I assume the issue is that the flow service executes the get file even before the file has actually reached the Completion_dir.

To solve this issue I do a pub-sub.
Please suggest if there is any better approach.

Refer to the Integration Server Administrator's Guide to learn about configuring a file polling port.
You don't have to add logic in your IS service to copy and delete the files. You can configure the working directory, completion directory, and cleanup settings to do the job for you.

Follow the advice of mja4wm and review the documentation.

In the service that is invoked by the file polling port you don’t need to do anything explicit to open the file for reading. The file content is provided to your service automatically. Specifically how it is provided depends upon the content type you’ve specified in the file polling port.

The file is moved to the completion directory AFTER your service has successfully completed. You do not need to be reading the file from the completion directory. The completion directory is effectively your archive directory.

My content type is text/xml and I am expecting multiple files to be placed in the monitoring directory at the same time.
In this case I am listing the files in Monitoring_dir and doing a get file inside a loop to read all the files.

Please correct me if I am wrong.
Without an explicit read file, if we pass the node as input to the calling service, how do we process all the remaining files?

Yes, I understand that passing the input as a node and doing node-to-document will process all the files in the monitoring directory.

But how will I trace it?
I am not able to save/restore the pipeline.

Your service will be called by the file polling port for each file in the directory. Your service does not need to list the files in the monitoring directory. Nor do a get file.

Your service only needs to process the *node that is passed to it. It does not need to do any file management at all.
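As a rough analogue of what node-to-document does with the passed *node, in plain Python the XML content handed to the service would simply be parsed into a document structure (the element names here are made up for illustration):

```python
import xml.etree.ElementTree as ET

def node_to_doc(xml_text):
    """Rough analogue of xmlNodeToDocument: turn an XML document
    into a flat dict of its top-level child elements."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}
```

Each invocation of the polling service receives one file's content, so no file listing or looping is needed in the service itself.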

If you haven’t done so already, please review the documentation.

Thanks Reamon and mja4wm for your comments.

I understand very well that we receive each file as a node without any get file.

My two concerns with this approach are as follows:

1. Does this support large files of several MB?
2. Is there any way I could save/restore the pipeline and trace the processing of each file?

Any help?

1. Does this support large files of several MB? → You need to use stream-based processing for larger files, or if the file is in XML format you can use the getNodeIterator flows; review the documentation. Also keyword-search this forum on IS large file handling capabilities.
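The getNodeIterator idea — visiting one repeating element at a time instead of materialising the whole document in memory — corresponds to event-driven XML parsing. A plain-Python sketch using the standard library (the `record`/`id` element names are made up for illustration):

```python
import xml.etree.ElementTree as ET

def iter_records(stream, tag):
    """Yield the id of each <tag> element and free the subtree immediately,
    keeping memory flat regardless of file size."""
    for event, elem in ET.iterparse(stream, events=("end",)):
        if elem.tag == tag:
            yield elem.findtext("id")
            elem.clear()  # release the parsed subtree
```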

2. Is there any way I could save/restore the pipeline and trace the processing of each file? → Yes, but not for stream pipeline vars.

HTH,
RMG

Use Test | Send XML File… to test/trace through your service.

Thanks Reamon, but I have one last question (probably).

I think processing is based on the order in which files arrive in the monitoring directory.
How do I ensure that the next file is picked up only after the current one has been processed?
As my service involves JDBC transactions to insert/update, processing one file at a time and completing the previous transaction before moving on is very important.

How do I ensure this, as I now face issues when an update runs before the previous insert has completed?


Is there any way to ensure that files are processed in the same order they arrived in the monitoring directory?

On the file polling port configuration, set the “Maximum Number of Invocation Threads” to 1.
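The effect of a single invocation thread can be illustrated generically: with a one-worker pool, each file's processing (including its JDBC transaction) finishes before the next file is picked up. A plain-Python sketch of the idea, not IS internals:

```python
from concurrent.futures import ThreadPoolExecutor

def process_serially(files, process):
    """max_workers=1 mirrors 'Maximum Number of Invocation Threads' = 1:
    each file's processing completes before the next begins."""
    with ThreadPoolExecutor(max_workers=1) as pool:
        return list(pool.map(process, files))
```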

I think the order in which the files are processed is undefined, though generally it will be oldest first.
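If strict oldest-first ordering matters, the same idea can be enforced explicitly by sorting on file modification time before processing (a plain-Python sketch; the polling port itself may not guarantee this order):

```python
import os

def files_oldest_first(directory):
    """Return the file paths in a directory sorted by
    modification time, oldest first."""
    paths = [os.path.join(directory, n) for n in os.listdir(directory)]
    paths = [p for p in paths if os.path.isfile(p)]
    return sorted(paths, key=os.path.getmtime)
```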

Setting the invocation threads to 1 seems the apt solution for the ordering.

Thanks once again for your quick response. I think I can readily implement this approach.