We have a requirement where the client keeps files in a Unix directory, and new files pile up every day. I need to pick the files from that directory and process them one by one. In short, I have to load all the data from the flat files into the Oracle base tables.
Can anybody please let me know how to make this process work?
My requirement is something like this:
Log in to a remote server, pick multiple files from one of its directories, and process/load the files one at a time into Oracle Financials. Once a file has been processed, I have to archive it to another directory.
Can you please give me an idea of how to handle this case? How do you determine the oldest/newest files, the order in which to process them, etc.?
Thanks in advance for your help.
It all depends on how you design the complete solution. On the source side, make sure the flat files coming from the source follow some naming convention (e.g., order.txt). Ask the business group if this is possible. Then you can use FTP (if the files sit on a different server) or file polling (if the directory is a mapped network drive).
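For picking up the files in order, listing the directory and sorting by last-modified timestamp is the usual trick. Here is a minimal Java sketch; the /data/inbound path and the order*.txt naming pattern are just placeholders for whatever convention you agree on with the business group:

```java
import java.io.File;
import java.util.Arrays;
import java.util.Comparator;

public class FilePicker {

    /** Return the pending flat files, oldest first, so they get
     *  processed in arrival order. */
    public static File[] pendingFiles(File inboundDir) {
        // Keep only files that follow the agreed naming convention.
        File[] files = inboundDir.listFiles(
                (dir, name) -> name.matches("order.*\\.txt"));
        if (files == null) {
            return new File[0]; // directory missing or unreadable
        }
        // Oldest last-modified timestamp first.
        Arrays.sort(files, Comparator.comparingLong(File::lastModified));
        return files;
    }

    public static void main(String[] args) {
        for (File f : pendingFiles(new File("/data/inbound"))) {
            System.out.println(f.getName());
        }
    }
}
```

If two files can arrive within the same second, encode a sequence number or timestamp in the file name itself and sort on that instead of the file-system timestamp.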
For archiving, make sure the file was processed correctly before you move it. Follow the webMethods best-practices approach of try/catch.
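In code form, the pattern looks roughly like the sketch below: each file is loaded inside a try/catch, archived only when the load succeeds, and parked in an error directory otherwise. The loadIntoOracle call and the archive/error paths are hypothetical placeholders for your own service and directory layout:

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class FileProcessor {

    /** Process each file inside a try/catch: archive on success,
     *  move to an error directory on failure so nothing is lost. */
    public static void processAll(File[] files, Path archiveDir, Path errorDir)
            throws IOException {
        Files.createDirectories(archiveDir);
        Files.createDirectories(errorDir);
        for (File f : files) {
            try {
                loadIntoOracle(f); // hypothetical flat-file-to-table load
                // Archive only after the load has succeeded.
                Files.move(f.toPath(), archiveDir.resolve(f.getName()),
                        StandardCopyOption.REPLACE_EXISTING);
            } catch (Exception e) {
                // Park the failed file for investigation and carry on
                // with the remaining files.
                Files.move(f.toPath(), errorDir.resolve(f.getName()),
                        StandardCopyOption.REPLACE_EXISTING);
            }
        }
    }

    /** Placeholder for the real load (a webMethods service, SQL*Loader
     *  call, etc.); it should throw on any failure. */
    private static void loadIntoOracle(File f) {
        // ... your load logic here ...
    }
}
```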
Many people in this group have implemented solutions around this. Try searching the other posts and see if you can figure it out.