I need a service that will monitor an FTP folder and archive any files older than 2 months.
Can I achieve this with ‘standart’ instruments in webMethods? (I can’t find any standart service that can read file attributes.)
Or is there another way to solve this problem?
If so, is there an example in the documentation or in Java code?
The closest thing would be a File Polling port, but it does more than what you want. You’d probably have to create a service and then use the webMethods Scheduler (or some other scheduling mechanism) to have the service run on a periodic basis.
There aren’t many (barely any, actually) built-in services that do file manipulation and the like, but you can download PSUtilities from Advantage; that package has a few more sample services. If you do end up having to create your own Java service, it shouldn’t be difficult at all. Run a Google search or consult the Java API at java.sun.com and I’m sure you’ll find what you need.
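If it helps, here is a minimal sketch of the kind of Java helper this would involve (the class and method names are made up for illustration, not from any webMethods package): it lists the files in a directory whose last-modified time is more than a given number of days old, using only java.io.

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class FileAgeScanner {

    // Return the regular files in 'dir' whose last-modified time is more
    // than 'maxAgeDays' days in the past. Subdirectories are skipped.
    public static List<File> filesOlderThan(File dir, long maxAgeDays) {
        long cutoff = System.currentTimeMillis()
                - maxAgeDays * 24L * 60L * 60L * 1000L;
        List<File> old = new ArrayList<File>();
        File[] entries = dir.listFiles();
        if (entries != null) {          // null if 'dir' is not a directory
            for (File f : entries) {
                if (f.isFile() && f.lastModified() < cutoff) {
                    old.add(f);
                }
            }
        }
        return old;
    }
}
```

From an IS Java service you would read the directory name off the pipeline, call something like this, and hand the resulting list to whatever archiving step you choose.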
I think the file polling port actually does less than what you’d want–it doesn’t poll FTP servers. It only polls directories that are visible to the machine (local disks, network shares).
Unfortunately, none of the FTP services provide access to file attributes. You might be able to try pub.client.ftp:quote and the stat command to get what you’re after.
I’m assuming the folder is local (others FTP files to it, or something similar).
Most things can be done within webMethods, but not all things should be done through webMethods. (I often think that we are all too anxious to achieve things in webM, rather than asking whether it should be done in webM at all.) This sort of activity is typically better done at the OS level with scripting and cron.
That said, Percio’s advice also holds true. Java services and scheduler.
Oops. I skipped right over the “ftp” piece. Yes, if it’s an FTP folder, the file polling port is definitely not what you want (unless, of course, you can access it via UNC or something similar).
Take a look at the WmPublic/pub.client.ftp services. The service Rob indicated could help you in getting the file attributes depending on the FTP server. If you’re lucky and you’re dealing with files that have date stamps embedded in the name, then you may not need this service.
With all this said, I must say that Phil brings up a good point so consider simpler alternatives if possible.
So, after some brainstorming and analysis of the “standart” functionality, I’ve decided to write a simple Java service to solve this problem:
IDataCursor pipe = pipeline.getCursor();
try {
    String fileName = IDataUtil.getString( pipe, "FileName" );
    File file = new File( fileName );
    Date fileDate = new Date( file.lastModified() );
    IDataUtil.put( pipe, "fileDate", fileDate );
} catch ( Exception e ) {
    throw new ServiceException( e );
}
pipe.destroy();
This piece of code returns the file’s modification date; then, with PSUtilities, I compare dates and decide which of the files from the list must be archived. To work with FTP I use the standart pub.client services. Maybe it is not the best solution, because I’d like to solve this problem without writing Java code…
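For the “older than 2 months” comparison itself, a plain-Java sketch (the names here are illustrative, not a PSUtilities service) could use java.util.Calendar to step the cutoff back two months:

```java
import java.util.Calendar;
import java.util.Date;

public class ArchiveCheck {

    // True if 'fileDate' falls more than two calendar months before 'now'.
    public static boolean olderThanTwoMonths(Date fileDate, Date now) {
        Calendar cutoff = Calendar.getInstance();
        cutoff.setTime(now);
        cutoff.add(Calendar.MONTH, -2);  // e.g. 15 June -> 15 April
        return fileDate.before(cutoff.getTime());
    }
}
```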
Before we continue, can you confirm the role of FTP in this integration?
Are the files being FTP’d to a directory that IS has access to? If so, then you won’t need to use the pub.client services and the Java code you have here will indeed be what you need.
Or are the files located remotely and IS only has FTP access to them? If so, then the approach of downloading them and using the Java service you posted will not work–because the last modified date will reflect the date/time it was written to the local disk, not the modified time it had on the FTP server.
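To illustrate that point with java.nio (assuming a JVM recent enough to have it): Files.copy stamps the target with the current time unless you explicitly ask for the source’s attributes to be carried over, which is why a downloaded copy “loses” the original modification date.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class CopyTimestampDemo {

    // Copy 'src' over 'dst'. Only when 'preserveTimes' is set does the
    // copy keep the source's last-modified time; otherwise the target
    // gets the time of the copy itself.
    public static void copy(Path src, Path dst, boolean preserveTimes)
            throws IOException {
        if (preserveTimes) {
            Files.copy(src, dst, StandardCopyOption.REPLACE_EXISTING,
                    StandardCopyOption.COPY_ATTRIBUTES);
        } else {
            Files.copy(src, dst, StandardCopyOption.REPLACE_EXISTING);
        }
    }
}
```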
In either case, like Phil I too would recommend using something other than IS for this task.
Side note: Your English is excellent, but the word you want to be using is “standard,” not “standart.”
Hello reamon! Thank you for reply…
Actually I have two variants: accessing the FTP server directly (using the pub.client services), or using a configured alias to access the FTP location like a shared folder… in that situation I really won’t need the pub.client services, so I’ve reassembled my package with this variant.
I agree with all of you that a more suitable instrument could probably be found for this task, but as is often the case, the decision depends not on me but on the customer…
Also I have two other questions: do you know why the ‘copy’ operation changes the ‘modified date’ to the current date/time? And why, in webMethods (and per best practice), is it ‘not desirable’ to use self-written Java modules?
Not sure I understand what you mean by the “copy” operation, but I can weigh in with an opinion on the last question.
For the most part, Flow is easier to write and maintain than Java - at least to folks who don’t have significant Java experience. Since not all developers on a webM project will have strong Java skills, writing services in Flow allows for faster TTM and easier long-term maintenance (since more can maintain it). I can also teach someone to code good Flow faster than I can teach them to code good Java.
Debugging tools in the Developer are focused on Flow, making it easier to troubleshoot than Java services. There is no trace or step for Java in the Developer, so you have to use external tools.
However, sometimes Java is more appropriate than Flow. I try to follow two simple rules:
Rule 1) Try to write it in Flow first; write it in Java only if Flow can’t do it, or if the performance of the Flow service is terrible (and that performance is due to the Flow code’s behavior, not to some external cause).
Rule 2) if you write it in Java, keep it simple. If you have three things you need to achieve, that’s three small simple services, not one big one. If you are modifying a string in a document, set your input to be just the string and use flow to get the string out of the document, then pass it to your Java service. This keeps your Java pipeline and logic code cleaner and simpler.
Keep in mind these are just my rules, not defined best practices from SAG PS.
Can someone provide assistance in generating a list of elements/services currently locked to specific users?