We have a third-party integration developed using the OpenSSH SFTP implementation.
Our integration uses pub.openssh.sftp.
Brief explanation of our integration:
We receive files from our external partner GXS via SFTP into webMethods. A scheduler runs every five minutes, gets all files into a local folder, and then deletes them from the remote folder.
The problem is with the remove command.
GXS also has a scheduler that keeps placing files into the monitored folder.
For example, suppose our scheduler's first run takes five minutes. If GXS places more files into the folder before the rm command executes, our initial scheduler run removes all files from the remote folder, including ones that were never processed.
Can you suggest an alternative approach?
If anyone knows how to write such a script, please let me know.
If the remote server is the one removing the files, then in the worst case, if our scheduler doesn't pick up the files in time, the remote server will remove them from the folder before we ever get them.
It's better to change your design. Instead of deleting, you can move each file from the original FTP location to a separate processing location on the same FTP source, and then get that file to your IS. That helps you keep track of which files were processed, and it prevents the deletion from colliding with files GXS drops in the meantime. You can use the SFTP rename and get commands together for that.
Note: they can create a script to delete files from that processing directory periodically, to prevent accumulation of files.
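The rename-then-get idea above can be sketched as an sftp batch file. Everything here is a hypothetical placeholder (the host `partner-host`, the user `isuser`, the remote paths `/inbound` and `/inbound/processing`, the local path `/local/in`, and the file name `order1.xml`); only files your run has renamed into the processing directory get fetched and removed, so files GXS drops later are untouched.

```shell
# Sketch: "claim" a file by renaming it into a processing directory,
# then fetch and delete only what was claimed. All paths, the host,
# and the file name are placeholders for illustration.
cat > /tmp/sftp_claim.batch <<'EOF'
-mkdir /inbound/processing
rename /inbound/order1.xml /inbound/processing/order1.xml
get /inbound/processing/order1.xml /local/in/order1.xml
rm /inbound/processing/order1.xml
EOF
# Run non-interactively against the partner host (not executed here):
# sftp -b /tmp/sftp_claim.batch isuser@partner-host
echo "batch file written with $(wc -l < /tmp/sftp_claim.batch) commands"
```

In practice the batch file would be generated per run, one rename/get/rm triple for each file present at the moment the scheduler fires; the leading `-` on `mkdir` tells sftp to ignore the error when the directory already exists.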
Hi Deepti, after processing the files from the remote SFTP server, you can move the processed files to an archive or processed directory locally and then delete them from the SFTP server. That way, if you ever need to reprocess a file, you can get it from your local directory.
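The local-archive step suggested above can be sketched as follows. The directories and file name are placeholders, and the remote delete is shown only as a comment; the point is that the local move succeeds before the remote copy is ever removed.

```shell
# Sketch: keep a local copy of each downloaded file in a processed/
# directory before deleting it on the SFTP server. Paths and the
# file name are hypothetical placeholders.
local_in=/tmp/demo_local/in
processed=/tmp/demo_local/processed
mkdir -p "$local_in" "$processed"
echo '<order/>' > "$local_in/order1.xml"   # stand-in for a downloaded file
mv "$local_in/order1.xml" "$processed/"    # archive locally first
# Only after the move succeeds would the remote rm be issued, e.g.:
# echo "rm /inbound/order1.xml" | sftp isuser@partner-host
ls "$processed"
```

A periodic cleanup job can prune the processed/ directory after your retention window, mirroring the note about the remote processing directory.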