FTP mput and mget

Hi All,

I am writing a flow service (webMethods 6.1) that pulls files from server "A" and places them on server "B" using FTP. I am using pub.client.ftp:mget to pull the file(s) from the source system and pub.client.ftp:mput to place them on the remote system. The problem is that I don't want to dump all the files from system A into local storage and then delete them after placing them on the remote system. Is there any alternative way of doing this?

Thanks,
Prashanth.P

Prashanth,

Once your files have been processed from system A, move them to an Archive folder with a datetime stamp appended to the name, and in case of any errors (based on the FTP return status or transient errors) move those particular files to an Error folder.

This way you retain the ability to reprocess the errored documents once they are corrected.
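As a rough illustration (outside webMethods, with made-up directory names), moving a processed file into an Archive or Error folder with a datetime stamp appended might look like this in Java:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class ArchiveMover {

    // Hypothetical helper: move a processed file into the given folder,
    // appending a datetime stamp so reprocessed files never collide.
    static void moveWithStamp(Path file, Path targetDir) throws IOException {
        String stamp = LocalDateTime.now()
                .format(DateTimeFormatter.ofPattern("yyyyMMddHHmmss"));
        Files.createDirectories(targetDir);
        Path target = targetDir.resolve(file.getFileName() + "." + stamp);
        Files.move(file, target, StandardCopyOption.REPLACE_EXISTING);
    }

    public static void main(String[] args) throws IOException {
        // Successful files go to Archive, failed ones to Error.
        moveWithStamp(Paths.get("inbound/file1.txt"), Paths.get("Archive"));
        moveWithStamp(Paths.get("inbound/file2.txt"), Paths.get("Error"));
    }
}
```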

HTH,
RMG.

Gupta,

Thanks for the response.

Let me clearly explain my problem.

file1.txt and file2.txt exist on system A.
The flow service gets these files onto the local system and disconnects the session.

file1.txt and file2.txt are then placed on the remote system, and both the local files and the source files are deleted.

Now, I don't want file1.txt and file2.txt to be copied to the local hard disk where webMethods is installed; instead, I want to transfer them directly from the source A FTP directory to the destination B FTP directory without involving the local file system.

I searched through some of the other built-in services, but then we have to use the get command instead of mget, where I would have to close the FTP session for each file and open another FTP session for sending each file. For the second, third, … files we would have to open and close sessions for both the source and the destination.

Done that way, if there are 100 files on the source system, 100 sessions will be opened for the source and another 100 for the destination.

Any inputs? Which would be best practice: mget or get?

Regards,
Prashanth

Danny,

Job postings are not allowed in this section of the forum. Please re-post it in the JOBS section of this site and you will get replies.

HTH,
RMG.

Gupta, any thoughts on my explanation above?

Regards,

If you are transferring a batch of files, then mget is the better option; otherwise, list the files in the inbound folder and use the get service.
Even though connections are opened, you terminate the sessions once the FTP succeeds; still, your code should be robust so that the transport process completes cleanly.

Just some thoughts,

I agree that batch processing (mget) is the better way of transferring, but the only reason I don't use it is that I don't want to dump the files in a local folder.

So the only way I have found is to use the get command, with a session opened and closed for each file listed on the source system.

I have now written both services so I can compare their performance, but I don't know how to measure it.

Please let me know how to do this.

Regards,
Prashanth

“I agree that batch processing (mget) is the better way of transferring…”

The definition of “better” depends on what you’re trying to do. If storing files in the local file system is not desired, then mget is not better; it’s worse.

Efficiency-wise, there is not all that much difference between doing an mget and doing an ls and looping over multiple gets.

“where I would have to close the FTP session for each file and open another FTP session for sending each file.”

Why do you have to do this? Would this approach work:

  • login to FTP server A
  • login to FTP server B
  • cd on server A if needed
  • cd on server B if needed
  • ls from server A
  • loop over dirlist
    • get file from server A
    • put file to server B
  • logout from server B
  • logout from server A

A key part of this working is to manage the pipeline appropriately so that the FTP sessions don’t get mixed up. This is probably most easily done using scope on each service call.

With this you’ll have one session to each server and will effectively be acting as a file-transfer bridge. (Even more efficient would be to have a script on FTP server A simply push the files directly to server B; would that work?)
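To make the sequence concrete, here is a minimal sketch of the same bridge pattern written against Apache Commons Net in plain Java rather than as a flow service; the hostnames, credentials, and directory paths are placeholders. In a flow service the equivalent steps would be calls to pub.client.ftp:login, ls, get, put, and logout, with the two sessions kept separate (e.g. via scoped invocations) as described above.

```java
import java.io.InputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

public class FtpBridge {
    public static void main(String[] args) throws Exception {
        FTPClient source = new FTPClient();   // session to server A
        FTPClient target = new FTPClient();   // session to server B
        try {
            source.connect("serverA.example.com");
            source.login("userA", "secretA");
            source.enterLocalPassiveMode();
            source.setFileType(FTP.BINARY_FILE_TYPE);
            source.changeWorkingDirectory("/outbound");

            target.connect("serverB.example.com");
            target.login("userB", "secretB");
            target.enterLocalPassiveMode();
            target.setFileType(FTP.BINARY_FILE_TYPE);
            target.changeWorkingDirectory("/inbound");

            String[] names = source.listNames();     // "ls from server A"
            if (names != null) {
                for (String name : names) {           // "loop over dirlist"
                    // Stream each file from A's data connection straight into B;
                    // nothing is ever written to the local file system.
                    try (InputStream in = source.retrieveFileStream(name)) {
                        if (in == null || !target.storeFile(name, in)) {
                            throw new IllegalStateException("Transfer failed for " + name);
                        }
                    }
                    // Finish the pending RETR on A before starting the next file.
                    if (!source.completePendingCommand()) {
                        throw new IllegalStateException("Transfer incomplete for " + name);
                    }
                }
            }
        } finally {
            if (source.isConnected()) { source.logout(); source.disconnect(); }
            if (target.isConnected()) { target.logout(); target.disconnect(); }
        }
    }
}
```

Whether in Java or in a flow service, the point is the same: both sessions stay open for the life of the loop, and each file is relayed from A to B in memory, so nothing lands on the Integration Server's disk.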

Prashanth,

Hope you got what you were looking for. Please follow Rob’s posting; that way you’ll have one session on each server, and once all the files are processed you log out of the sessions at the end.

Rob,
Thanks a lot for the idea; that is exactly what I wanted.

It hadn’t occurred to me that I could open two sessions at the same time to two different locations.

Anyway, this solves my problem.

Again, thanks for the idea.

Regards,
Prashanth

Prashanth,

I’m guessing that the requirement not to leave files on the local server is a security concern? One alternative I can think of is to use OpenSSH. There are two components: an OS-level package (available for most *nix platforms and Windows) and a webMethods-provided package (free, and unsupported). In the OpenSSH suite of protocols there is scp (Secure Copy), which can do remote-to-remote copy. If you have a system administrator knowledgeable about OpenSSH, it’s worth a try. It’s secure (encrypted) and can do batch transfers (like mget) too… Search wMUsers for OpenSSH…
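For reference (the hosts, users, and paths below are made up), a remote-to-remote copy with scp looks something like:

```
scp userA@serverA:/outbound/file1.txt userB@serverB:/inbound/
```

Depending on the OpenSSH version and options, the data either flows directly between the two servers or is relayed through the machine running the command (the -3 flag forces the relay); check the scp man page for your version.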

Yuan,
Thanks for the update.
The reason for not dumping the files locally is to avoid excessive disk activity, since we transfer 1,000+ files every day.
I will get back to you if I need anything from OpenSSH.