Sharing FTP Session

Hello All,

I receive IDocs in webMethods from SAP. They are converted to XML and sent to a single FTP location. The IDoc volume is heavy: we receive more than 15,000 IDocs per day, and for each one we run a sequence like this:
ftp:login
ftp:cd
ftp:put
ftp:logout
(Even if I use the single ftp service in WmPublic, it performs these operations internally.) So the service is logging in and out more than 15,000 times a day.
Now I want to log in once and use the returned sessionkey for the FTP operations. As long as the session has not timed out I can keep reusing it, so I would log in and log out only a few times and use the live session for the FTP puts.

FTP login consumes a lot of time, and I don’t think it’s a good idea to repeat unnecessary steps. (If we connect to an FTP server through a browser or the command prompt, we don’t log in for each FTP operation; we can do any number of things on a single session while it is alive.)

Has any of you worked on a similar requirement? I am running into some problems implementing the idea above. For example, when one service CDs to a directory, another service using the same session has to know that, and the two will conflict when they try to operate at the same time (for example, another CD to a different directory).

Any thoughts on this?

Cheers
Guna

"Beacuase ftp login is consuming more time and I don’t think its a good idea to repeat unnecessary steps "

While it seems intuitively obvious that logging in and out will consume more time, have you done any measurements to see if the time consumed is meaningful? Is logging in and out a real problem?
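
To make that measurable, here is a rough timing sketch in plain Java using Apache Commons Net (the host, credentials, directory, and file name are placeholders, and the webMethods ftp services may behave somewhat differently). It separates the connect/login cost from the put and the logout, which is the comparison that matters here.

    import java.io.ByteArrayInputStream;
    import org.apache.commons.net.ftp.FTPClient;

    public class FtpTiming {
        public static void main(String[] args) throws Exception {
            byte[] payload = "<IDOC/>".getBytes();   // tiny placeholder document

            long t0 = System.nanoTime();
            FTPClient ftp = new FTPClient();
            ftp.connect("ftp.example.com");          // placeholder host
            ftp.login("user", "password");           // placeholder credentials
            long t1 = System.nanoTime();             // connect + login

            ftp.changeWorkingDirectory("/inbound");  // placeholder directory
            ftp.storeFile("timing-test.xml", new ByteArrayInputStream(payload));
            long t2 = System.nanoTime();             // cd + put

            ftp.logout();
            ftp.disconnect();
            long t3 = System.nanoTime();             // logout + disconnect

            System.out.printf("login %d ms, put %d ms, logout %d ms%n",
                    (t1 - t0) / 1000000, (t2 - t1) / 1000000, (t3 - t2) / 1000000);
        }
    }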

Since IS is a multi-threaded environment, and you have little to no control over those threads, you have two choices that I can see.

  1. Continue on the path you’ve begun and use a single FTP session for all threads. To avoid conflicts, you’ll need to serialize (synchronize) access so that only one thread can be doing FTP at a time (see the first sketch after this list). You’ll also need to make sure that each FTP operation leaves the session in a good state for later use, and you’ll want to avoid CD commands that use relative paths (which may be a problem, depending on the FTP server and how things are set up).

  2. Implement an FTP connection pool (see the second sketch below). This provides a configurable set of persistent sessions that can be sized as needed. When a particular integration process needs a session, it borrows one from the pool, does the work, then returns the session to the pool. The session still needs to be left in a good state for later use.
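
For option 1, the serialization could look roughly like the following outside webMethods, using Apache Commons Net (host, credentials, and paths are placeholders). The point is that every operation is synchronized on the shared session and always does its own absolute cd, so the session is left in a known state for the next thread.

    import java.io.InputStream;
    import org.apache.commons.net.ftp.FTPClient;

    // Option 1 (sketch): one shared session, access serialized so only one
    // thread touches it at a time.
    public class SharedFtpSession {
        private final FTPClient ftp = new FTPClient();

        public synchronized void open(String host, String user, String pass) throws Exception {
            ftp.connect(host);
            ftp.login(user, pass);
            ftp.enterLocalPassiveMode();
        }

        // Each put does its own absolute cd, leaving the session in a known
        // state regardless of what the previous caller did.
        public synchronized void put(String absoluteDir, String name, InputStream data) throws Exception {
            ftp.changeWorkingDirectory(absoluteDir);
            ftp.storeFile(name, data);
        }

        public synchronized void close() throws Exception {
            ftp.logout();
            ftp.disconnect();
        }
    }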

Option 1 won’t scale very well. Option 2 can provide a nice balance between scalability and not logging in and out all the time, but you’ll have to implement your own pool, possibly with the help of an open-source library. If it were up to me, I’d simply log in and out every time unless there was a compelling reason not to; in that case I’d go with option 2.
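
For option 2, a minimal pool sketch built on Apache Commons Pool and Apache Commons Net might look like this (host, credentials, directory, and pool size are placeholders; a production version would also validate and re-connect stale sessions).

    import java.io.ByteArrayInputStream;
    import org.apache.commons.net.ftp.FTPClient;
    import org.apache.commons.pool2.BasePooledObjectFactory;
    import org.apache.commons.pool2.PooledObject;
    import org.apache.commons.pool2.impl.DefaultPooledObject;
    import org.apache.commons.pool2.impl.GenericObjectPool;

    // Option 2 (sketch): a configurable pool of persistent FTP sessions.
    public class FtpSessionFactory extends BasePooledObjectFactory<FTPClient> {

        @Override
        public FTPClient create() throws Exception {
            FTPClient ftp = new FTPClient();
            ftp.connect("ftp.example.com");      // placeholder host
            ftp.login("user", "password");       // placeholder credentials
            ftp.enterLocalPassiveMode();
            return ftp;
        }

        @Override
        public PooledObject<FTPClient> wrap(FTPClient ftp) {
            return new DefaultPooledObject<>(ftp);
        }

        @Override
        public void destroyObject(PooledObject<FTPClient> p) throws Exception {
            p.getObject().logout();              // closed only when evicted from the pool
            p.getObject().disconnect();
        }

        public static void main(String[] args) throws Exception {
            GenericObjectPool<FTPClient> pool = new GenericObjectPool<>(new FtpSessionFactory());
            pool.setMaxTotal(5);                 // size the pool as needed

            FTPClient ftp = pool.borrowObject(); // borrow a live session
            try {
                ftp.changeWorkingDirectory("/inbound");  // absolute path, known state
                ftp.storeFile("sample.xml", new ByteArrayInputStream("<IDOC/>".getBytes()));
            } finally {
                pool.returnObject(ftp);          // return it for the next caller
            }
        }
    }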

Or you could write a batch FTP process, although you’d also need somewhere to hold the data temporarily (some sort of queue).

One option would be to write a routine that saves the data to disk, using whatever filenames the destination FTP server requires.

Then write a scheduled task that polls this directory periodically and pushes all the files there in one FTP session: login, put, put, put, logout.

There may be better ways to hold the data than extracting it to a directory, but that seems a relatively simple option.
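
As a rough sketch of that scheduled push, assuming the IDoc XML has already been written to a spool directory (the directory, host, and credentials below are placeholders): list the directory, log in once, put every file, then log out and remove what was sent.

    import java.io.File;
    import java.io.FileInputStream;
    import java.io.InputStream;
    import org.apache.commons.net.ftp.FTPClient;

    // Scheduled batch push (sketch): one login, many puts, one logout.
    public class BatchFtpPush {
        public static void main(String[] args) throws Exception {
            File spool = new File("/data/idoc-spool");   // placeholder spool directory
            File[] files = spool.listFiles();
            if (files == null || files.length == 0) {
                return;                                  // nothing to send this run
            }

            FTPClient ftp = new FTPClient();
            ftp.connect("ftp.example.com");              // placeholder host
            ftp.login("user", "password");               // placeholder credentials
            ftp.enterLocalPassiveMode();
            ftp.changeWorkingDirectory("/inbound");      // placeholder target directory

            for (File f : files) {
                try (InputStream in = new FileInputStream(f)) {
                    if (ftp.storeFile(f.getName(), in)) {
                        f.delete();                      // remove only what was sent
                    }
                }
            }

            ftp.logout();
            ftp.disconnect();
        }
    }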