We are using wM 6.1 on a Unix server and are trying to use the pub.client.ftp:login / cd / get / logout services to pull a couple of flat-file (FF) data files (8 MB to 60 MB each) from a remote FTP server daily.
My original flow design was a single login followed by a loop that gets each file, but somehow wM took too long to process them all and lost the connection after transferring a large (50 MB) file.
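For reference, the original flow roughly looked like this (service names from the post; fileList is a placeholder name for the list of remote file names):

```
pub.client.ftp:login      (host, user, pass, transfertype = active)
LOOP over fileList
    pub.client.ftp:get    (remote file name -> localfile on disk)
END LOOP
pub.client.ftp:logout
```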
I also tried setting watt.net.ftpConnTimeout and watt.net.ftpDataConnTimeout to 720000 and re-logging in/out inside each file loop, but still got the same issue.
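For anyone else looking for these, the extended settings go into the server configuration (for example via Settings > Extended in the IS Administrator, or server.cnf). The values are in milliseconds, so 720000 is 12 minutes; verify the exact property names against your IS version's documentation:

```
watt.net.ftpConnTimeout=720000
watt.net.ftpDataConnTimeout=720000
```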
Does anyone have experience delivering large files via wM FTP? Or do I have to write Java services or use third-party libraries to handle this?
I’ve checked with our system admin and no connection-time or file-size limits have been set. Also, if I use ftp from the Unix command line to get these files, they take less than 20 minutes without any issue.
Any suggestions?
Hi Rob, thanks for the reply.
ftp:login > transfertype is active; no firewall in between.
ftp:get > transfermode is ascii; encoding is default.
Actually, with the wM built-in FTP services, transfer speed is very slow: a 40 MB file takes more than 1.5 hours. If I manually FTP a couple of files (140+ MB total) in one session, it takes only around 20 minutes.
Any suggestions?
Are you specifying a localfile parameter? If not, the entire file is loaded into memory, which would account for the sluggish speed. So option one is to specify a localfile, which could be a network share, to save the complete file first, then process it as needed.
Another option you might consider is using a “real” FTP server to move the file to a share, then using file operations to open and process the file from there.
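Outside of IS, the “stream to disk instead of buffering in memory” behavior Rob describes can be sketched in a few lines of Python with the standard-library ftplib client (host, credentials, and paths below are hypothetical; active mode and the 12-minute timeout mirror the settings mentioned in this thread):

```python
import ftplib

def download_to_file(host, user, password, remote_path, local_path,
                     blocksize=64 * 1024):
    """Fetch remote_path into local_path, streaming chunk by chunk to disk."""
    ftp = ftplib.FTP()
    ftp.connect(host, 21, timeout=720)   # seconds; mirrors the 720000 ms setting
    ftp.login(user, password)
    ftp.set_pasv(False)                  # active mode, as used in this thread
    try:
        with open(local_path, "wb") as fh:
            # retrbinary calls fh.write for each chunk as it arrives,
            # so the whole file is never held in memory at once
            ftp.retrbinary("RETR " + remote_path, fh.write, blocksize)
    finally:
        ftp.quit()
```

Because each chunk is written as it arrives, memory use stays flat regardless of file size, which is the same effect the localfile parameter gives the built-in service.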
Yes, I’m using the localfile parameter; that’s why I didn’t get an “out of memory” error (some people do) but a “lost connection” instead.
For the 2nd option, I agree with you. I’m now trying a third-party FTP Java library from www.enterprisedt.com, and the results look very close to command-line ftp speed.
Thanks a lot for the help.