What is the best approach to transfer a large file from one resource to another via webMethods 6.0? Both resources talk to Integration Server via adapters. According to my understanding, webMethods has an inbound FTP server which can receive files, but not send them. Integration with an FTP server is required to move large files that you don't want to translate into a message. What is the best approach to handle this situation?
For example, I need to perform the following operation:
I place an order online and attach a 100 MB PDF file to it. The order will be translated into a message with the filename as one of the fields. What is the best way to move the 100 MB file to the necessary branch?
webMethods Integration Server can indeed send documents out using FTP. Review the pub.client.ftp services.
IS can “move large files that you don’t want to translate into a message” but it takes some work. (On a side note, there is no inherent need to translate anything, large or not, into a message to be successfully processed by IS.)
You can send and receive large files using the services within IS. Just be sure to use streams and never load the entire file into memory at once. Review the large doc handling documents for guidelines.
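For example, here is a minimal sketch in plain Java of that chunked approach (the filename is a placeholder); the same idea applies to any InputStream you get hold of in IS:

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ChunkedReadExample {
    public static void main(String[] args) throws IOException {
        // read a 64 KB buffer at a time instead of loading the whole file at once
        try (InputStream in = new BufferedInputStream(new FileInputStream("INBOUND.FILE"))) {
            byte[] buffer = new byte[64 * 1024];
            long total = 0;
            int read;
            while ((read = in.read(buffer)) != -1) {
                // hand each chunk to downstream processing here
                total += read;
            }
            System.out.println("Processed " + total + " bytes without buffering the whole file");
        }
    }
}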
We are having the same issue here where I work. I have seen that if a file is over 8 MB, it times out trying to process through our B2B gateway. I also would like to know how to process large files. I have been splitting the file in two, and the file then processes without error. This works, but it is poor from an administration standpoint and hard to automate.
Bob, how are you transferring your file at present? Our client has a requirement to transfer files whose sizes are much more than 8 MB. Are you using the flat file adapter to transfer the file?
These docs have information about large document handling:
Trading Networks Large Document Handling
webMethods EDI Module: Core Component
webMethods RosettaNet Module
Each of these describes how these components handle large documents. You can derive a custom approach for your own use from the information provided. Hope this helps!
We would like to use the FTP protocol to get files from our partner into our DMZ server; we have to do a simple get (from the partner) and put (onto the DMZ server). We want to schedule this service once a day, and we have to get multiple files of different sizes (max file size 10 MB) from the partner.
Does IS handle this, or do we need to do anything other than just get and put?
UMG, webMethods IS has a built-in scheduler that can do everything from basic to complex scheduling, so it won't have any problem running a service once a day. You can access the scheduler from the webMethods admin web pages (under Server, then Scheduler). For more information on the IS scheduler, have a look at the IS Administrator's Guide (starting on page 295 in the v6.0.1 guide).
Regarding your question about FTP’ing large files, please re-read Rob’s posts above.
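To add to that: inside IS the get/put itself maps onto the pub.client.ftp services (login, get, put, logout). For anyone who wants to see the shape of the daily relay outside IS, here is a hedged sketch using Apache Commons Net (org.apache.commons.net.ftp.FTPClient); the hosts, credentials, and paths are placeholders:

import java.io.FileInputStream;
import java.io.FileOutputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

public class DailyFtpRelay {
    public static void main(String[] args) throws Exception {
        String localCopy = "order-file.dat";

        // 1. GET the file from the partner's FTP server
        FTPClient partner = new FTPClient();
        partner.connect("ftp.partner.example.com");
        partner.login("partnerUser", "partnerPassword");
        partner.enterLocalPassiveMode();
        partner.setFileType(FTP.BINARY_FILE_TYPE);
        try (FileOutputStream out = new FileOutputStream(localCopy)) {
            partner.retrieveFile("/outbound/order-file.dat", out);
        }
        partner.logout();
        partner.disconnect();

        // 2. PUT the file onto the DMZ server
        FTPClient dmz = new FTPClient();
        dmz.connect("dmz.example.com");
        dmz.login("dmzUser", "dmzPassword");
        dmz.enterLocalPassiveMode();
        dmz.setFileType(FTP.BINARY_FILE_TYPE);
        try (FileInputStream in = new FileInputStream(localCopy)) {
            dmz.storeFile("/inbound/order-file.dat", in);
        }
        dmz.logout();
        dmz.disconnect();
    }
}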
We will be receiving large files via FTP stream to an IS service. It sounds like other people have seen problems and it was mentioned in this thread to
“Just be sure to use streams and never load the entire file into memory at once.”
Can anyone expand on this tip, or suggest other tactics to handle 5, 10, or 15 MB files that are FTP'd to webMethods?
I wrote a Perl script to split up the large file into 240800-byte files. You would have to test your system to find the optimum size for you. Then I wrote a Flow to split the file, do a DIR command to get the list of all the files ending in .seq, and then invoke my main Flow with each of the 20 or more smaller files. I also used memory cleanup in my Flows each time I dropped a large StringList or RecordList.
Note that a Perl interpreter is included on most Unix systems. There are free Perl interpreters for Windows available at CNET.com.
Here is the script:
#!/usr/bin/perl -w
use strict;

# where's the input data?
my $infile = 'INBOUND.FILE';

# how many bytes in each split?
my $splitsize = 240800;

open( IN, "<$infile" )
    or die "Can't open input file '$infile': $!\n";

# note: no gulping - instead ingestion via a small teaspoon
my $part = 0;
for( my($pos,$data,$got)=(1); !eof( IN ); $pos += $got ){
    if( defined( $got = read( IN, $data, $splitsize ) ) ){
        my $file = sprintf( "part%03d.seq", ++$part );
        open( OUT, ">$file" )
            or die "Can't open output file '$file': $!\n";
        print( OUT $data )
            or die "Can't write to output file '$file': $!\n";
        close( OUT );
    } else {
        die "read on '$infile' failed: $!\n";
    }
}
close( IN );
I am new to webMethods and I am having the following problem: my service uses the "pub.client.ftp.ls begin" step to log in to the directories and check whether any vendor files have arrived. More recently there have been failures on the FTP servers, which causes my service to hang as it waits for a response from the FTP site and never reaches the "pub.client.ftp.ls ok" step. Is there a way I can eliminate this problem, since even after the FTP server comes back up my services remain hung? Thanks in advance.
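One tactic that can help with the hang: probe the FTP port with a short connect timeout before invoking the ls step, and skip the polling cycle if the server does not answer. A minimal sketch in plain Java (host, port, and timeout are placeholders):

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class FtpReachabilityCheck {
    // returns true only if the FTP port accepts a connection within the timeout
    public static boolean isReachable(String host, int port, int timeoutMillis) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMillis);
            return true;
        } catch (IOException e) {
            return false;   // server down or not answering in time
        }
    }

    public static void main(String[] args) {
        if (!isReachable("ftp.partner.example.com", 21, 5000)) {
            System.out.println("FTP server unreachable - skipping this polling cycle");
        }
    }
}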
Basically, you have to receive the flat files using FTP, email, or file polling (ffdata) from the source into a flow service and then push them to TN using the routeFlatFile service if you are using IS 6.x. For pre-6.x webMethods versions you have to use custom content handlers and invoke a flow service that will route the document to TN.
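For reference, here is a hedged sketch of that routing step written as an IS Java service; the service name (wm.tn.doc.ff:routeFlatFile) and the input names (ffdata, TN_parms, DoctypeID) are quoted from memory, so verify them against the TN built-in services reference for your version:

import com.wm.app.b2b.server.Service;
import com.wm.data.IData;
import com.wm.data.IDataCursor;
import com.wm.data.IDataFactory;
import com.wm.data.IDataUtil;
import com.wm.lang.ns.NSName;
import java.io.InputStream;

public class RouteToTnExample {
    // hand an inbound flat-file stream to Trading Networks
    public static void routeFlatFile(InputStream ffdata, String docTypeId) throws Exception {
        IData pipeline = IDataFactory.create();
        IDataCursor cursor = pipeline.getCursor();
        IDataUtil.put(cursor, "ffdata", ffdata);            // the raw flat-file stream

        IData tnParms = IDataFactory.create();
        IDataCursor parmsCursor = tnParms.getCursor();
        IDataUtil.put(parmsCursor, "DoctypeID", docTypeId); // assumed TN_parms key
        parmsCursor.destroy();

        IDataUtil.put(cursor, "TN_parms", tnParms);
        cursor.destroy();

        Service.doInvoke(NSName.create("wm.tn.doc.ff:routeFlatFile"), pipeline);
    }
}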
Thanks for your quick response. I have one more question: can I use file polling on a remote server (i.e., the partner)? I do not see any settings in the file polling port to connect to the partner and poll a particular directory.
Hello,
Depending on what their security needs to be, you could have a client program that uses the WM API, runs locally on their computer, and sends data to your custom flow service. It is not a lot of code to do the sending, and it uses WM authentication.
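A minimal sketch of such a client, assuming the IS Java client API (com.wm.app.b2b.client.Context) shipped with the webMethods client libraries; the host, credentials, target service, and its input names are placeholders for your custom flow service:

import com.wm.app.b2b.client.Context;
import com.wm.data.IData;
import com.wm.data.IDataCursor;
import com.wm.data.IDataFactory;
import com.wm.data.IDataUtil;
import java.nio.file.Files;
import java.nio.file.Paths;

public class PartnerUploadClient {
    public static void main(String[] args) throws Exception {
        Context context = new Context();
        // connects with normal IS (WM) authentication
        context.connect("is-host:5555", "partnerUser", "partnerPassword");
        try {
            IData input = IDataFactory.create();
            IDataCursor cursor = input.getCursor();
            IDataUtil.put(cursor, "filename", "order-12345.pdf");   // assumed inputs of the custom service
            IDataUtil.put(cursor, "bytes", Files.readAllBytes(Paths.get("order-12345.pdf")));
            cursor.destroy();

            // invoke the custom flow service that receives the data;
            // a genuinely large file would be sent in chunks rather than one byte array
            context.invoke("partner.inbound", "receiveFile", input);
        } finally {
            context.disconnect();
        }
    }
}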