Downloading a file from a website

Hello Guys!!!

I have a situation in WM where I have to log in to a site (URL removed), manually open the folders on the left side of the page, and select Accounting Reporting/Transaction Report.
That takes you to a page that lets you choose what information to extract for the file. (Please look at the sample data below.)

Basically, log in to the above site and click the “Accounting Reporting/Transaction Report” folders, which take you to a
different screen where you pick the file in whatever format you would like, based on the dates.


Is there any way I can do this in WM? I know it needs manual intervention.

Thank You guys in advance!!!


Are you asking how to process this file once you get it into webMethods or are you asking how to get this file into webMethods from the external web server?


Hi Mark,

I am already processing this BAI2 .dat file from one of the UNIX directories. I am fine with that part.

I am looking for something like this
Go to a website (URL removed), click a button that takes you to another screen, and select the file format we would like. We can download the bank file in different formats, but in my case I need the BAI2 format, as mentioned above.

I am not talking about FTPing to a remote server and doing a getFile, but about logging in to a website and downloading a file, which requires manual intervention.
Does that make sense, Mark? Please let me know if I still haven’t made my point clear.

Thank You very much!!!

I’m not sure, but it sounds like you want to create a new web page that will help automate the process of logging into the external site, navigating through a series of menus and downloading a file in the desired format. You are OK with some manual steps, but not as many manual steps as you have now.

If this is correct then I don’t think this is a webMethods question, but rather a web development question. A few releases ago there was something called WebTap that basically screen scraped a web page, but that’s been gone for some time now.

I would suggest contacting the developer or support contact for the external site to determine what, if any, options they offer for automating the process. Perhaps they already provide a web services-based interface that will let you retrieve the data directly rather than by navigating through their web pages.

If they don’t offer anything besides their ASP-based web interface, you may be able to use MS Visual Studio to create a new web page that would reduce the manual intervention. That said, that option seems like more work than it would be worth just to save a few steps.



Priyatham - As Mark said, contact the site’s developers to see if they offer a more machine-accessible interface. The advantage is changes to such an interface are coordinated with its users, but changes to a public website’s are not - even a small HTML change has the potential to break a web-scraper.

If not, try knocking up a simple sequence of pub.client:http calls in a Flow service - this can give you what you need. However, you’d need to understand HTML and reverse-engineer the HTTP POST variables for each page: doable but nasty.
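For what it’s worth, the hard part of that approach is just replaying the form POSTs a browser would send. A rough Java sketch of the request-building step (the form-field names here are hypothetical - you’d get the real ones by viewing the page’s HTML source or watching the browser traffic):

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the request-building step behind a chain of pub.client:http calls.
// Field names (j_username, reportType, etc.) are hypothetical placeholders --
// reverse-engineer the real ones from the site's login and report-selection forms.
public class BankSiteScrape {

    // Build an application/x-www-form-urlencoded body, i.e. what you'd map
    // into the data input of pub.client:http with method=post.
    public static String buildFormBody(Map<String, String> fields) {
        StringBuilder sb = new StringBuilder();
        try {
            for (Map.Entry<String, String> e : fields.entrySet()) {
                if (sb.length() > 0) sb.append('&');
                sb.append(URLEncoder.encode(e.getKey(), "UTF-8"))
                  .append('=')
                  .append(URLEncoder.encode(e.getValue(), "UTF-8"));
            }
        } catch (UnsupportedEncodingException ex) {
            throw new RuntimeException(ex); // UTF-8 is always available
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Step 1: the login POST (hypothetical field names)
        Map<String, String> login = new LinkedHashMap<String, String>();
        login.put("j_username", "acctUser");
        login.put("j_password", "p&ss word");
        System.out.println(buildFormBody(login));
        // reserved characters are escaped: & becomes %26, space becomes +

        // Step 2: the report-selection POST (hypothetical field names)
        Map<String, String> report = new LinkedHashMap<String, String>();
        report.put("reportType", "Transaction Report");
        report.put("format", "BAI2");
        report.put("fromDate", "2024-01-01");
        report.put("toDate", "2024-01-31");
        System.out.println(buildFormBody(report));
    }
}
```

On top of this you’d also have to capture the session cookie from the login response’s Set-Cookie header and send it back on every subsequent request, which is exactly the kind of fragile plumbing that makes this doable but nasty.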

Alternatively, you can use an external tool called from webMethods.
I’ve used JMeter (Apache JMeter) for load testing, but it may be able to do what you want. JMeter has a proxy server that can record your browser’s interaction with a website and then auto-generate a “test plan” – a script to do the same thing automatically. One drawback is that its proxy server cannot record HTTPS interactions, though.

If you use JMeter, I’d expect you’d need its “Regular Expression Extractor” post-processor. Also, since it’s Java, you may be able to embed JMeter into IS – there was some talk on its mailing list about embedding JMeter into Tomcat, but I don’t know how far it went.