We have a requirement where one of our partners will send a huge volume of data via AS2 to our routing instance, where we recognize the document and route it to the IS instance.
The IS instance then maps that XML to a Broker document and publishes it. The target integration subscribes to this document and posts it to SAP via RFC.
Now, we have two questions. First, how do we handle such a huge volume in Trading Networks? I have heard of large file handling and did some research, but I'm unable to understand it. Can someone help me understand that?
Second, how do we avoid publishing such a large amount of data on the Broker? I have seen threads on chunking the data and publishing it, but how do SAP or the target systems know that the transaction is complete? (I mean, if we chunk a transaction into 5 publishable Broker documents, how does SAP know that the transaction has been split into 5?)
Please help me in this regard. It's very important. Any help is much appreciated.
You don’t indicate how big “huge” is. That may be helpful.
A common approach is to not publish the data directly but instead put the data into a common spot. Then publish a document that indicates “there’s a document ready over there.” The subscribers would then read the file (use stream techniques and/or node iteration to avoid loading the file completely into memory) and do the necessary work to move the data on.
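For illustration, here is a minimal sketch of the streaming idea in plain Java (not a Flow service; inside IS you would typically use the node-iteration services instead). The file path and the "LineItem" element name are assumptions:

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.FileInputStream;
import java.io.InputStream;

public class LargeXmlStreamExample {
    public static void main(String[] args) throws Exception {
        // Stream the file instead of building a full DOM in memory.
        try (InputStream in = new FileInputStream("/shared/inbound/bigPayload.xml")) {
            XMLStreamReader reader = XMLInputFactory.newInstance().createXMLStreamReader(in);
            int recordCount = 0;
            while (reader.hasNext()) {
                if (reader.next() == XMLStreamConstants.START_ELEMENT
                        && "LineItem".equals(reader.getLocalName())) {
                    // Process one record at a time; memory use stays flat
                    // regardless of the total file size.
                    recordCount++;
                }
            }
            reader.close();
            System.out.println("Processed " + recordCount + " records");
        }
    }
}
```

The key point is that only one record is held in memory at a time, so the subscriber's footprint does not grow with the size of the published file.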
Hi David,
As reamon rightly said, put the data into a common spot and then read and process it from there. To my understanding, you have two options:
1. Store the XML data in DB tables (as a CLOB) and publish a metadata document describing it to the Broker. With the metadata available, your IS service can read the data from the DB and do the necessary processing.
2. Store the XML data as a file in a physical location on your Integration Server and publish a metadata document describing it to the Broker. With the metadata available, your IS service can read the file from that location and do the necessary processing (see the sketch below).
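To make option 2 concrete, here is a rough sketch of an IS Java service body that parks the payload on disk and publishes only a small metadata document. The document type name "MyFolder:LargeFileNotify", the field names, and the staging path are all invented for illustration; Designer normally supplies the imports (com.wm.data.*, com.wm.app.b2b.server.*):

```java
// Sketch only, not production code: no retry, no cleanup of staged files.
IDataCursor pc = pipeline.getCursor();
String payload = IDataUtil.getString(pc, "xmldata");   // the large XML string
pc.destroy();

try {
    // 1. Park the payload in the common spot (file system in this example).
    String fileName = "/shared/staging/" + java.util.UUID.randomUUID() + ".xml";
    try (java.io.Writer w = new java.io.FileWriter(fileName)) {
        w.write(payload);
    }

    // 2. Build a small metadata document -- a pointer, not the data itself.
    IData meta = IDataFactory.create();
    IDataCursor mc = meta.getCursor();
    IDataUtil.put(mc, "filePath", fileName);
    IDataUtil.put(mc, "senderId", "PARTNER_X");                        // example field
    IDataUtil.put(mc, "receivedAt", String.valueOf(System.currentTimeMillis()));
    mc.destroy();

    // 3. Publish only the metadata to the Broker.
    IData input = IDataFactory.create();
    IDataCursor ic = input.getCursor();
    IDataUtil.put(ic, "documentTypeName", "MyFolder:LargeFileNotify"); // example doc type
    IDataUtil.put(ic, "document", meta);
    ic.destroy();
    Service.doInvoke("pub.publish", "publish", input);
} catch (Exception e) {
    throw new ServiceException(e.getMessage());
}
```

The subscribing trigger then receives only the metadata document, reads the file from "filePath" with a streaming technique as shown earlier, and pushes the data on to SAP.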