problem using savePipelineToFile / restorePipelineFromFile

Unable to do explicit mapping from an XML file saved as a result of pub.flow:savePipelineToFile. I tried using pub.flow:restorePipelineFromFile, which removes all the valid IDOC segments from the file, and the resultant xmlData then becomes useless.
Is there a way to get around this, so that I am able to map data explicitly from one IDOC XML segment to another?

(What does pub.flow:restorePipelineToFile do? I couldn’t locate that service in SAP BC 4.7.)

thanks,
Sanjay…

Sanjay,

Go to Advantage and download the PSUtilities package. It has a lot of useful services that could probably help you. One of them is writeBytesToFile (if I remember correctly); it writes bytes out to a file. (Or you can write a Java service to do this yourself, and probably you should, using the PSUtilities services as examples.)

So you can convert your XML documents to bytes (there is a service in the pub folder that does this) and save them to a file. You can read them back in later with getFile from the pub package.
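If you do write the Java service yourself, the core of it is only a few lines of standard Java. Here is a minimal sketch (not the actual PSUtilities code; the class and method names are made up, and the IS pipeline wiring is left out):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class XmlFileUtil {

    // Serialize the XML string to bytes and write it out to a file.
    public static void writeXmlToFile(String xml, String fileName) throws IOException {
        byte[] bytes = xml.getBytes(StandardCharsets.UTF_8);
        Files.write(Paths.get(fileName), bytes);
    }

    // Read the bytes back and turn them into the XML string again,
    // ready to be parsed back into a document with the pub XML services.
    public static String readXmlFromFile(String fileName) throws IOException {
        byte[] bytes = Files.readAllBytes(Paths.get(fileName));
        return new String(bytes, StandardCharsets.UTF_8);
    }
}
```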

Kurt

Thanks Kurt, but I got around this. First, I used the savePipeline service to save the pipeline in memory on the IS, and then used the same name value from that service to save the pipeline to a file with savePipelineToFile.
Thereafter, a scheduled service uses a file-listing service to list the files in the directory and loops over the listing, passing each file name as the input parameter to the restorePipeline service to fetch the xmldata, which comes back exactly in the desired format. I know there is a drawback: the pipeline saved in IS memory is lost on a server restart.
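For reference, the scheduler side of this boils down to roughly the following Java sketch (just a sketch: the directory path is made up, it assumes each saved file is named after the $name used with savePipeline, and inside IS this would normally just be flow steps):

```java
import java.io.File;

import com.wm.app.b2b.server.Service;
import com.wm.data.IData;
import com.wm.data.IDataCursor;
import com.wm.data.IDataFactory;
import com.wm.data.IDataUtil;

public class PipelineReplaySketch {

    // List the saved pipeline files and, for each one, restore the pipeline
    // that was saved in memory under the same name.
    public static void replaySavedPipelines() throws Exception {
        File[] files = new File("/opt/failed_transmissions").listFiles(); // assumed folder
        if (files == null) {
            return;
        }
        for (File f : files) {
            IData input = IDataFactory.create();
            IDataCursor c = input.getCursor();
            IDataUtil.put(c, "$name", f.getName()); // name the pipeline was saved under
            c.destroy();

            // Invoke pub.flow:restorePipeline and pick the xmldata out of the result.
            IData restored = Service.doInvoke("pub.flow", "restorePipeline", input);
            // ... map the xmldata from 'restored' into the FTP retry logic
        }
    }
}
```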
Please let me know if there are other such drawbacks.

regards,
Sanjay…

Am I to understand that you’re using savePipelineToFile/restorePipelineFromFile as a normal mechanism for integration solutions? Normally, these are only used for debugging purposes. There are a variety of issues with using these services for “real” work.

What are you trying to achieve with this “save pipeline to memory, schedule a task, read pipeline from memory” approach? At first look, this seems unnecessary and fragile.

Hi Rob,

I am trying to retransmit (via FTP) an IDOC-based XML file whose FTP transmission has failed after several attempts at any given point in time.
When an FTP transmission fails, the file needs to be saved somewhere in order to retransmit it later. savePipelineToFile raises the issues described above.
Apart from savePipeline to memory, I can't think of anything to overcome this. I know there is a risk associated with it, but I am also using savePipelineToFile to keep track of the failed transmissions. Any better, more viable approach is most welcome.

thanks,
sanjay…

For starters, consider the following options:

  • After some number of retries, store the data in a CLOB or BLOB column in a database table along with data like a timestamp, destination address, and status. Build a scheduled service that checks the table on some interval and attempts to send any files in pending status (see the JDBC sketch after this list).
  • After some number of failed retries, write the data to a flat file in a folder on the file system. Use a file polling port to attempt to resend these files on some schedule.
  • After some number of failed retries, publish the data in a guaranteed document. Use a triggered service to attempt to resend the files on some longer interval.
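For the first option, the database side is just a table with a payload column plus a few bookkeeping columns. Here is a minimal JDBC sketch of the park-and-retry logic (the FAILED_TRANSFERS table, its columns, and the FtpSender interface are all hypothetical names):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;

public class RetryQueue {

    // Stand-in for whatever actually performs the FTP put (e.g. a wrapper around pub.client:ftp).
    public interface FtpSender {
        boolean send(String destination, byte[] payload);
    }

    // Park a failed payload with a timestamp, destination and pending status.
    public static void parkFailedTransfer(Connection con, String destination, byte[] payload)
            throws SQLException {
        String sql = "INSERT INTO FAILED_TRANSFERS (CREATED_AT, DESTINATION, STATUS, PAYLOAD) "
                   + "VALUES (?, ?, 'PENDING', ?)";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setTimestamp(1, new Timestamp(System.currentTimeMillis()));
            ps.setString(2, destination);
            ps.setBytes(3, payload); // BLOB column; use setCharacterStream for a CLOB instead
            ps.executeUpdate();
        }
    }

    // What the scheduled service runs: fetch everything still pending and retry it.
    public static void retryPending(Connection con, FtpSender sender) throws SQLException {
        String sql = "SELECT ID, DESTINATION, PAYLOAD FROM FAILED_TRANSFERS WHERE STATUS = 'PENDING'";
        try (PreparedStatement ps = con.prepareStatement(sql);
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                boolean sent = sender.send(rs.getString("DESTINATION"), rs.getBytes("PAYLOAD"));
                markStatus(con, rs.getLong("ID"), sent ? "SENT" : "PENDING");
            }
        }
    }

    private static void markStatus(Connection con, long id, String status) throws SQLException {
        String sql = "UPDATE FAILED_TRANSFERS SET STATUS = ? WHERE ID = ?";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setString(1, status);
            ps.setLong(2, id);
            ps.executeUpdate();
        }
    }
}
```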

Mark

Trading Networks is another option to consider.

I'm experiencing a different problem. I'm testing a flow service I developed, exposed as a web service, by passing the SOAP request to IS from SoapUI. I get the response back successfully, but nothing gets saved to the pipeline.
SoapUI → SOAP wrapper → main flow
I'm putting the debug steps savePipelineToFile and restorePipelineFromFile in the main flow (the input is the SOAP request document, the output is the SOAP response document).
After I pass the request from SoapUI, when I want to debug my main flow, I cannot retrieve any contents from the pipeline. I gave the exact same name to both the save and restore services.

Any help is appreciated.

I researched for a while and figured out that the SOAP request I was passing from SoapUI was not correct. I tweaked the request and my problem was solved.

Thanks,
Hope.