Can you please help me resolve this issue?

I am trying to write the X12 structure to a file as soon as I populate a record. For example, as soon as I populate the BEG segment, I try to write that segment to the file. But the convertToString service is also writing an ST segment, with its control number, in addition to the BEG segment.
Please let me know if there is any way to avoid this ST segment being written to the file.

I would also like to understand the usage of the startAt parameter in the convertToString service.

Appreciate your help in this regard.
Thanks & Regards,

There are a lot of PDF files available for EDI on Advantage. You should find your answer in those PDFs.

Sorry to say, I did refer to the PDF files on the EDI Advantage site (the WmEDI Built-In Services PDF), but was not able to find an answer to my problem.

Thanks & Regards,

Please elaborate on the problem.
What is the source of your EDI? Are you using Trading Networks (TN) to receive the EDI?
What is the exact requirement? Do you want to write the whole EDI document into one file, or each segment into a different file?

That does not look like a typical requirement. What is the reason for writing each segment to a file?

startAt parameter:

String. Allows the convertToString service to start at a specific record in the flat file schema used to create the output string. Specify the path to the element where you want to start composing the output string.
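As a rough conceptual sketch of what such a "start at" option does (this is not the actual WmEDI implementation; the record names, path syntax, and function names here are hypothetical illustrations only):

```python
# Conceptual sketch of a "startAt" option for a flat-file serializer.
# Segment names and the lookup-by-name logic are illustrative only;
# they are NOT the real WmEDI schema paths or API.

SEGMENT_TERMINATOR = "~"
ELEMENT_SEPARATOR = "*"

def convert_to_string(records, start_at=None):
    """Serialize (name, fields) records to a delimited flat string.

    If start_at is given, skip every record before the first one whose
    name matches, mimicking how a startAt path tells the serializer
    where in the schema to begin composing output.
    """
    out = []
    started = start_at is None
    for name, fields in records:
        if not started:
            if name == start_at:
                started = True
            else:
                continue
        out.append(ELEMENT_SEPARATOR.join([name] + fields) + SEGMENT_TERMINATOR)
    return "".join(out)

transaction = [
    ("ST", ["850", "0001"]),
    ("BEG", ["00", "NE", "PO12345", "", "20240101"]),
    ("N1", ["ST", "ACME CORP"]),
]

# Serializing from the top includes the ST envelope segment;
# starting at BEG omits it.
print(convert_to_string(transaction))
print(convert_to_string(transaction, start_at="BEG"))
```

The point of the sketch is only that a start-at path moves the serializer's starting position within the schema, so everything before that record is skipped in the output.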



Thanks for the reply.

The reason for writing each segment to a file is to reduce the load on the services and improve performance.

Can you give me an example of the value to be given for the startAt parameter? I have explored all the possible options, but could not work out how to use this parameter.

Let's say my EDI structure is in this format:

|-----------BEG (Child of ST)
|-----------N1 (Child of ST)…

Now, to write out the BEG segment, can you tell me what value I have to give for the startAt parameter?

Thanks & Regards,

Hmmm. How large do you expect one transaction set to get? Typically (but not always) transaction sets are not large enough to have to worry about this. It is receiving large interchanges and trying to load those completely into memory that is usually the issue.
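For context on that memory point: an X12 interchange is just delimited text (segments ended by a terminator character such as `~`, elements separated by `*`), so a large file can be processed one segment at a time rather than loaded whole. A minimal sketch, assuming the default `*`/`~` delimiters (real files declare their delimiters in the ISA segment, and the sample data below is made up):

```python
import io

def iter_segments(stream, terminator="~", chunk_size=4096):
    """Yield one X12 segment (as a list of elements) at a time.

    Reads the stream in fixed-size chunks so the whole interchange
    never has to sit in memory at once.
    """
    buffer = ""
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        buffer += chunk
        while terminator in buffer:
            segment, buffer = buffer.split(terminator, 1)
            segment = segment.strip()
            if segment:
                yield segment.split("*")
    if buffer.strip():  # trailing segment without a final terminator
        yield buffer.strip().split("*")

# Tiny made-up interchange fragment for illustration.
sample = "ISA*00*ZZ~GS*PO~ST*850*0001~BEG*00*NE*PO12345~SE*4*0001~"
for elements in iter_segments(io.StringIO(sample)):
    print(elements[0])  # segment ID
```

Iterating this way keeps memory use proportional to one segment, not the whole interchange, which is usually what matters for large inbound files.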