Duplicate records written to a file

Hi

We have an issue with duplicate records being written to a file. The duplication does not happen every time; it occurs once in a while.

The source is DB2, which stores each line of data as a separate record. The webMethods service picks up these records, merges them into a single string, and writes it to a file… Then this flat file is FTPed to the appropriate location. A common Java service, called from the main service, is used to write this file. The Java service seems to be working perfectly well.
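
Roughly, the merge step does something like the following (a simplified plain-Java sketch; the real service works on the webMethods pipeline, and the class and method names here are made up):

    import java.util.List;

    public class RecordMerger {
        // Joins each DB2 record (one logical line) into a single string,
        // separated by newlines, ready to be written to the flat file.
        public static String mergeRecords(List<String> records) {
            StringBuilder sb = new StringBuilder();
            for (String record : records) {
                sb.append(record).append(System.lineSeparator());
            }
            return sb.toString();
        }
    }
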
But we are not sure where the records are getting duplicated.

If anyone has any ideas on this, please reply.

Can you verify that the file is in error before FTPing it? This is a long shot, but it will start limiting the moving parts.

Are the duplicate lines always in a row, or are they ever spread out in the output file?

Are you sure the problem is with the write happening twice, or could it be that a record gets appended to itself before writing?

Are there any errors in the error log?
When is the file opened, closed and flushed?
Is the file writer buffered?
Can these services be called from multiple threads at the same time?
What service are you using to get the rows from DB2?
WmDB? WmJDBC? Your own JDBC code?

How do you loop through the DB rows? What would happen if there is an error during this loop?
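
To make that last question concrete, here is a hypothetical sketch (invented names) of how a retry after a mid-loop error can append the same rows twice:

    import java.io.FileWriter;
    import java.io.IOException;
    import java.util.List;

    public class RetryPitfall {
        // Appends a batch of rows, then runs bookkeeping that may fail.
        // If the caller simply retries the whole method on failure, the
        // rows that were already written get appended a second time.
        static void writeBatch(List<String> rows, String path) throws IOException {
            try (FileWriter out = new FileWriter(path, true)) { // append mode
                for (String row : rows) {
                    out.write(row);
                    out.write('\n');
                }
            }
            postProcess(rows); // a throw here makes a naive retry re-append rows
        }

        static void postProcess(List<String> rows) throws IOException {
            // placeholder for bookkeeping that might fail intermittently
        }
    }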

I think the answer is under one of these rocks.

Cheers,
Fred

Can you check the logic for retrieving the records from the DB? They might be getting duplicated there… or there might be parallel threads of the same service running, so that the records are picked up twice. And if the file naming convention does not include the time down to milliseconds, it is very much possible that the data is getting duplicated.
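
For example (the format strings are only illustrative), a minute-precision name lets two runs land in the same file when the writer appends, while millisecond precision keeps them apart:

    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class FileNames {
        public static void main(String[] args) {
            // Minute precision: two runs in the same minute get the same
            // name, so an append-mode writer stacks both runs in one file.
            SimpleDateFormat coarse = new SimpleDateFormat("yyyyMMddHHmm");
            // Millisecond precision: each run gets its own file.
            SimpleDateFormat fine = new SimpleDateFormat("yyyyMMddHHmmssSSS");

            Date now = new Date();
            System.out.println("coarse: out_" + coarse.format(now) + ".txt");
            System.out.println("fine:   out_" + fine.format(now) + ".txt");
        }
    }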

Also check whether the fromItem variable in appendToDocumentList is dropped, if you are using it at all.
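
The fromItem issue has a simple analogue in plain Java: if the variable holding the current item is not cleared on every pass, a pass that assigns nothing re-appends the previous value. A contrived sketch:

    import java.util.ArrayList;
    import java.util.List;

    public class StaleVariable {
        public static void main(String[] args) {
            List<String> source = new ArrayList<>();
            source.add("row-1");
            source.add(null);     // simulates a pass that produces nothing
            source.add("row-3");

            List<String> output = new ArrayList<>();
            String line = null;
            for (String row : source) {
                if (row != null) {
                    line = row;   // only assigned when the pass "succeeds"
                }
                output.add(line); // BUG: stale "row-1" is appended again
                // FIX: clear it each pass (the analogue of dropping fromItem):
                // line = null;
            }
            System.out.println(output); // prints [row-1, row-1, row-3]
        }
    }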

Regards,
Pradeep

Hello,
Also, where in the file does this duplication happen? If it is always just the last line, or a line entry that would normally be null, then as mentioned earlier, you may not be properly dropping an assigned field in a loop.

If your file service is set to write with appending, then with the previously mentioned coarse-grained file names by timestamp, you may see what looks like duplication. Though I would sooner assume that to be a cross-file overwrite, unless a record set (the number of rows that make up a unique usable unit) can be pulled more than once and the only thing preventing a duplicate is an early abort based on the prior existence of a file name with that timestamp.

One possibility not mentioned: for a file in error, can you rerun the services and not produce the same error? If you get the same error, maybe the record lines are saved wrong (missing some uniqueness info?) or maybe the query does not retrieve a distinct entry set. Good day.

Yemi Bedu

Hello,
Actually, Pradeep did mention the retrieval being in error. Good day.

Yemi Bedu

Hi Fred, Pradeep and Yemi Bedu

Thanks for your replies!

A Java service is used to write the lines into a file, and the options we can give are ‘append’ and ‘overwrite’.

We always have to use the ‘append’ option since the result set selected from DB2 (the source) can be huge. Hence we select 200 lines at a time from DB2 and write the content to the file; the second time it fetches the next 200 lines and appends them to the same file, and so on… till all the lines are done. This is done inside a repeat step.
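
In plain-Java terms, that repeat step is roughly this shape (a sketch with invented names; the real flow uses the JDBC adapter service and the shared Java write service). Written out like this, I can see a failure window: if the loop restarted after a batch was appended but before the offset advanced, that batch would be written twice:

    import java.io.FileWriter;
    import java.io.IOException;
    import java.util.List;

    public class BatchedExport {
        static final int BATCH_SIZE = 200;

        interface RowSource {
            List<String> fetch(int offset, int limit);
        }

        static void export(RowSource db, String path) throws IOException {
            int offset = 0;
            while (true) {
                // fetch the next 200 rows (the JDBC adapter select)
                List<String> rows = db.fetch(offset, BATCH_SIZE);
                if (rows.isEmpty()) {
                    break; // all lines are done
                }
                try (FileWriter out = new FileWriter(path, true)) { // append
                    for (String row : rows) {
                        out.write(row);
                        out.write('\n');
                    }
                }
                // a restart between the append above and this increment
                // would append the same batch a second time
                offset += rows.size();
            }
        }
    }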

The dropping of the variables is taken care of inside the loop where each line to write to the file is formed.

We are using a JDBC adapter service (select SQL) to fetch 200 records at a time from DB2.

We have created a scheduled task which calls this service. I believe scheduled tasks are single-threaded?

The duplicate lines are in a row (each duplicated line appears right after the original).

There are no duplicate records in the DB2 database either.

Also, we are not able to reproduce the duplication in the test environment.

I am a little puzzled as to what could be causing this duplication.

Hi Naveen,

Your assumption that scheduled services are single-threaded holds only when you have configured the scheduler as simple and checked the option “Repeat at the End of invocation”. If your scheduled service is configured as complex, there is an option to control multiple invocations. It is very much possible that your scheduler interval is less than the execution time; in that case, if your scheduler is complex, you will surely have the duplicates problem. Check your scheduler configuration and respond.
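
If overlapping invocations turn out to be possible, one defensive option is a guard at the top of your service so that a second invocation exits instead of appending duplicates. A minimal sketch, assuming everything runs in a single JVM:

    import java.util.concurrent.atomic.AtomicBoolean;

    public class RunGuard {
        private static final AtomicBoolean RUNNING = new AtomicBoolean(false);

        // Runs the export only if no other invocation is in progress;
        // an overlapping invocation returns instead of appending again.
        public static void runExclusively(Runnable export) {
            if (!RUNNING.compareAndSet(false, true)) {
                return; // another invocation is already running
            }
            try {
                export.run();
            } finally {
                RUNNING.set(false);
            }
        }
    }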

Rgds,
Pradeep

Hi Pradeep -

The scheduler is configured to be single-threaded.
It is a repeating task with the option “Repeat at the End of invocation” enabled.
The interval timing is set for half an hour.
The integration doesn’t take more than 5 minutes for one run to complete.

Just want to know: is there any alternative to using the “psutilities:writeToFile” service?
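
For example, would something along these lines (plain Java, buffered, append mode) be a reasonable replacement? This is just a sketch with an invented signature, not a drop-in equivalent of the PSUtilities service:

    import java.io.BufferedWriter;
    import java.io.FileWriter;
    import java.io.IOException;

    public class SimpleFileWriter {
        // Appends 'content' to the file at 'path'; try-with-resources
        // guarantees the writer is flushed and closed on every call.
        public static void append(String path, String content) throws IOException {
            try (BufferedWriter out = new BufferedWriter(new FileWriter(path, true))) {
                out.write(content);
            }
        }
    }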

Thanks,
Naveen