Insert notification on batches of records rather than a single-record insert event

I have a requirement where there will be around 1 million inserts on a table daily. I am looking to implement an insert notification, but I don’t want a notification document for every insert. Can a notification be configured for a batch of records, say 15,000–20,000? Is there a configuration for the insert notification where I get a single publishable document containing those 20,000 records?

If I follow the regular insert notification, where I get a notification document for every insert, there will be a million published documents and a million files on the target side, one per record. I don’t want that; I want to batch those records together.

Any help is appreciated!!

Hi Abhijit,

AFAIK, it is not supported out of the box. Even database triggers work either per row or at the table level, not per a fixed number of records, because trigger execution happens (before or after) per transaction. What you want to perform sounds to me more like a scheduled job.

You can schedule a job (either a database job or the IS scheduler) that keeps track of changes and publishes the result sets in bulk once they meet your criteria.
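The scheduled-job pattern could be sketched roughly like this. Note this is only an illustration of the idea: the table name `buffer_table`, the `processed` flag, the batch size, and the `publish` callback are all assumptions; in webMethods you would implement the equivalent with an adapter service invoked by the IS scheduler, publishing via `pub.publish:publish`.

```python
import sqlite3

BATCH_SIZE = 15_000  # publish threshold (assumption)

def publish_pending(conn, publish, batch_size=BATCH_SIZE):
    """Poll a buffer table and publish unprocessed rows as one bulk document.

    `publish` is a stand-in for whatever actually sends the batch
    (e.g. pub.publish:publish inside an IS service)."""
    cur = conn.execute(
        "SELECT id, payload FROM buffer_table "
        "WHERE processed = 0 ORDER BY id LIMIT ?", (batch_size,))
    rows = cur.fetchall()
    if not rows:
        return 0
    # One published document carrying many records, instead of one per insert.
    publish([payload for _id, payload in rows])
    # Mark the published rows so the next poll skips them.
    conn.executemany("UPDATE buffer_table SET processed = 1 WHERE id = ?",
                     [(row_id,) for row_id, _ in rows])
    conn.commit()
    return len(rows)
```

Each scheduler run then publishes at most one document of up to `batch_size` records, and leaves the remainder for the next run.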

Make sure you tune your messaging system and subscription trigger settings (concurrent threads and batch processing) to support subscribing to such bulk messages in batches.

Let’s wait for opinions from others too, in case anyone has faced such a requirement before.


There is an option on the select tab about this:

“Maximum Row:
Specifies the number of rows to be retrieved from the buffer table. This field is useful when you are working with a large number of records and you want to limit the number of documents sent each time the notification polls. Use a value of 0 to indicate no limit on the number of rows retrieved.”

Would this work for you?

Season’s Greetings

Thanks Gerardo,
Yes, I was thinking the same: I would set the Maximum Row count to 15,000 and the polling notification interval to 1 hour, so that it sends me at most 15,000 rows per poll. Since the source process will keep dumping data into the SQL DB, I guess 1 hour is a safe enough buffer to pull 15,000 records per hour.
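It may be worth sanity-checking the rates before settling on a 1-hour interval. Assuming the million inserts are spread evenly over the day (an assumption; real traffic may be bursty), a quick back-of-envelope check:

```python
daily_inserts = 1_000_000
max_rows_per_poll = 15_000

inserts_per_hour = daily_inserts / 24                     # ~41,667 rows/hour
polls_needed_per_hour = inserts_per_hour / max_rows_per_poll  # ~2.8 polls/hour

print(round(inserts_per_hour))          # 41667
print(round(polls_needed_per_hour, 1))  # 2.8
```

At a steady 1M/day, pulling at most 15,000 rows once per hour would fall behind; roughly three polls per hour (about every 20 minutes) would be needed to keep pace.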

Thanks for your solution too. I proposed a pub-sub to my architect, and since we have 4 servers, I thought we could consume the records in the subscribing service with concurrent mode set to 4 threads, so 16 threads processing 15,000 records each. But he is apprehensive about too many threads running on a server and the server crashing.


Please measure your outcomes: does the processing time scale linearly? Or does halving the batch size more than halve the processing time?

You also have the option of splitting the processing further.

Supposing the I/O or CPU cost of processing each record is relatively high, you can split that 15k batch into smaller batches and publish them back into UM/Broker.
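The splitting step itself is just chunking. A minimal sketch (the sub-batch size of 500 is an arbitrary assumption, and `publish_to_um` is a hypothetical stand-in for the actual republish call):

```python
def split_batch(records, chunk_size=500):
    """Yield fixed-size sub-batches from one large batch."""
    for start in range(0, len(records), chunk_size):
        yield records[start:start + chunk_size]

# Each chunk would then be republished for any subscriber to pick up, e.g.:
# for chunk in split_batch(big_batch):
#     publish_to_um(chunk)   # hypothetical publish call
```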

This way you could have the processing service installed on all your servers, all subscribing to those small batches and distributing the load automatically (take care to share the queue by using the same client prefix on that specific connection).

I don’t think notifications work correctly when run concurrently (that is, with the notification enabled on several servers), even in a clustered environment; you will have to check and test this.

Good Luck.

Season’s Greetings,