Implementing Singleton for regex searches.

I have a requirement to do a regex search and execute this:
Pattern pattern = Pattern.compile(regex);
as part of the process.
I want to leverage the singleton pattern to compile each regex just once, the first time I come across it, rather than recompiling on every run.
Is there a known way of implementing this in webMethods, like a singleton in Java?

Appreciate all your help !!

You could use a HashMap keyed by the regex string to hold the pattern object.
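A minimal sketch of that idea in plain Java (class and method names here are illustrative, not webMethods built-ins; in a webMethods Java service this would typically live as a static field in the service's shared code area):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.regex.Pattern;

// Process-wide cache of compiled patterns, keyed by the regex string.
// ConcurrentHashMap makes the cache safe if multiple service threads hit it.
public final class PatternCache {

    private static final Map<String, Pattern> CACHE = new ConcurrentHashMap<>();

    private PatternCache() {
        // no instances; the cache is static, singleton-style
    }

    // Compiles on first use; subsequent calls for the same regex string
    // return the already-compiled Pattern.
    public static Pattern getPattern(String regex) {
        return CACHE.computeIfAbsent(regex, Pattern::compile);
    }
}
```

Usage would look like `PatternCache.getPattern("\\d+").matcher(input).find()`. Note that `Pattern` is immutable and thread-safe, so sharing cached instances across threads is fine; `Matcher` is not thread-safe and should be created per use.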

You’ll want to balance maintaining the map of patterns against not holding on to patterns beyond their useful life, and weigh that overhead against the overhead of compiling each time. Does either approach provide a meaningful difference in performance?

Is the compilation of the pattern a significant performance factor for your integrations? In my own integrations I hold onto the pattern for the life of the integration instance (use it multiple times in a single run) and then just let it go.

Thanks Reamon,
I think pre-compiling is going to save me some CPU time at run time, though I don’t know how much exactly. Would HashMaps survive server restarts? I want to reuse those precompiled patterns over many runs. How can I do that?

Thanks!!

A HashMap would not survive a JVM restart.

I agree that compiling once and holding onto the compiled pattern would potentially save some time. I would strongly suggest determining how much time it would save, if any, before doing this optimization. Additionally, you should determine whether the amount of time saved matters. Even a savings of, say, 10 seconds per integration (a very generous estimate) may not be useful. Shaving seconds off the time it takes a single integration to run is usually unimportant.
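One way to get that number is a rough micro-measurement of `Pattern.compile` for a representative regex. This is a hedged sketch (class and method names are illustrative, and the regex is a stand-in; substitute one of your actual patterns):

```java
import java.util.regex.Pattern;

// Rough micro-measurement of regex compilation cost.
public final class CompileCost {

    // Returns the average Pattern.compile time in microseconds
    // over the given number of iterations.
    public static double averageCompileMicros(String regex, int iterations) {
        long start = System.nanoTime();
        for (int i = 0; i < iterations; i++) {
            Pattern.compile(regex);
        }
        long elapsed = System.nanoTime() - start;
        return elapsed / (iterations * 1000.0);
    }

    public static void main(String[] args) {
        // Stand-in regex; use one of your real patterns for a meaningful number.
        double avg = averageCompileMicros("\\b[A-Z]{2}\\d{4}\\b", 10_000);
        System.out.printf("avg compile time: %.2f microseconds%n", avg);
    }
}
```

Multiplying the per-compile average by the number of compilations per run gives an upper bound on what caching could save, which can then be weighed against the lookup and maintenance overhead.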

How many patterns would there be? How many times would a pattern be reused? Would the cost of the hashmap lookup (or any type of lookup) offset the time saved by not compiling?

Admittedly I have very little information about your integration(s) but my initial view is that this is a suspect optimization. I’d avoid doing it until it’s determined that the integration is not fast enough.

Thanks indeed, Reamon, for the very constructive feedback.
I will try to get some actual numbers around this integration and weigh out the two options.

Just for the record, there are about 50-odd patterns, and the integration would need to run several times a day (10 to 20) as we scale up. Patterns may be added (or deleted) as we identify new ones in the integrations.

It remains to be seen how much the cost of the HashMap lookup offsets the time saved by not compiling.

Will keep you posted.

Thanks again!!