frequently accessed data

Hi all,
We are planning an integration and need suggestions on best-practice norms with respect to performance.

We have real-time data coming from SAP into our OMS system (busy traffic of around 20,000 transactions per day). To send data to our OMS system we need a parameter, a port number, to connect to it. This port number is determined based on the incoming data; basically, a lookup needs to be done to determine the port number from the incoming material number.
The lookup works like this: if the incoming material number falls within one of a set of previously determined ranges, a specific port number is assigned.
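The range-to-port lookup described above can be sketched like this (the ranges and port numbers below are purely illustrative, not from the actual system):

```java
import java.util.ArrayList;
import java.util.List;

public class PortLookup {
    // One entry per material-number range; values here are hypothetical
    static class Range {
        final long low, high;
        final int port;
        Range(long low, long high, int port) { this.low = low; this.high = high; this.port = port; }
    }

    private final List<Range> ranges = new ArrayList<>();

    void addRange(long low, long high, int port) { ranges.add(new Range(low, high, port)); }

    // Returns the port for the first range containing the material number, or -1 if none matches
    int portFor(long materialNumber) {
        for (Range r : ranges) {
            if (materialNumber >= r.low && materialNumber <= r.high) return r.port;
        }
        return -1;
    }

    public static void main(String[] args) {
        PortLookup lookup = new PortLookup();
        lookup.addRange(1000, 1999, 5001);        // hypothetical range -> port mapping
        lookup.addRange(2000, 2999, 5002);
        System.out.println(lookup.portFor(1500)); // 5001
        System.out.println(lookup.portFor(9999)); // -1 (no range matched)
    }
}
```

A linear scan like this is cheap for a small number of ranges; with many ranges, sorting them and using a binary search on the lower bounds would be the obvious refinement.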

We thought of two options for the port-number lookup:
1. Put all the material-number ranges and port numbers in an XML file. At run time, we read the file and determine the port number based on the incoming material number. However, this file read has to be done for every call from SAP.

2. Put the lookup logic in the service itself, with hardcoded material-number ranges and ports, so there is no need to read a file.
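For option 1, the XML file could look something like the format sketched in the comment below (the file layout and element names are assumptions, since the post does not specify them). Parsing it into a list of ranges is a few lines of standard DOM code:

```java
// Assumed ranges.xml format (hypothetical):
//   <ranges>
//     <range low="1000" high="1999" port="5001"/>
//     <range low="2000" high="2999" port="5002"/>
//   </ranges>
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class RangeConfigParser {
    public static class PortRange {
        public final long low, high;
        public final int port;
        PortRange(long low, long high, int port) { this.low = low; this.high = high; this.port = port; }
    }

    // Parses XML of the assumed <ranges><range low=".." high=".." port=".."/></ranges> shape
    public static List<PortRange> parse(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        NodeList nodes = doc.getElementsByTagName("range");
        List<PortRange> ranges = new ArrayList<>();
        for (int i = 0; i < nodes.getLength(); i++) {
            Element e = (Element) nodes.item(i);
            ranges.add(new PortRange(
                    Long.parseLong(e.getAttribute("low")),
                    Long.parseLong(e.getAttribute("high")),
                    Integer.parseInt(e.getAttribute("port"))));
        }
        return ranges;
    }
}
```

The point of the sketch is that the expensive steps (I/O and DOM construction) happen in one place, which makes them easy to cache, as discussed below.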

I am assuming the first option is going to be an overhead, as the file has to be read 20,000 times daily (one I/O per call). Even if we do caching, the post-read processing of converting the file data into a document and comparing the material number against the ranges is going to be an overhead again.
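The read-per-call overhead disappears if the parsed result, not the raw file, is what gets cached. A minimal sketch (hypothetical class, independent of webMethods' own service caching): hold the parsed data in memory and re-read only when the file's modification time changes, so both the I/O and the parsing happen rarely instead of 20,000 times a day.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class CachedRangeConfig {
    private final Path file;
    private volatile List<String> cachedLines;  // stand-in for the fully parsed range table
    private volatile long cachedModTime = -1;

    public CachedRangeConfig(Path file) { this.file = file; }

    public synchronized List<String> get() throws IOException {
        long modTime = Files.getLastModifiedTime(file).toMillis();
        if (cachedLines == null || modTime != cachedModTime) {
            cachedLines = Files.readAllLines(file);  // the only I/O; happens only on change
            cachedModTime = modTime;
        }
        return cachedLines;  // every other call returns the in-memory copy
    }
}
```

Note the check still costs one `stat` per call; dropping the modification-time check entirely (cache forever, restart to reload) is even cheaper if the ranges rarely change.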

The option of putting it in a database is ruled out.

Please respond with whether option 1 or option 2 is better.
Thanks for reading through.


How about the best of both worlds?
Store the information in a config file and create a service which reads it, but use service caching to cut down the number of reads… Be warned that you need to do certain things to avoid problems with caching overwriting other values in the pipeline…

You might want to think about at what point you do the caching, but to cut down the reads you could cache fairly close to the I/O action itself, which is the file read.
To do this, create your “retrieve config information” service that:

  • reads the file
  • turns it into whatever form you need (a String, structure, bytes, hashtable, etc.)
  • calls clearPipeline with only the service outputs in the “exclude” list

Then tweak the service cache settings.
The reason you need clearPipeline rather than just doing proper pipeline cleanup is that service caching is essentially the same as a saved pipeline that is stored against the inputs to the service.
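That interaction can be illustrated with a plain memoization sketch (not the webMethods API; all names here are hypothetical): the cache stores the entire result map keyed by the service inputs, so any stray value left lying around when the result is saved gets replayed into every later pipeline that hits the cache, unless everything except the declared outputs is cleared first.

```java
import java.util.HashMap;
import java.util.Map;

public class ServiceCacheSketch {
    // Cache keyed by the service inputs; the whole output map is stored, like a saved pipeline
    private final Map<String, Map<String, Object>> cache = new HashMap<>();

    // Simulates a cached service: on a cache hit, the stored map is merged into the caller's pipeline
    public Map<String, Object> invoke(String input, Map<String, Object> pipeline) {
        Map<String, Object> result = cache.get(input);
        if (result == null) {
            result = new HashMap<>();
            result.put("port", 5001);          // the declared output
            result.put("scratch", "leftover"); // a value that should have been cleared before caching
            cache.put(input, result);
        }
        pipeline.putAll(result); // a cache hit replays everything that was stored, "scratch" included
        return pipeline;
    }
}
```

In the sketch, every later caller with the same input gets `scratch` dumped into its pipeline — which is exactly the overwriting problem that calling clearPipeline with only the service outputs in the exclude list avoids.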

Nathan Lee

You mentioned 20,000 transactions, but how many ports/material numbers are there?

How would the material number give an indication of which port to use? Is it a range, a combination of numbers, or is it random?

How about storing the data in a register? We do caching this way in our project.