Extremely poor performance with .NET API

Hello everyone,

I hope someone can shed some light on this.

We are running an IIS web server (on a physical machine with dual P4 2.4 GHz CPUs with HT and 2 GB RAM) that talks to Tamino (on another physical machine with a single P4 2.4 GHz CPU with HT disabled and 2 GB RAM).

A web page will call a web service on the IIS that in turn will send off a query to Tamino via Software AG .NET API.

With a simple XQuery in the web service, if the web server is stress-load tested with just a few simultaneous connections (2-5), Tamino's inosrv.exe will consume 100% CPU time and never come down. Meanwhile, the web server machine is idling at a paltry 3% of CPU resources.

I have talked to tech support, but they didn't know what to do about it; they suggested Hyper-Threading might be the issue, so I disabled it, yet performance did not improve.

So I made my own tests. When I stress-load tested Tamino with a URL like so:

the results were very good; I could run 100 simultaneous connections and the CPU would stay at 3-10%.

However, if I ran the same query via a web service (see code below), the machine ground to a halt at a paltry 2-10 simultaneous connections.

[WebMethod]
public System.Xml.XmlDocument GetQueryResult(string strXQuery)
{
	SoftwareAG.Tamino.Api.TaminoConnection oCon = new SoftwareAG.Tamino.Api.TaminoConnection("http://www.hahaha.com:3202/tamino/hahaha");
	oCon.Open(SoftwareAG.Tamino.Api.TaminoConnectionMode.AutoCommit);
	SoftwareAG.Tamino.Api.TaminoCommand oCom = oCon.CreateCommand("main");
	oCom.LockMode = SoftwareAG.Tamino.Api.TaminoLockMode.Shared;
	SoftwareAG.Tamino.Api.TaminoQuery oQ = new SoftwareAG.Tamino.Api.TaminoQuery(strXQuery);
	SoftwareAG.Tamino.Api.TaminoQueryResponse oQr = oCom.Query(oQ, 1);

	System.Xml.XmlDocument xmlDoc = oQr.GetSinglePage();
	oCon.Close();
	return xmlDoc;
}

where the XQuery passed into that web service is as follows:

strXQuery = @"for $i in input()/generic-show
	where $i/@id = 'nasty2004' and $i/@active=1
	for $venues in $i/venues
		where $venues/@active=1
		for $venue in $venues/venue
			let $v := input()/venue[@id=$venue/@id]
			where $venue/@active=1
			return ($v)";

Note: I have tried changing the parameters for the connection.Open method, playing with the LockMode, LockWait, IsolationLevel, etc., but no combination seemed to help at all.

I now have some additional questions:

1) It may very well be that it's the opening and closing of the connection per request that slows things down. Personally I think that if that is so, it's extremely inefficient (remember that even 2 simultaneous requests will bring the server to a halt).
I was wondering if putting the connection into ASP.NET's Cache would improve things. The first request opens the connection and puts it into the Cache, and subsequent requests use the already-opened connection.
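To make the idea concrete, here is a minimal sketch of what I have in mind (hypothetical helper, untested; the class and member names are my own, and it serializes access with a lock on the assumption that a TaminoConnection is not safe for concurrent use):

```csharp
// Hypothetical sketch: keep one TaminoConnection in HttpRuntime.Cache
// and hand it out to requests, serializing access with a lock.
using System.Web;
using SoftwareAG.Tamino.Api;

public static class TaminoConnectionCache
{
    private static readonly object _sync = new object();
    private const string CacheKey = "TaminoSharedConnection";

    // Callers must hold SyncRoot while using the returned connection,
    // since only one request may use it at a time.
    public static object SyncRoot
    {
        get { return _sync; }
    }

    public static TaminoConnection GetSharedConnection()
    {
        lock (_sync)
        {
            TaminoConnection con = HttpRuntime.Cache[CacheKey] as TaminoConnection;
            if (con == null)
            {
                // First request: open the connection and cache it.
                con = new TaminoConnection("http://www.hahaha.com:3202/tamino/hahaha");
                // ...open with the same mode/parameters as the per-request code...
                HttpRuntime.Cache.Insert(CacheKey, con);
            }
            return con;
        }
    }
}
```

Whether a cached connection survives being idle (or being dropped by Tamino) is exactly what I am unsure about.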

2) If I were to put the connection into the cache - I don't like the solution much, but for now it may be a workaround - there could be other problems involved. How would I know if the connection eventually got disconnected by Tamino? The connection object would still be alive, but the underlying connection to Tamino might be closed. Also, how safe is it to use the same connection for many requests? Obviously, in some scenarios I would need to open the connection more often, for example when using transactions.

3) Maybe I could put the connection object into the Session. But even then, with many unique visitors, performance would still be very bad.

4) Can someone tell me where to find more info on the different values for each parameter of the connection.Open method? I mean, I haven't got a clue what the difference is between, for example, TaminoIsolationLevel.StableDocument and TaminoIsolationLevel.CommittedCommand, or TaminoLockMode.Shared and TaminoLockMode.Unprotected, to name but a few.
I read all the docs for the .NET API and couldn't find this info anywhere!

Thank you very much for helping me with this issue.

Peter Endisch

I’m a Zen Garden Maintenance Engineer

It seems odd that Tamino locks up with multiple connections. They do have stress tests on 4/8-way CPU boxes.

I’ll attempt to answer your questions.

1. I doubt that the opening/closing of connections is the problem. If it were, one would expect the performance issue to show up in the program making the connections.

2. Caching the connection may be OK. However, it is the user's responsibility to ensure that multiple requests are not issued on the connection simultaneously, i.e. the user has to perform their own locking.

3. I’m not really sure what you are saying here.

4. The lock values should be documented in the Tamino Server documentation. In the Tamino 4.2 documentation on the main page there is a Transactions Guide that explains the lock values.

The following may possibly be related:

Are you able to run multiple instances of a console application that implements your GetQueryResult method? E.g. do a number of loops with a short timeout (~100 ms) between each call.

[This message was edited by Mark Kuschnir on 07 June 2004 at 12:09.]

Hello again,

Thank you for your comments Mark.

I have tried to make a console app with my code - a good suggestion - and got some interesting results.

I basically took my code and put it into a loop. I experimented with the loop's iteration count and the sleep time between iterations (I pass both as args to the console app).

I could crank the iterations up quite high and lower the sleep time all the way to 0 ms and still get awesome performance out of Tamino!
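For reference, the console harness is essentially this (a sketch reconstructed from memory; the simplified query and the open-mode call are stand-ins for what my real code does):

```csharp
// Sketch of the console test harness: run the web-service code in a loop,
// with iteration count and sleep time (ms) taken from the command line.
using System;
using System.Threading;
using SoftwareAG.Tamino.Api;

class LoopTest
{
    static void Main(string[] args)
    {
        int iterations = int.Parse(args[0]);   // e.g. 100
        int sleepMs = int.Parse(args[1]);      // e.g. 0

        for (int i = 0; i < iterations; i++)
        {
            // Open/close per iteration, exactly like the web service does.
            TaminoConnection oCon = new TaminoConnection("http://www.hahaha.com:3202/tamino/hahaha");
            oCon.Open(TaminoConnectionMode.AutoCommit);
            TaminoCommand oCom = oCon.CreateCommand("main");
            oCom.LockMode = TaminoLockMode.Shared;

            // Simplified stand-in query.
            TaminoQuery oQ = new TaminoQuery("for $i in input()/generic-show return $i/@id");
            TaminoQueryResponse oQr = oCom.Query(oQ, 1);
            Console.WriteLine("iteration {0}: {1} chars", i, oQr.GetSinglePage().OuterXml.Length);

            oCon.Close();
            Thread.Sleep(sleepMs);
        }
    }
}
```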

Next I took the same code and put it once again into a web service. I ran my tests again and once again, flawless performance!

However, if I remove the loop and have the web service called many times (such as with a stress tool), I get extremely poor performance (my original problem) with a stress level as low as 5 threads.

I have tried the suggestion with System.Net.ServicePointManager.DefaultConnectionLimit. I set it to 50, or 100 - to no avail. I put that line in the web service method (probably a bad place) as well as in Application_Start in Global.asax. It didn't help.
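For reference, this is the kind of application-level placement I mean (a Global.asax.cs sketch; whether 100 is a sensible value, and whether this is the right place, is exactly my question):

```csharp
// Global.asax.cs - setting the outgoing HTTP connection limit once,
// at application start, instead of per request.
using System;
using System.Net;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // Allow up to 100 concurrent outgoing HTTP connections per host,
        // instead of the .NET 1.x default of 2.
        ServicePointManager.DefaultConnectionLimit = 100;
    }
}
```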

I don't know much about ServicePoint at all. I went and read up on it a bit. It seems useful if your code itself connects to other sites and fetches their responses. I am not doing that, however. My web service is the page that my other apps call (such as Flash, other ASP.NET pages, etc.).

The web service then calls (in this line: SoftwareAG.Tamino.Api.TaminoConnection oCon = new SoftwareAG.Tamino.Api.TaminoConnection("http://www.hahaha.com:3202/tamino/hahaha")) another website sitting on the same IIS on the same machine, which then connects me to Tamino's XTS service via the ISAPI filter. That's what's going on.

(To recap: two physical machines; one runs IIS with two websites. One website handles all the ASP.NET pages and web services; the other handles solely connections to Tamino and listens only on port 3202. That second website has the ISAPI DLL that makes the connection to Tamino happen via XTS. The other physical machine has Tamino installed on it and nothing else.)

So I am stumped. Tamino grinds to a halt only if I call my web service many times simultaneously. I don't know whether that is due to System.Net.ServicePointManager.DefaultConnectionLimit being too low, and if so, whether I implemented it correctly.

There must be other people who have used web services to connect to Tamino? Web services are extremely appealing because they allow me to serve data to other, non-ASP.NET applications, such as Flash apps, MS InfoPath, etc.

At the moment I don't need some complicated OO approach to retrieve data from Tamino. A simple web service with a query would suffice.

Any further ideas? I am really high and dry with this problem.

Thank you!

Peter Endisch

I’m a Zen Garden Maintenance Engineer

Did you run multiple simultaneous copies of the console app to simulate multi threaded access?

Another stab in the dark is this snippet that I came across http://tamino.forums.softwareag.com/viewtopic.php?p=10459. This might explain high CPU utilization.

Hello Mark again,

thank you for pursuing this issue with me.

1) I agree that I should run multiple simultaneous copies of the console app instead of running a loop that connects to Tamino once per iteration. With the loop, everything runs sequentially within one thread; if I run several copies of the app simultaneously, it's many threads at once. And that is the real test to make. If I can run the console app simultaneously, I will be able to see whether the problem lies in Software AG's .NET API or in the ASP.NET platform.
This test could shed more light on the problem and pinpoint it further. However, I am not too knowledgeable about how to test console apps; there must be some tool or profiler? I have VS 2003 Professional; I don't know if any tools come with that package or if I can get a free tool, such as the one I got for stress testing web apps (the free MS stress tool).

2) Would you know how System.Net.ServicePointManager.DefaultConnectionLimit is supposed to be implemented? Do I just set it in Application_Start like I did, and assume it will kick in because some sort of "default" service point is used by ASP.NET any time I use Software AG's connection object (Software AG's seems to be derived from ADO.NET)?

3) Did you see anything wrong with my setup that could cause this performance problem? Am I stress testing this correctly? I basically hit an .asmx (web service) (or .aspx) page simultaneously on my local dev IIS server, which then, using Software AG's .NET API, connects to my live IIS server (again, there are two IIS websites on that physical machine: one for all ASP.NET pages on port 80, and one for Tamino connections on port 3202), which then redirects the call via the ISAPI DLL and XTS to the Tamino DB sitting on another physical machine.

4) The article you pointed me to does not, IMHO, have much to do with my problem. If it did (document size or query complexity), then querying Tamino directly via URL would show the same bad performance. However, as I mentioned in the first post, I ran a stress tool on the URL with the XQuery directly and it ran like a champ.
Also, as the article suggests, I did increase the cache for the XQuery, even though two things are odd: 1) the person is doing it via the registry and not SMH, and 2) the entry "XQuery_document_cache_size" doesn't exist; it's rather "dynamic pool size". When I increased it from the original 512 KB to 30 MB, it consumed 8 MB right off the bat. However, it didn't affect my performance much at all. That's just an FYI.

If anyone could help me with these issues, answer my questions, and/or spot a flaw in my logic, I would be really grateful. As it is, I basically cannot query Tamino from my development platform, .NET, at all. And I can't pass large XQueries via URL; they will get truncated since there is a limit on the size of URL parameters, and frankly, this also destroys my business logic (not to mention it's a pain in the butt converting an XQuery to a URL-friendly format!)

Thank you

Peter Endisch

I’m a Zen Garden Maintenance Engineer

1. Running multiple single-threaded apps is the same as doing multi-threaded access - only easier. This was to see if simultaneous multi-threaded access was the problem.

2. You should only need to set this value at the application level, and it should be used by subsequent HTTP requests.

3. Nothing obvious.

4. I was just hoping.

This is another stab in the dark, but what happens if the aspx page talks directly to Tamino? I.e. not going through the intermediate IIS.


I got your emails and replied. As per your reply to this newsgroup:

1) For now I will try, as you suggested, opening several command prompt windows and running my console app (the one with the loop) a few times at once to see what happens.
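Alternatively, a small launcher could start several copies at once (a hypothetical sketch; "LoopTest.exe" and its arguments stand in for whatever the console app and its iteration/sleep parameters actually are):

```csharp
// Hypothetical launcher: start N copies of the console test app at once
// to simulate simultaneous multi-threaded access to Tamino.
using System.Diagnostics;

class Launcher
{
    static void Main()
    {
        const int copies = 10;
        Process[] procs = new Process[copies];

        // Fire off all copies without waiting between starts.
        for (int i = 0; i < copies; i++)
        {
            procs[i] = Process.Start("LoopTest.exe", "100 0");
        }

        // Wait for every copy to finish before exiting.
        foreach (Process p in procs)
        {
            p.WaitForExit();
        }
    }
}
```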

4) That is a good point. I will try something like this in an aspx page:

strURI= @"http://www.hahaha.com:3202/tamino/hahaha/main?_XQuery=for+$i+in+input()/generic-show+where+$i/@id+=+'nasty2004'+and+$i/@active=1+return+<show>+{$i/@id}+{+for+$venues+in+$i/venues+where+$venues/@active=1+return+<venues>+{+for+$venue+in+$venues/venue+let+$v+:=+input()/venue[@id=$venue/@id]+where+$venue/@active=1+return+($v)+}+</venues>+}+</show>";

System.Net.WebRequest oReq = System.Net.WebRequest.Create(strURI);
System.Net.WebResponse oResp = oReq.GetResponse();
System.IO.Stream oS = oResp.GetResponseStream();
System.Xml.XmlDocument oDoc = new System.Xml.XmlDocument();
oDoc.Load(oS);
oResp.Close();

And then the oDoc would be loaded into a transform together with an XSLT stylesheet. But admittedly, for the purpose of testing, I won't even transform the doc; the important thing here is just to query the DB simultaneously. I doubt that post-processing in .NET has any effect whatsoever on Tamino.
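(For completeness, the transform step would look roughly like this - a sketch with a hypothetical stylesheet path, using the .NET 1.1 XslTransform class:)

```csharp
// Sketch of the eventual transform step; "venues.xsl" is a placeholder.
using System.IO;
using System.Xml;
using System.Xml.Xsl;

class TransformSketch
{
    static string Run(XmlDocument oDoc)
    {
        XslTransform oXslt = new XslTransform();
        oXslt.Load("venues.xsl");   // hypothetical stylesheet path

        StringWriter oOut = new StringWriter();
        oXslt.Transform(oDoc, null, oOut, null);
        return oOut.ToString();     // the transformed output
    }
}
```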

Will let you know how my testing went today. But you will probably read my answer only tomorrow morning, due to our 5- or 6-hour time difference (Montreal vs. the UK).

Thank you once again for all your help and talk to you tomorrow

Peter Endisch

I’m a Zen Garden Maintenance Engineer