Our webMethods 10.3 Development environment, which runs on Red Hat Linux 7.4, is already using 93-94% of its total memory, and our developers have complained that the Development machine is slow.
In IS Admin, Maximum Memory is 3920384 KB, and Committed is 3920384 KB.
We tried editing "custom_wrapper.conf" as follows:
Before:
wrapper.java.initmemory=1536
wrapper.java.maxmemory=4096
After:
wrapper.java.initmemory=4096
wrapper.java.maxmemory=4096
But after this change, Integration Server stopped in the middle of its initialization process when we tried to reach IS Admin.
Our machine has 64 GB of memory, so we are wondering whether we can increase the heap further, to let's say 8 GB?
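For clarity, the change we have in mind would simply raise the maximum again, along these lines (8192 MB = 8 GB; the values below are only an illustration of what we are asking about, not something we have applied yet):
wrapper.java.initmemory=1536
wrapper.java.maxmemory=8192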
Appreciate any advice on this.
Thank you,
Fanny T
Sure, you can increase it. But if you think the memory usage is unusual, you may want to figure out what's taking up all your memory before increasing it. It could be a memory leak or poor memory usage in your application, and if that's the case, increasing the heap size will just delay the issue; it will eventually surface again. Using the jmap command to take a heap dump and then analyzing that heap dump with MAT (the Eclipse Memory Analyzer open source project) would help shed some light.
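As a rough sketch of what that looks like on a Linux box (the process ID and output path below are placeholders, not values from your environment):
# find the Integration Server JVM process ID
jps -l
# write a binary heap dump of live objects (note: this triggers a full GC and pauses the JVM briefly)
jmap -dump:live,format=b,file=/tmp/is_heap.hprof <IS_PID>
The resulting .hprof file is what you would then open in MAT.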
How many packages are you hosting in this single environment? If it's consuming a lot of memory after startup, then it's more likely to be code, unless you have a startup service that is trying to cache a large amount of data.
Generally we recommend that each developer have their own development environment, i.e. Service Designer with a local IS. Developers then collaborate using version control.
regards,
John.
Thanks for the answer. I managed to get additional memory by modifying custom_wrapper.conf as below:
wrapper.java.maxmemory=5120
I found that MAT requires Java 11, while our server is still running on Java 1.8. I'm still figuring out how to track down the memory leak or poor memory usage on our DEV server.
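One option (assuming the dump itself is taken on the server, which works fine with the Java 8 jmap) is to copy the .hprof file to a workstation that has Java 11 and MAT installed; MAT reads the HPROF format regardless of which JVM produced it. The host name and paths below are placeholders:
# on the DEV server (Java 1.8)
jmap -dump:live,format=b,file=/tmp/is_heap.hprof <IS_PID>
# pull the dump to a workstation with Java 11 + MAT
scp user@dev-server:/tmp/is_heap.hprof .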
Hi John, in our DEV environment we have basically added 16 packages on top of the standard webMethods packages, plus EDI. They don't look like they consume much memory, but I need to check for the memory leak first, as Percio suggested.
Anyway, I will keep digging and post an update here if I find anything.
Hi @Fanny_Sari1 ,
Looking at GC logs manually is no fun unless you really know how to read them.
It's better to let a tool analyze them. The easiest ones are free and available online, for example https://gceasy.io/.
The site will visualize your data and also let you know if there are any (potential) issues in it.
And of course, if needed, you can also open a support case with SAG and submit your GC.LOG files so the support team can take a look.
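If GC logging is not enabled yet, it can usually be switched on through additional JVM arguments in custom_wrapper.conf. The property indexes and log path below are only placeholders (pick indexes not already used in your wrapper files); these are the classic Java 8 flags:
wrapper.java.additional.310=-Xloggc:/path/to/IntegrationServer/logs/gc.log
wrapper.java.additional.311=-XX:+PrintGCDetails
wrapper.java.additional.312=-XX:+PrintGCDateStamps
wrapper.java.additional.313=-XX:+UseGCLogFileRotation
wrapper.java.additional.314=-XX:NumberOfGCLogFiles=5
wrapper.java.additional.315=-XX:GCLogFileSize=20M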
Let me also comment on your initial question. In times when memory has become large and cheap, it is a common and seemingly valid idea to give your systems more memory than they really need right now, "just in case."
To say it directly: this is a bad idea. Why? The way your JDK works is to use as much of the memory you grant it as possible, so that it can keep as many classes and objects in memory as it can (much faster) and not have to reload them from class/jar files on disk. This also requires that you fine-tune your JVM memory pools manually; the default setup (hard ratios) becomes counterproductive from a certain point on and makes you block/waste memory, for example for static classes that you will never have. So manual tuning is absolutely necessary, and with the GC.LOG analysis you are on the right path to do so.
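To give an idea of what such manual tuning can look like (the values below are purely illustrative, not a recommendation for this environment), the generation sizes and metaspace can be capped explicitly via additional JVM arguments in custom_wrapper.conf, again with placeholder property indexes:
# old generation twice the size of the young generation
wrapper.java.additional.320=-XX:NewRatio=2
# eden eight times the size of each survivor space
wrapper.java.additional.321=-XX:SurvivorRatio=8
# cap metaspace so class metadata cannot grow unbounded
wrapper.java.additional.322=-XX:MaxMetaspaceSize=512m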
This behavior has one other major side effect, which shows up in the worst case of a low-memory situation, when a full GC kicks in.
If you were not frugal and assigned more memory than you actually need, the JVM then has to clean up a very large pile of objects, which makes the stop-the-world pause of the full GC unnecessarily long. I have seen systems where this ended up taking several minutes, during which the system is completely unresponsive, sessions expire or time out, and in the worst case an external watchdog like the Tanuki wrapper considers the system stalled and uses a KILL -9 to restart it.
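As a side note on that watchdog behavior: the Tanuki wrapper decides that the JVM is hung when it stops answering pings within a configurable window, so a very long full GC pause can trip it. If you ever need headroom for long pauses, the timeout can be raised in the wrapper configuration, for example (the value below is only an illustration):
wrapper.ping.timeout=120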
In the Java world, less is more: you should know your resource demand and keep your systems reasonably small.