Volume problem? System hanging

I have loaded a repository with ±2000 Natural modules. We have a database field that needs to be changed, so I am now trying to do the impact analysis for this change. It hung after half an hour on module 130, which is a program that uses the file but does not reference the field at all. The time spent and estimated time did not move (the estimate was over 500 minutes!), yet I seemed to be using 100% of the CPU. It also seems to hang (using 100% of the CPU) when the DDM list is being populated, and on some other reports as well.

I have set up the database parameters as specified in the ‘Environment sizing’ documentation. Should anything be different? Is this too big a repository? The HOSPITAL system worked fine.

From my XREF data I can tell that only 128 modules are potentially affected. I am wondering whether I should just extract these 128 modules, plus all the DDMs and copycode in the library, and work on this smaller repository. I know there will be lots of missing modules, but does this matter as long as the field to be changed isn’t passed to any other module?

Do the objects need to be catalogued in the library for Natural Engineer to work?

Priscilla Fuller

Natural 611 PL 13
Natural Engineer 521
Adabas 333 PL 1
Windows XP - Pentium 4 2GHz 1gig RAM


Shouldn’t be a problem with the volume. Which Impact Criteria are you using? You could use DBFILE or even Multi-Search.
If you are running with CONSISTENCY or Multiple Iteration, then it may take some time on a single object due to locating all the derived fields.

Otherwise, it may be looping, in which case you will need to raise a support request, and support will need a source code sample to recreate the problem.

BTW, NEE 5.2.1 SP1 has been released. I am not aware of any changes that would affect this area though.


I have 2 criteria defined in version 1. The first is:

DBFILE, Keyword value EXC-WAYBILL, Search value TP-NO, Replace Defn A15, Cons Type Y

This field is in a superdescriptor, so we want to preserve the existing one until these changes go live. My second criterion is:

DBFILE, Keyword value EXC-WAYBILL, Search value SP-TP-CO-WAY1, Replace value SP-TP-CO-NO, Cons Type Y

I assumed that NEE would figure out the new size?

I’m sure it was looping, because the times stopped changing.

Thanks for the quick reply :slight_smile:


Can you try it as 2 separate versions and see whether that works? Basically, it will execute the consistency pass twice with the definitions you are currently using.

Alternatively, try out Multi-Search, which works on field names. When using this, run it in single mode first to examine the seed list that would be used if you ran it in Multiple Iteration. Granted, there is no modification for Multi-Search (except for field-length changes), but you would find all the impacts relatively quickly.


Thanks - I will try this on Monday - I don’t really want to leave it running all weekend and I am going home now :smiley:

:oops: Seems I was a bit hasty in thinking the job was in a loop. I started it this morning, thought it was looping again, but left it while I did something else. When I looked again, the object count, time spent, and estimate had all moved. So the moral of the story is: don’t be impatient.