Invalid Values in Logical Fields

Hello all!

In Natural and Adabas, a logical field (format L) normally holds one of two values:

FALSE (hexadecimal H'00')
TRUE (hexadecimal H'01')

Due to REDEFINEs and similar constructs, the field may end up containing other values, for example H'20'.

In logical conditions, Natural evaluates only the rightmost bit of the field, so a value like H'20' (rightmost bit off) compares equal to FALSE.
Example:

define data local
01 #l (L)
01 redefine #l
  02 #a1 (A1)
01 #l2 (L)
end-define
reset #a1                        /* #l now contains a blank, not H'00' or H'01'
if #l = #l2                      /* only the rightmost bit is compared
  write '=' #l (EM=H) #l2 (EM=H)
end-if
end

… Everything OK so far.

But Adabas shows a slightly different behaviour.

Try this:

define data local
01 #l (L)
01 redefine #l
  02 #a1 (A1)
01 someview view of somefile
  02 logical-field               /* descriptor with null value suppression (NU)
end-define
reset #a1                        /* #l now contains a blank (H'20')
r-one. read (1) someview
  logical-field := #l            /* store the invalid value on the record
  update (r-one.)
  end transaction
end-read
histogram someview for logical-field
  display logical-field (EM=H) *NUMBER
end-histogram
r-all. read someview by logical-field
end-read
write 'read-counter' *COUNTER(r-all.)
end

The output:

logical-field        NMBR
------------- -----------
01                    955
20                      2

read-counter          957

If I use a null-value-suppressed descriptor on a logical field, I would expect a READ by that descriptor to return only records with logical-field = TRUE. Instead, the two H'20' records are returned as well (955 + 2 = 957). A really bad trap for the unwary.

Hi Matthias;

How do you have “logical-field” defined to Adabas? Since there is no real “logical” data type (unless it is in the new version of PC Adabas; I have not played with the data types there yet), do you have it defined as alpha? If so, I understand your concern. If not, I can understand Adabas “keeping” the value x’20’ (which is a blank, for mainframers confused that it is not x’40’).

steve

… It is defined as B, 1 (format binary, length 1).

Now I understand Adabas, too. But I never thought about that “problem” until yesterday, when I discovered some bad logical values. And I know that we have a lot of code like this:

read SOMEFILE by LOGICAL-FLAG = TRUE
  /* do some processing
  ...
  LOGICAL-FLAG := FALSE
  update
  end transaction
end-read

… In case of H'20' values, the loop would process those records, too, because READ ... = TRUE starts reading at H'01' and continues upwards through H'20'.
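A defensive variant (just a sketch, reusing the field names from the example above, not tested against a real file) is to re-check the flag inside the loop. Since Natural's comparison looks only at the rightmost bit, an H'20' record evaluates as FALSE and can be skipped:

read SOMEFILE by LOGICAL-FLAG = TRUE
  if not LOGICAL-FLAG     /* H'20' has the rightmost bit off, so it evaluates as FALSE
    escape top            /* skip the record instead of processing it
  end-if
  /* do some processing
  LOGICAL-FLAG := FALSE
  update
  end transaction
end-read

That only keeps the loop from processing the bad records; they stay in the file (and on the descriptor) until the field is rewritten with a clean value.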

Hi Matthias;

Glad it makes sense now.

I have long claimed that REDEFINE is one of the most “dangerous” statements in Natural. Your problem is just another example of this.

For some fun, try the following program:

DEFINE DATA LOCAL
1 #A (A5)
1 REDEFINE #A
  2 #N (N5)
END-DEFINE
*
INPUT #A
WRITE #A #A (EM=H(5))   /* value before the ADD, in character and hex form
ADD 1 TO #N             /* arithmetic on the redefined numeric field
WRITE #A #A (EM=H(5))   /* value after the ADD
END

Run the program and enter a “typo” value of 12E45 (instead of 12345). Take a look at what happens: Natural “strips” the high-order bits from the E, so the E contributes only its digit nibble (5) and the field is treated as 12545.

Now run the program again and enter a value of 123. Watch the two trailing blanks being treated as trailing zeroes when the ADD picks up the value, so the field is interpreted as 12300. Hardly correct arithmetic: 123 + 1 = 12301

I have long hoped for such ADDs to be illegal and produce run time errors. HOWEVER, I also realize this would degrade arithmetic performance considerably. Natural would have to test all operands before performing arithmetic; this would be VERY expensive.

Hence, the responsibility is placed on the programmer to ensure that numeric operands have numeric values. REDEFINE is the easiest way to create values that “violate” such a responsibility.
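One way to code that responsibility, as a sketch only (it simply extends the demo program above; nothing here is specific to any real file), is to validate the alpha field with a MASK check before doing arithmetic on the redefined numeric field:

DEFINE DATA LOCAL
1 #A (A5)
1 REDEFINE #A
  2 #N (N5)
END-DEFINE
*
INPUT #A
IF #A = MASK(NNNNN)     /* every position must contain a numeric digit
  ADD 1 TO #N
  WRITE 'result:' #A
ELSE
  WRITE 'not numeric:' #A #A (EM=H(5))
END-IF
END

MASK(NNNNN) rejects both 12E45 and a short entry like 123 (because of the trailing blanks), so the ADD only ever sees clean numeric data.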

steve