Work files when debugging

I’m trying to debug a Natural program that handles work files, and I get the following message:

- [my logon] 2160 WORK file 1 not available.

I have copies of the files on my PC, and of course the files are also on the mainframe server.

Is there a proper way to deal with work files when debugging?

Are you saying that the program fails in the Debugger, but that you can run it successfully in an online ONE session?

ONE is a development environment, and although it has a built-in runtime component, it is not intended to run batch processes, that is, those with printer and/or work files. You are expected to run such batch processes via a batch script.

As a quick fix, add a DEFINE WORK FILE statement to the beginning of the program, such as

DEFINE WORK FILE 01 'c:\full-qualified\file-name.txt'
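
To show how the quick fix fits into a program, here is a minimal sketch; the local path, field layout, and test data are just placeholders for this example:

DEFINE DATA LOCAL
1 INPUT-REC
  2 FIELD-1 (A4)
  2 FIELD-2 (A4)
END-DEFINE
*
* Point work file 1 at a local PC copy of the data for the debug session
DEFINE WORK FILE 1 'c:\temp\my-test-data.txt'
*
READ WORK FILE 1 INPUT-REC
  DISPLAY INPUT-REC
END-WORK
END

Remember to remove (or comment out) the DEFINE WORK FILE line before the program goes back to run on the server, since a PC path only makes sense where the local NaturalONE runtime executes.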
2 Likes

Another quick fix, although it is not really practical if you need hundreds of records for your debugging:
Simply place a small number of records on the Stack and “pop” them with an INPUT statement. Put the INPUT within a REPEAT loop and run the program online without any work file access.
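
A minimal sketch of the idea (field names and test values are only examples):

DEFINE DATA LOCAL
1 #FIELD-1 (A4)
1 #FIELD-2 (A4)
END-DEFINE
*
* Stack a few test records; STACK TOP places each record on top,
* so they are stacked in reverse of the order they will be read
STACK TOP DATA 'rec3' 'bbb3'
STACK TOP DATA 'rec2' 'bbb2'
STACK TOP DATA 'rec1' 'bbb1'
*
REPEAT WHILE *DATA > 0
  INPUT #FIELD-1 #FIELD-2   /* "pops" the next stacked record
  DISPLAY #FIELD-1 #FIELD-2
END-REPEAT
END

The *DATA system variable holds the number of entries still on the stack, so the REPEAT loop ends once all the test records have been consumed.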

2 Likes

Yes, it’s a batch process.
In order to use the debug resources, this is what I did:
(I’m a noob at this. This is an empirical approach - I’m just trying to understand the possibilities.)

DEFINE DATA LOCAL
	1 DEBUG-MODE (L) INIT <TRUE> /* change to false when it's ok to run for real

	1 INPUT-REC /* INIT VALUES FOR USE IN DEBUG
		2 FIELD-1 (A4) INIT <'exp1'>
		2 FIELD-2 (A4) INIT <'exp2'>
		2 FIELD-3 (A4) INIT <'exp3'>
END-DEFINE

IF DEBUG-MODE	
	PRINT INPUT-REC
	/* USES INIT VALUES FOR DEBUGGING
ELSE
	READ WORK FILE 1 INPUT-REC
	PRINT INPUT-REC
	/* USES VALUES FROM THE FILE
END-IF

/* ... PERFORMS THE ACTUAL REQUIREMENT

END

It runs ok, but for larger programs this might not be so practical.

I didn’t know about this ‘DEFINE WORK FILE’, but now I’m curious about it - it seems much better than my approach.

Great suggestion! I’m now curious about this too - it seems very useful once you know how to do it.
I don’t know about the Stack; I’d need to study it and look at some examples.

I put the ‘DEFINE WORK FILE 1’ right after the ‘DEFINE DATA LOCAL … END-DEFINE’.

Before I start debugging, I update (stow) the program.
In fact I’m doing remote debugging, right?
So it still complains about the missing file.

Any guidance on how to do local debugging?

Hi Tomas,

You should really use the Debug Attach Server (DAS) for debugging mainframe batch Natural. It takes a little bit of setup, but once you get started and get used to doing a couple of setup steps each time you start a debug session, it is much easier than having to allocate work files to a Natural TSO session and use the mainframe Natural Debugger, or to copy work files to a PC every time you want to debug a batch job using NaturalONE.

You simply start natdas.exe (Debug Attach Server/DAS) running on your local PC. Then you add some DAS connection information to your batch Natural job step(s) so that the batch job can connect to the DAS running on your local PC, and you also connect your NaturalONE session to the same DAS session and set some breakpoints in your code. Then, when you submit your batch job, the NaturalONE Debug perspective starts automatically when the first breakpoint is reached, and you should be off and running with normal debugging.
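
Just to illustrate what the “DAS connection information” looks like: if I recall the parameter names correctly, these are the Natural remote-debugging profile parameters added to the dynamic parameters of the batch Natural step (please verify the names and values against the documentation for your Natural for Mainframes release; the node name and port below are placeholders):

RDACTIVE=ON,RDNODE=<host name of the PC running natdas.exe>,RDPORT=<port the natdas instance listens on>

RDACTIVE switches the remote debugging interface on, RDNODE identifies the machine where the DAS runs, and RDPORT is the TCP port it listens on.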

We have 40+ programmers using NaturalONE, and just recently (end of April 2020) I distributed some standard procedures for our programmers to use the DAS for debugging both batch and online mainframe Natural code. A year ago they were all still editing code in 3270 mainframe terminal emulator sessions; they have since transitioned to doing all Natural code editing in NaturalONE, and those who have debugged code using the DAS really like its ability to show the relevant data field values side by side, in the same workspace as the source lines they are stepping through.

For the programmers in my shop, I established a special SYSPARM and a setup process to make the mainframe batch and online connections to the Debug Attach Server (DAS) easier. If you are interested, I have a version of both the DAS usage procedures and the setup procedures that I can share with the SAG community, to help more SAG customers get going with NaturalONE so we see more NaturalONE dialog in both the Tech forums and in SAG-L.

If you are interested in these procedures, let me know where to send them and I’ll email them as soon as I can.

3 Likes

For me this is an advanced resource, Kate.
As I mentioned before I’m a noob, but I’m willing to learn more about this.
I appreciate your contribution. Thank you!
Could you send the material to supertom.fm@gmail.com, please?


We had a way of using ‘mimic’ in the past in NaturalONE - I’ll dig it out.

Basically you just set up the server mapping runtime with some profile parms and the work files magically link up.

And if you also have the IBM File Explorer z/OS plugin installed, it makes life SO simple, as you can access everything in one place…

I’ll create a page on my blog on how to do it and link it back here.

Cheers

1 Like

@Kathy_Jackson1 Thank you again for your detailed explanation. I’m still working on it. If you could email me as you offered, I’d appreciate it. supertom.fm@gmail.com

I am attaching an Inside Natural edition which discusses the Stack.

IN Vol 15 No 3.pdf (190.6 KB)

1 Like