TN Data Analysis

I have about 30K records in TN for one day, spanning roughly 200 partners and 50 document types.

The requirement is to analyze and report on these documents.

Is it a good idea to query these tables, process each record (selecting the 10 fields needed for analysis), and dump the results into a staging table from which my web team can create reports?

Any ideas?

Thanks!

Do you have to report any custom attributes/fields apart from the standard TN transaction analysis fields?

Yes, query the TN DB. What you can do is select whatever criteria you want to report on (based on the doc type / partner IDs) in the Transactions tab of the TN Console (TNC), use View SQL for that selection criteria, customize that SQL query in an Adapter Service per your reporting needs, and then schedule a wrapper service that calls the above AS.
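For illustration, here is a minimal plain-JDBC sketch of the kind of customized query this approach produces. The table and column names (BIZDOC, BIZDOCATTRIBUTE, etc.) and the parameter choices are assumptions for the example; take the real column list from the View SQL output for your own selection criteria, and in practice you would drop the SQL into a JDBC Adapter Service rather than hand-written JDBC.

```java
import java.sql.*;

public class TnReportQuery {
    // Assumed table/column names for illustration only; copy the real ones
    // from the TNC "View SQL" output for your selection criteria.
    private static final String QUERY =
        "SELECT b.DOCID, b.DOCTYPEID, b.SENDERID, b.RECEIVERID, " +
        "       b.DOCTIMESTAMP, b.USERSTATUS, a.ATTRIBUTEID, a.STRINGVALUE " +
        "FROM   BIZDOC b " +
        "LEFT JOIN BIZDOCATTRIBUTE a ON a.DOCID = b.DOCID " +
        "WHERE  b.DOCTIMESTAMP >= ? AND b.DOCTIMESTAMP < ? " +
        "AND    b.DOCTYPEID IN (?, ?)";   // restrict to the doc types you report on

    public static void run(Connection con, Timestamp from, Timestamp to,
                           String docType1, String docType2) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(QUERY)) {
            ps.setTimestamp(1, from);
            ps.setTimestamp(2, to);
            ps.setString(3, docType1);
            ps.setString(4, docType2);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // map the ~10 fields you need into your report structure here
                }
            }
        }
    }
}
```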

HTH,
RMG

Yes, I do have custom fields/attributes, and I have built the query. It runs fine, except that the runtimes are long.

But the question is: is it a good idea to select all the records (30K or so) and bring them into memory all at once, or should I run multiple queries (by partner/document type, which would be quite a few queries if I go by partner, and would also make the runtime far too long since the tables would be queried multiple times)? Also, when I insert into the staging table, can I insert all of them as a batch insert?

I'm mostly looking for the best-practice scenario here.

Appreciate your time.

Thanks!

No, it's not a good idea to keep everything in memory and then query/report on it.

Yes, you can do a batch insert SQL; see the sketch below for one way to combine chunked retrieval with batched inserts.
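To tie both points together, here is a minimal plain-JDBC sketch, assuming a staging table named TN_REPORT_STAGING and the column names shown (both are assumptions for illustration; in webMethods you would normally do the equivalent with a JDBC Adapter batch insert service). It streams the result set with a modest fetch size instead of materializing all 30K rows, and flushes the inserts to the staging table in batches.

```java
import java.sql.*;

public class StagingLoader {
    private static final int BATCH_SIZE = 500;   // rows per executeBatch(); tune as needed

    // TN_REPORT_STAGING and its column list are assumptions for illustration.
    public static void load(Connection src, Connection dest,
                            String selectSql) throws SQLException {
        dest.setAutoCommit(false);
        String insertSql = "INSERT INTO TN_REPORT_STAGING " +
                "(DOCID, DOCTYPEID, SENDERID, RECEIVERID, DOCTIMESTAMP, USERSTATUS) " +
                "VALUES (?, ?, ?, ?, ?, ?)";
        try (PreparedStatement sel = src.prepareStatement(selectSql);
             PreparedStatement ins = dest.prepareStatement(insertSql)) {
            sel.setFetchSize(BATCH_SIZE);        // stream rows; don't hold 30K in memory
            try (ResultSet rs = sel.executeQuery()) {
                int pending = 0;
                while (rs.next()) {
                    ins.setString(1, rs.getString("DOCID"));
                    ins.setString(2, rs.getString("DOCTYPEID"));
                    ins.setString(3, rs.getString("SENDERID"));
                    ins.setString(4, rs.getString("RECEIVERID"));
                    ins.setTimestamp(5, rs.getTimestamp("DOCTIMESTAMP"));
                    ins.setString(6, rs.getString("USERSTATUS"));
                    ins.addBatch();
                    if (++pending == BATCH_SIZE) {
                        ins.executeBatch();      // flush a full batch of inserts
                        dest.commit();
                        pending = 0;
                    }
                }
                if (pending > 0) {               // flush the final partial batch
                    ins.executeBatch();
                    dest.commit();
                }
            }
        }
    }
}
```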

One option to consider would be to replicate the TN DB tables and then query the replicated tables using reporting tools.