Passing an object with the doThreadInvoke service

Hi All,

When we call doThreadInvoke as below, any idea how to capture Java objects in the target service?

Ex:

I have service A, which calls service B:

ServiceThread threadHandle = Service.doThreadInvoke(B, IDataUtil.deepClone(pipeline));

It passes all data from A to B, except Java objects. Any idea how to pass/capture Java objects in service B?

Thanks,
Vikas

Can you share your code? The Java object would need to be in the pipeline for it to be passed.

Heed the cautions that are listed in the description for deepClone().

Also, be really, really sure that using a thread invoke has a compelling benefit. We avoid using this technique when async handling is desired and instead publish a document, sometimes just a local document rather than via UM/Broker, to kick off a separate process.

The use case isn’t clear here. Why do you need to pass Java objects? In Integration Server, almost everything is passed via the pipeline. Since the pipeline is copied completely, you should be able to pass your Java object through the pipeline as well.

This should be the first thing you tried. Did it not work? Also, what do you need to call the doThreadInvoke service for? Do you need parallel execution for your service? Can you explain your end goal here?

Yes, for parallel invocation. And due to security constraints, I cannot paste the code.

I also assumed that it would copy/pass the object to the called service B. But I see only strings and document types coming through from the parent service, not the object.

That seems…odd. I cannot imagine anything in such Java code that would be sensitive or risky. You can mask out anything that reveals company/private info. But in order to help effectively, seeing exactly what you’re trying to do is a key bit of info.

The Java object is presumably being created in A. If you want to pass it to B, you need to put it in the pipeline. And B should explicitly declare in its inputs that it expects that object as an input. Referencing variables from other services that are not defined on the input tab is a poor practice from a few perspectives.

Why? Creating a separate thread is useful in a couple of scenarios – can you share what your scenario is? Hopefully it is not based on the assumption that doing so will somehow make the overall process faster.

There is an easier way to handle parallel execution. You certainly don’t need Java services for this at all. Just create a publishable document type and publish the thread’s input as a document. Your trigger should be configured for concurrent processing, not serial. You can control from the trigger how many threads will consume these messages. You then implement the work in a service (or process) that subscribes to that document. IMO, what you are trying to do sounds like over-engineering.
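
For anyone exploring the idea outside IS: the trigger pattern above is essentially a bounded pool of consumers working off a queue of published documents. Here is a rough plain-Java sketch of that idea; the class and method names are illustrative only, not any webMethods API, and on Integration Server you would configure this on the trigger rather than code it yourself.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ConcurrentConsumerDemo {

    // Plain-Java analogy of a concurrent trigger: a bounded pool of worker
    // threads consuming published "documents". maxThreads plays the role of
    // the trigger's maximum concurrent execution threads setting.
    public static List<String> process(List<String> documents, int maxThreads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(maxThreads);
        try {
            List<Future<String>> futures = new ArrayList<>();
            for (String doc : documents) {
                // Each "document" is handed to whichever worker thread is free
                futures.add(pool.submit(() -> "processed:" + doc));
            }
            List<String> results = new ArrayList<>();
            for (Future<String> f : futures) {
                results.add(f.get()); // wait for each worker to finish
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(process(List.of("order1", "order2", "order3"), 2));
    }
}
```

The point of the analogy: parallelism and its degree live in the consumer configuration, not in the publishing service, which is why no hand-rolled threading code is needed.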

If you explain your requirement more we may even recommend a better approach.

You do not require a process here; you can simply have a trigger invoke your service.

As long as you are not running in a cluster, using concurrent mode with Document Ordering Control set to “Publisher” on the queue will be fine.

See the Publish/Subscribe Development Guide for further information.

Regards,
Holger

I’m not sure that, with the information we have, a cluster matters. Additionally, there are multiple types of clusters – LB cluster, IS cluster (which, when present, has an LB cluster too), TN cluster, and messaging cluster.

With the info presented, the presence of a messaging cluster is fine. And none of the other types would matter.

@reamon @Holger_von_Thomsen @engin_arlak, I was out, so I couldn’t get back in time.

Yes, we can implement that parallel mechanism in different ways depending on the scenario/requirement, which I don’t want to divert the topic into.
Please find attached the screenshots, prefixed with “orders”. As you can see in the last image, the MSP hashtable and the Accref object are not passed to the called service. Please let me know your thoughts on this.




My guess is that the hashtable cannot be cloned. There may be constraints on the types of objects that can be cloned with deepClone. You might try without cloning the pipeline: just pass pipeline as-is, or perhaps use a shallow clone. Is there a reason to clone the entire pipeline?
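
One plausible mechanism behind this guess: if deepClone relies on Java serialization (an assumption here; see the cautions in its description mentioned earlier), then a single non-serializable object in the pipeline would make the clone fail or drop data. A minimal plain-Java sketch of that failure mode, where `Accref` is a hypothetical stand-in for the custom object in the screenshots:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.Hashtable;

public class DeepCloneCheck {

    // Deep clone via Java serialization. This mimics the *assumed* behavior
    // of IDataUtil.deepClone(), not its actual implementation.
    public static Object deepClone(Serializable value) throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(value); // fails if any reachable object is not Serializable
        }
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray()))) {
            return in.readObject();
        }
    }

    // Hypothetical stand-in for a custom pipeline object; note it is NOT Serializable.
    static class Accref { }

    public static void main(String[] args) throws Exception {
        Hashtable<String, Object> pipeline = new Hashtable<>();
        pipeline.put("msp", new Hashtable<String, String>()); // serializable: clones fine
        System.out.println("cloned: " + deepClone(pipeline).getClass().getSimpleName());

        pipeline.put("accref", new Accref()); // one non-serializable entry...
        try {
            deepClone(pipeline);              // ...makes the whole clone fail
        } catch (NotSerializableException e) {
            System.out.println("clone failed: " + e.getMessage());
        }
    }
}
```

If this is what is happening, making the custom class implement `Serializable` (or keeping it out of the cloned pipeline) would be worth testing.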

[Edit] For the screenshot of the threaded service, how are you showing the pipeline? It’s obviously not available via the debugger. How are you determining that the object is not present?

We tried just passing the pipeline data and still had no luck. When passing the pipeline data, we tried both deepClone and getPipeline.

Did you try with just passing pipeline as is for the 3rd parameter? What are you referring to with “getPipeline”?

Yes, right, we passed the 3rd parameter as pipeline.

Just to be clear, you tried passing the plain, already defined “pipeline” variable as the 3rd parameter?

You mentioned “getPipeline” earlier – what does that refer to?

Multiple people have indicated that this way to introduce parallelism is suspect. This may be the topic to consider rather than chasing the “Java object not in pipeline.” It is unclear why parallel execution is being introduced.

It is unclear how you know that the Java object is not there when called via doThreadInvoke(). How are you determining that? The screenshots from Designer so far do not show the state of the pipeline of the invoked service – the debugger will not show that because it is in a separate thread that the debugger does not know about.

If your goal here is parallel processing, then I would heed what others have mentioned above and use a concurrent trigger. If you are really exploring how to use the Java API to achieve this, then here is my feedback.

The issue is not with your Java service. It is with the pipeline variables being carried over from the Flow. It seems the variables that you want to use are not in the pipeline. A simple test is to print out the pipeline in the Java service to see whether you can see them. I just tried the code below, and there seems to be no restriction on cloning a pipeline containing hashtables.

I would really look at the variables during runtime to make sure they are getting populated correctly. If they are, then check on the receiving end of the thread invoke to see whether the variables are coming through in the pipeline correctly. You can call tracePipeline as the first step of that service to confirm the same.

// Clone the current pipeline and put the copy back into it as "testOut"
// so its contents can be inspected in the results view.
// (The original snippet destroyed the cursor before using it; fixed here.)
IDataCursor pipelineCursor = pipeline.getCursor();
try {
    IDataUtil.put(pipelineCursor, "testOut", IDataUtil.deepClone(pipeline));
} catch (IOException e) {
    // deepClone can throw IOException; surface the failure in the pipeline
    IDataUtil.put(pipelineCursor, "errorMessage", e.toString());
} finally {
    pipelineCursor.destroy();
}
