Issue with Data Corruption at the Adapter Level

I have one SQL Server intelligent adapter configured, and one component created with input from a document that maps to the insert operation. The following document is received as input, as shown in debug (Trace Logs in EI):
event MC000004::ZCH07::MyChannel_Target::Points::points_spent {
unicode_string shop_addr = "²âÊÔÖÐÎÄ";
unicode_string shop_contact = "ÀÖÐãÓî";
The two fields in the document carry non-English (Chinese) characters. Below is the insert statement from the adapter debug:
INSERT INTO mypoints_redemption.dbo.tb_mypoints_redemption(shop_addr, shop_contact) VALUES ('²âÊÔÖÐÎÄ', 'ÀÖÐãÓî')

But the data is inserted as '?' for each character in both fields; the data is totally corrupted. I tried inserting these values directly into the database in the query editor using the same insert statement, and the data is fine. It looks like the adapter corrupts the data before inserting it. I have tried different encodings in the adapter properties.
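As context for anyone hitting this: the field values in the trace look like classic mojibake, i.e. GBK-encoded Chinese bytes decoded as Latin-1. A minimal Python sketch demonstrating this (the original Chinese strings here are my reconstruction, not taken from the log):

```python
# Assumption: the source data is Chinese text in GBK encoding.
# Decoding those GBK bytes as Latin-1 reproduces the strings in the trace log.
original = "测试中文"  # "test Chinese" — a plausible original value (assumption)
mojibake = original.encode("gbk").decode("latin-1")
print(mojibake)  # -> ²âÊÔÖÐÎÄ, which matches shop_addr in the trace
```

If that matches your source data, the corruption is an encoding mismatch rather than random data loss, which narrows down where to look in the adapter configuration.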

Has anyone experienced this issue? Any suggestions are highly appreciated.


Hi Jeya,

No need to worry about the '?' in the trace log; they are just placeholders for the variables. Please check the data in the database. It may already be inserting the data correctly.


Hi Vandana,

In adapter debug mode the data shows as '?', but the trace log shows the correct data in the insert statement being executed. After the insert, however, the data becomes '?' for all characters.

I thought the database might be corrupting the data due to some collation settings, so I copied the insert statement from the trace log and executed it directly in a query tool. There the data is fine.

I have also tried inserting the data using the IS server DB package; there too the data is fine. Only in the ES adapter case is the data corrupted.

Is there any encoding we have to try at the adapter level?
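One thing worth ruling out: if the adapter pushes the characters through a non-Unicode codepage on its connection (or the target columns are VARCHAR rather than NVARCHAR), SQL Server replaces every unmappable character with '?', which is exactly the symptom described. A quick Python sketch of that lossy conversion (cp1252 is an assumed codepage, for illustration only):

```python
# Sketch of lossy codepage conversion: characters that do not exist in the
# target codepage are replaced with '?', matching the observed corruption.
text = "测试中文"  # reconstructed Chinese value (assumption)
lossy = text.encode("cp1252", errors="replace").decode("cp1252")
print(lossy)  # -> ????
```

On the T-SQL side, the usual safeguard is to make sure the columns are NVARCHAR and the literals are sent as Unicode, e.g. `VALUES (N'...', N'...')`; without the `N` prefix a string literal is interpreted in the database's default codepage before it ever reaches the column.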