We have a problem encoding special characters like é, ç, à, … to an XML file.
If we set Encoding to "true" in the recordToDocument service, XML data is created and markup characters such as & are escaped correctly. However, special characters like é, ç, à, … do not appear encoded in the XML data.
After sending over HTTP, the characters é, ç, à, … come out garbled.
Any help is highly appreciated for this urgent matter.
regards
Wouter & Claire
What is decoding the HTTP POST?
How is the XML string created by recordToDocument converted to bytes?
Try adding the HTTP header Content-Type: text/xml; charset=UTF-8 to the HTTP POST. This should get the client to decode the entire message as UTF-8.
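As an illustration of the idea (outside Integration Server), a plain-Java POST that declares the charset in the header and encodes the body with that same charset could look roughly like this; the endpoint URL is only a placeholder:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URI;
import java.nio.charset.StandardCharsets;

public class PostXmlUtf8 {
    public static void post(String xml) throws Exception {
        // Placeholder endpoint for illustration only.
        HttpURLConnection conn = (HttpURLConnection)
                URI.create("http://example.com/receive").toURL().openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);

        // Declare the charset so the receiver decodes the body as UTF-8.
        conn.setRequestProperty("Content-Type", "text/xml; charset=UTF-8");

        // Encode the body with the same charset that the header declares.
        byte[] body = xml.getBytes(StandardCharsets.UTF_8);
        conn.setFixedLengthStreamingMode(body.length);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body);
        }
        System.out.println("HTTP " + conn.getResponseCode());
    }
}
```

The key point is that the header and the byte encoding must agree; declaring UTF-8 while sending bytes in another charset (or vice versa) is exactly what produces garbled é, ç, à.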
Make sure the string is converted to UTF-8 by sending it to pub.string:stringToBytes with an explicit encoding.
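The important part is that an explicit encoding is used instead of the platform default. In plain Java the difference looks like this (the XML content is just an example):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class StringToUtf8Bytes {
    public static void main(String[] args) {
        String xml = "<name>Hélène</name>";

        // With an explicit encoding the result is the same on every JVM/platform.
        byte[] utf8 = xml.getBytes(StandardCharsets.UTF_8);

        // Without one, the platform default charset is used, which on many
        // systems is not UTF-8 and silently mangles é, ç, à, ...
        byte[] platformDefault = xml.getBytes();

        System.out.println(Arrays.toString(utf8));
        System.out.println(Arrays.toString(platformDefault));
    }
}
```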
The destination XML parser may not handle the default encoding correctly, so you may need to add @encoding = UTF-8 to the boundNode sent to recordToDocument, so that there is an explicit encoding declaration in the XML prologue.
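The goal is for the generated document to start with an explicit declaration along these lines (the element names below are invented for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<order>
  <name>Hélène</name>
</order>
```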
In both steps we tried UTF-8 as the encoding type, but the characters é, à, … are still transformed into very strange (unreadable) characters in the XML data.
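One way to narrow this down: the exact shape of the garbage usually tells you which side has the wrong charset. If correctly produced UTF-8 bytes are later decoded as ISO-8859-1, é typically turns into two characters (Ã©); a single "?" or replacement character usually means the opposite (non-UTF-8 bytes decoded as UTF-8). A small Java sketch of the first failure mode, assuming é is the character being sent:

```java
import java.nio.charset.StandardCharsets;

public class MojibakeDemo {
    public static void main(String[] args) {
        String original = "é";

        // Correctly UTF-8 encoded, é is the two bytes 0xC3 0xA9.
        byte[] utf8 = original.getBytes(StandardCharsets.UTF_8);

        // If the receiver (or a viewer) decodes those bytes as Latin-1,
        // each byte becomes its own character: "Ã©".
        String misread = new String(utf8, StandardCharsets.ISO_8859_1);
        System.out.println(misread); // Ã©

        // Re-encoding the misread string as UTF-8 (double encoding)
        // makes it worse: é ends up as four bytes.
        byte[] doubled = misread.getBytes(StandardCharsets.UTF_8);
        System.out.println(doubled.length); // 4
    }
}
```

Checking the raw bytes on the wire (or in the received file) for 0xC3 0xA9 should show whether the problem is on the sending or the receiving side.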