Export data to a text file while preserving the encoding

Hi!

I have a table with alphanumeric data in Russian.

When I export this data to a text file using the WRITE WORK FILE statement, the character data displays normally inside Natural. But when I look at the file itself, I see that the Russian characters have been replaced with similar-looking English (Latin) characters. I think the problem is that the data is written out in the ANSI character set.
This makes it impossible to search the character data in the exported file.

Are there any workarounds?

Thanks in advance!

By default, you get an ASCII file. Try changing the file name to have an extension of “.sag”, then re-run your extract.

If that doesn’t work, add the following line to the beginning of your extract program:

DEFINE WORK FILE nn TYPE "SAG"
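
For illustration, a minimal sketch might look like the following (this assumes an Open Systems environment; the work file number 1, the record layout and the file name 'export.sag' are placeholders, not required values):

DEFINE DATA LOCAL
1 #LINE (A80)                       /* hypothetical record layout
END-DEFINE
*
* TYPE 'SAG' keeps the records in Software AG's internal format
* instead of converting them to ASCII on output.
DEFINE WORK FILE 1 'export.sag' TYPE 'SAG'
MOVE 'sample record' TO #LINE
WRITE WORK FILE 1 #LINE
END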

Thanks, Ralph!

The problem isn't that, though.

The data already stored in the database is incorrect. That was a mistake by the previous programmer.
I could use EXAMINE TRANSLATE, but the data also contains English company names, and translating would corrupt those.
The only option is to manually fix the fields that contain English names.

Best Regards,

First you have to find answers to the following questions:

  • If you're using a terminal emulation or something like that: what is the character set there?
  • If you're using some kind of Unicode translation: what character sets are you using on the Natural client?

Both questions must be answered for
  • the users storing the data into the database, and
  • the users reading the data.

The worst case is when you have different character sets on the input side and the input values are stored into an A-type field without any translation.

And be aware: there is no “English” character set. There are ANSI, Windows CP-*, ISO-8859-*, UTF-* … That makes this kind of work really messy …


First, the long-term goal is clearly to end up with only Russian characters and no English characters.

If the existing file is relatively small, it may be best to simply remap the bad data.

However, if you are dealing with a large file, you might want to consider something along the following lines. I presume the search criteria you mention are in Russian. Can you simply run the search criteria through the same process the data went through? This would yield English characters as your new search criteria, matching what is actually stored.
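
For illustration only, a minimal Natural sketch of that idea might look like the following. The field #SEARCH, the sample value and the character pairs are assumptions, since the real mapping depends on how the old program converted the data; it also uses plain EXAMINE REPLACE rather than a translation table:

DEFINE DATA LOCAL
1 #SEARCH (A20)
END-DEFINE
*
* Hypothetical example: map each Cyrillic letter of the search term to
* the Latin look-alike that the bad load produced, so the result can
* be used against the corrupted data.
MOVE 'САТУРН' TO #SEARCH                    /* Cyrillic firm name
EXAMINE #SEARCH FOR 'С' REPLACE WITH 'C'    /* Cyrillic Es -> Latin C
EXAMINE #SEARCH FOR 'А' REPLACE WITH 'A'    /* Cyrillic A  -> Latin A
EXAMINE #SEARCH FOR 'Т' REPLACE WITH 'T'    /* Cyrillic Te -> Latin T
EXAMINE #SEARCH FOR 'У' REPLACE WITH 'Y'    /* Cyrillic U  -> Latin Y
EXAMINE #SEARCH FOR 'Р' REPLACE WITH 'P'    /* Cyrillic Er -> Latin P
EXAMINE #SEARCH FOR 'Н' REPLACE WITH 'H'    /* Cyrillic En -> Latin H
WRITE #SEARCH                               /* now the Latin look-alike form
END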


Hi Steve.

Maybe I do not fully understand.

The database contains large data sets.

I have no problems exporting tables that contain only Russian data. In that case I use EXAMINE TRANSLATE.

But there is a table whose fields hold Russian and English data at the same time. In this case EXAMINE TRANSLATE does not fit, because the English data would be converted as well.

The task is to unload the data to a text file and then import that file into another database.

I have studied in detail how data is loaded into the database. Maybe you're right, and I can invert the load process for the unload.

Many thanks to all for your help!