supporting multiple languages

  • Hello,

Sorry if this gets posted twice, but I posted a while ago and my post hasn't shown up yet.

We need to render a website in multiple languages (17 at present, requiring Unicode). The site is dynamically rendered.

The expectation is that we will use a single table containing (in this case) 18 columns: 17 ntext columns, each holding a different language (the actual translations of the text required for presentation), plus an identity field serving as the clustered primary key. This table will be referenced by various other tables and functions.
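A sketch of that design might look like the following; the table, column, and collation choices are illustrative assumptions, not taken from the post, and only 3 of the 17 language columns are shown:

```sql
-- Illustrative sketch of the proposed 18-column design (names are assumptions).
-- One identity clustered PK plus one ntext column per language.
CREATE TABLE dbo.KnowledgeBase (
    KBID    int IDENTITY(1,1) NOT NULL,
    Text_EN ntext COLLATE Latin1_General_CI_AS NULL,
    Text_DE ntext COLLATE Latin1_General_CI_AS NULL,  -- or German_PhoneBook_CI_AS
    Text_JA ntext COLLATE Japanese_CI_AS       NULL,
    -- ...the remaining 14 language columns follow the same pattern...
    CONSTRAINT PK_KnowledgeBase PRIMARY KEY CLUSTERED (KBID)
);
```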

    A typical example of this would be a tech support website- where a user can select his language of preference, and then be presented with appropriately-rendered prompts/solutions. 

This table will be a knowledge base solution table. It will likely contain on the order of 100,000 rows of data, with each column averaging 1KB of text. It is very unlikely that large amounts of data (e.g. a large binary file or image data) will be stored in any one column.

We may need to perform occasional full-text searches on this table, but this would not be typical.

    So, a few questions:

1. Is it feasible to combine all 17 language columns in a single table? This is preferred from a development perspective, as change/revision control is simplified, as is the development of joins/queries.

2. If this makes sense, with the use of ntext, would it still be necessary to specify a specific collation for each language column? I assume so, in order to ensure proper sorting of each specific language.

    3. Does full text search support the use of multiple languages and collations within a single catalog? 

4. Are there potential scaling issues with this solution that cannot be addressed with filegroups?

If anyone has experience with Unicode and supporting multiple languages, any advice would be greatly appreciated. If more info is needed, please let me know.

    Thanks,

    Greg

1. Is it feasible to combine all 17 language columns in a single table? This is preferred from a development perspective, as change/revision control is simplified, as is the development of joins/queries.

    This is feasible, and (I think) quite reasonable.

2. If this makes sense, with the use of ntext, would it still be necessary to specify a specific collation for each language column? I assume so, in order to ensure proper sorting of each specific language.

I think you've hit the nail on the head here: Unicode solves the encoding problem, not the collation (sorting) problem. So if I were you, I would still specify column-level collation.
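In T-SQL, collation can be declared per column, and can also be overridden per expression with the COLLATE clause. A brief sketch, using made-up table and column names (an nvarchar column is used for the sort, since ntext columns can't appear in ORDER BY):

```sql
-- Declare a column-level collation (hypothetical table/column names):
ALTER TABLE dbo.KnowledgeBase
    ALTER COLUMN Title_DE nvarchar(200) COLLATE German_PhoneBook_CI_AS NULL;

-- Or override the collation for a single sort, without changing the column:
SELECT Title_DE
FROM dbo.KnowledgeBase
ORDER BY Title_DE COLLATE German_PhoneBook_CI_AS;
```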

    3. Does full text search support the use of multiple languages and collations within a single catalog? 

I believe so. Both the SQL Enterprise Manager (EM) wizard interface for full-text indexing a column and the sp_fulltext_column procedure include language parameters, so apparently language granularity with full-text indexing is at the column level, just like it is in the database. I am a little confused over the exact meaning of the column title in the SQL EM interface, though. It says, "Language for Word Breaker." There is definitely a restriction that you can only use one language per full-text function call (for example, you can't reference a Spanish column and a German column in a single CONTAINS function call).
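Assuming SQL Server 2000, the procedures mentioned above would be used roughly like this; the catalog, table, and column names are made up, and 1031 and 3082 are the Windows LCIDs for German and Spanish (Spain):

```sql
-- Create a catalog and full-text enable the table (names are illustrative).
EXEC sp_fulltext_catalog 'KBCatalog', 'create';
EXEC sp_fulltext_table 'dbo.KnowledgeBase', 'create', 'KBCatalog', 'PK_KnowledgeBase';

-- Register each column with its own word-breaker language (LCID):
EXEC sp_fulltext_column 'dbo.KnowledgeBase', 'Text_DE', 'add', 1031;  -- German
EXEC sp_fulltext_column 'dbo.KnowledgeBase', 'Text_ES', 'add', 3082;  -- Spanish
EXEC sp_fulltext_table 'dbo.KnowledgeBase', 'activate';

-- Each CONTAINS call references a single language's column:
SELECT KBID FROM dbo.KnowledgeBase WHERE CONTAINS(Text_DE, N'Drucker');
```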

4. Are there potential scaling issues with this solution that cannot be addressed with filegroups?

    Maybe somebody else can come up with some feedback for you here, but I think you're pretty much on the right track.

    Hope this helps!

    Chris

Collation might well be the deciding factor. If not for that, I'd be considering rows instead of columns. Using text columns offsets the performance issues to a large degree, since your overall row size stays pretty small.
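For comparison, the row-per-language alternative mentioned above might look something like this (all names are hypothetical); the trade-off is exactly the collation point made here, since every language then shares a single column:

```sql
-- Row-per-language alternative (hypothetical names).
CREATE TABLE dbo.KnowledgeBaseText (
    KBID         int      NOT NULL,  -- references the language-neutral KB row
    LanguageCode nchar(2) NOT NULL,  -- e.g. 'en', 'de', 'ja'
    BodyText     ntext    NULL,      -- one collation for all languages
    CONSTRAINT PK_KnowledgeBaseText PRIMARY KEY CLUSTERED (KBID, LanguageCode)
);

-- Fetch one language's text for a given knowledge base entry:
SELECT BodyText
FROM dbo.KnowledgeBaseText
WHERE KBID = 42 AND LanguageCode = N'de';
```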

  • Thanks guys, I appreciate the feedback.  Just needed a sanity check. 

    Greg
