Definition from Wiktionary, the free dictionary


Yech! This is nowhere near accurate. UTF-16 is a 16-bit character encoding. A byte is not necessarily eight bits, so two bytes may or may not be adequate to encode a single Unicode codepoint in UTF-16. Even then, UTF-16 supports Unicode codepoints above the range of a 16-bit integer through the use of surrogate pairs, meaning more than two [eight-bit] bytes are necessary for those. UCS-2 more appropriately fits the description of a "two-byte" character encoding. I know it's a dictionary and not an encyclopaedia, but it should at least be factual and accurate. If anything, it should mention:

- It is a variable-width text encoding
- It can represent all Unicode codepoints

121.218.107.39 12:15, 22 April 2009 (UTC)
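To illustrate the variable-width point above, here is a minimal Python sketch (my own example, not from the entry): a BMP codepoint takes two bytes in UTF-16, while a codepoint above U+FFFF takes four, encoded as a surrogate pair.

```python
# "A" is U+0041, inside the Basic Multilingual Plane: one 16-bit code unit.
bmp = "A".encode("utf-16-be")
print(len(bmp))  # 2 bytes

# U+1F600 is above U+FFFF, so UTF-16 needs a surrogate pair: two 16-bit units.
astral = "\U0001F600".encode("utf-16-be")
print(len(astral))  # 4 bytes

# The surrogate pair is derived from the codepoint minus 0x10000:
cp = 0x1F600 - 0x10000
high = 0xD800 + (cp >> 10)    # high (lead) surrogate
low = 0xDC00 + (cp & 0x3FF)   # low (trail) surrogate
print(hex(high), hex(low))    # 0xd83d 0xde00
```

So "two bytes per character" only holds for UCS-2, which simply cannot represent the supplementary planes at all.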