One of the big chunks of news to come out of the ICANN meeting in Seoul, Korea was a final timeline and implementation guideline for internet domains in non-Latin characters. Honestly, I wasn’t even going to write about it, as I’m much more interested in seeing how the implementation comes about and how it shakes out. But in poking around for news about it, I came across this Pros & Cons article. I’m almost amused by the con comments, as I’d really like to know whether the people making them are a) English speakers and b) monolingual. They’re poorly thought out, and such incredible straw-man arguments that I would laugh if it weren’t for the fact that comments like these could derail the whole process of creating a properly multilingual internet.

Expanding beyond Roman characters also increases potential for site rip-offs that use homoglyphs, characters with identical or indistinguishable shapes.

Pfft. By that logic we should just shut down the internet and solve the problem for good. People die in car accidents every year; should we stop building new cars because people could die in the new ones, when they’re currently dying just fine in the old ones? And homoglyph phishing already happens entirely within ASCII (think of paypa1.com, or rn standing in for m), so the attack is neither new nor unanswerable. This reasoning isn’t logical, and it sounds like a veiled attempt to excuse laziness in making the switch: hey, it works now, so why change?
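For what it’s worth, mixed-script checks make this class of rip-off quite detectable. Here’s a minimal sketch (my own illustration, not anything from the article) using Python’s standard unicodedata module: the Cyrillic “а” (U+0430) and the Latin “a” (U+0061) look identical, but they are distinct codepoints, and a domain label that mixes scripts is an easy red flag.

```python
import unicodedata

def scripts_used(label):
    """Approximate the scripts in a domain label via the first word of
    each character's Unicode name (LATIN, CYRILLIC, GREEK, ...)."""
    return {unicodedata.name(ch).split()[0] for ch in label if ch.isalpha()}

print(scripts_used("apple"))        # {'LATIN'}
print(scripts_used("\u0430pple"))   # {'CYRILLIC', 'LATIN'} -- suspicious mix
```

Browsers and registrars do roughly this today, though real implementations rely on Unicode script properties and confusable tables rather than name prefixes.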

Adding support for 100,000 international characters would make traditional keyboards insufficient input devices for accessing the entire Internet. As fellow PC World writer Jacqueline Emigh pointed out, it would be next to impossible to produce a keyboard that could support characters from every language under the sun.

Really? Are you serious? Depending on what I’m working on, I typically have up to four keyboard layouts installed on my machine: English ISO, Spanish ISO (which also covers the French characters), Croatian, and Cyrillic. I can probably type at least 1,000 different characters just by swapping the active layout. I’m using Windows XP, which is old; Windows Vista and Mac OS X are even better in this department. We’ve had this “amazing” technology for over a decade. It’s easy to switch and it works fine. And really, if I need to go to a domain that has French characters in it, wouldn’t I probably be using a keyboard that supported those characters already? Besides, the QWERTY layout is itself a relic of mechanical typewriter design, so isn’t it about time to update it anyway?
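It’s also worth remembering that the keyboard question is purely a local input concern. Under IDNA, internationalized domain names travel over DNS as plain ASCII via Punycode. A minimal sketch using Python’s built-in idna codec (which implements the older 2003 rules), with a made-up hostname:

```python
hostname = "m\u00fcnchen.example"      # münchen.example, as typed locally

ascii_form = hostname.encode("idna")   # what actually goes over the wire
print(ascii_form)                      # b'xn--mnchen-3ya.example'

print(ascii_form.decode("idna"))       # münchen.example, back for display
```

No server, resolver, or cable anywhere needs to handle a single non-ASCII byte; the “universal keyboard” problem never reaches the protocol at all.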
I realize that people shudder to think that this could establish “language silos” on the internet. Only an English speaker would worry about that; imagine what it’s like today for a Russian typing on a Cyrillic keyboard, forced to switch layouts every single time just to enter a domain. The silos will develop, or not, regardless of this change. Given all the language work going on these days, I think we’re actually entering an age of far better cross-language communication than ever before.
All of this doesn’t affect Sub-Saharan Africa as much as other regions, since African languages (with the exception of Amharic) were given Latin-based alphabets when they were first written down. But one great thing that could come out of this is that a language such as Lingala, whose orthography was created with accented characters, wouldn’t get “Anglicized” as often when written on the internet, and the characters would actually stick around.
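To make the “Anglicizing” concrete: the usual lossy fallback is to decompose accented characters and throw away the combining marks. A minimal sketch (generic accented Latin here, not verified Lingala orthography), again with Python’s unicodedata:

```python
import unicodedata

def anglicize(text):
    """Strip combining marks after canonical (NFD) decomposition -- lossy!"""
    decomposed = unicodedata.normalize("NFD", text)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

print(anglicize("résumé à côté"))   # 'resume a cote' -- diacritics gone for good
```

Native support for the full character repertoire in domains means this kind of silent data loss stops being the path of least resistance.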
If you’re an English speaker and don’t already use it, I recommend switching to the US International keyboard layout. It doesn’t ship as the default with operating systems for some insane reason, but it puts a huge swath of extra characters one dead key away: type ' then e to get é, " then u to get ü, or ~ then n to get ñ.
Out of ICANN: Đómàíñš in extended characters