WeChat confirmed that its software uses neural machine translation: AI that has been trained on vast amounts of text to pick up new vocabulary and, crucially, to discern the specific contexts in which to use those words. That second part may be what triggered the slur. From Sixth Tone:
WeChat sent Sixth Tone the following apology, but gave no further explanation: “We are very sorry for the inappropriate translation. After receiving users’ feedback, we immediately fixed the issue.” The platform has a staggering 700 million users worldwide and, in China, is used for everything from booking flights to paying bills to office communications.
A local English-language media outlet, That’s Shanghai, reported the story and found that the translator gave neutral translations at times but used the slur when the phrase in question included a negative term, such as “late” or “lazy.” Sixth Tone’s own testing on Wednesday evening found similar results.
Ann James, a black theatre director based in Shanghai, messaged her colleagues in English on Wednesday to say she was running late. When a coworker responded in Chinese, WeChat translated the message into English as “The nigger is late.” As Sixth Tone explains, “hei laowai,” the term the coworker actually used, is a neutral phrase meaning “black foreigner.” But until the issue was raised following James’ rude awakening, WeChat sometimes translated it as the n-word.
Recognizing patterns is the core of language AI. Natural language processing AI detects patterns between associated words, then spits them back out. In 2016, for instance, researchers used algorithms trained on Google News copy to uncover the associations the news crawler was picking up. As the algorithm determined, “Emily” is to “Black” as “pancakes” is to “fried chicken.” In another case, it found that “man” is to “woman” as “physician” is to “nurse.”
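The analogy findings above come from word embeddings: each word is stored as a vector, and relationships like “man is to woman as physician is to nurse” fall out of simple vector arithmetic. A minimal sketch of the idea, using made-up three-dimensional vectors (real systems learn vectors with hundreds of dimensions from billions of words):

```python
# Toy illustration of analogy-by-vector-arithmetic in word embeddings.
# The vectors below are hypothetical, hand-picked for illustration;
# trained embeddings would be learned from a large text corpus.

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = lambda w: sum(a * a for a in w) ** 0.5
    return dot / (norm(u) * norm(v))

# Hypothetical embeddings; the first dimension loosely encodes the
# gender association the researchers found, the second "profession".
emb = {
    "man":    [1.0,  0.0, 0.2],
    "woman":  [-1.0, 0.0, 0.2],
    "doctor": [0.9,  1.0, 0.1],
    "nurse":  [-0.9, 1.0, 0.1],
}

# Analogy query: man is to doctor as woman is to ... ?
# Computed as doctor - man + woman, then nearest neighbor by cosine.
query = [d - m + w for d, m, w in zip(emb["doctor"], emb["man"], emb["woman"])]
best = max((w for w in emb if w != "woman"), key=lambda w: cosine(emb[w], query))
print(best)  # with these toy vectors, the nearest word is "nurse"
```

The point of the sketch is that nothing in the arithmetic knows what the words mean; the “bias” is simply whatever co-occurrence pattern the training text contained, which is how a neutral phrase can end up mapped to a slur.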