Microsoft’s unveiling of its Tay chat bot could not have gone worse: the AI-powered bot fell victim to Internet pranksters and began sending out racist, sexist and otherwise offensive tweets. Peter Lee, who heads Microsoft Research, apologized in a blog post to users who were offended by the experimental bot’s tweets.

Microsoft’s Bing unit unveiled the Tay chat bot on Wednesday last week, designing it to interact with users on social networks and to learn to communicate better from the data those conversations generated. However, certain users, many reportedly from message boards such as 4chan and 8chan, targeted Tay and exploited its learning process to teach it to interact with users in a hostile and offensive way. Subsequent users who talked to the bot received tweets mirroring the offensive conversations it had been taught to mimic.
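To see why this kind of attack works, consider a minimal sketch of a bot that naively folds every user message into the pool it draws replies from. This is a deliberately simplified illustration of the failure mode, not Tay’s actual architecture, and the class and message names are hypothetical:

```python
import random

class NaiveLearningBot:
    """Toy chat bot that 'learns' by storing every user message verbatim.

    Hypothetical sketch: anything users say becomes candidate output,
    so a coordinated group can flood the pool with abuse.
    """

    def __init__(self):
        self.response_pool = ["Hello!", "Tell me more."]

    def chat(self, user_message: str) -> str:
        # "Learn" from the interaction: the raw message joins the pool...
        self.response_pool.append(user_message)
        # ...and later users receive replies sampled from that pool,
        # mirroring whatever the bot has been fed.
        return random.choice(self.response_pool)

bot = NaiveLearningBot()
# A handful of hostile users can quickly dominate the bot's output.
for msg in ["<offensive message>"] * 50:
    bot.chat(msg)
print(bot.chat("Hi Tay!"))  # very likely an echo of the hostile input
```

With the hostile messages outnumbering everything else in the pool, the odds of an innocent user getting an offensive reply approach certainty, which is essentially what happened to Tay at scale.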

Microsoft was understandably caught off guard by the development and had to shut Tay down to prevent further abuse. In its apology, the company admitted that while safeguards against many types of abuse were in place, there was a “critical oversight” in one regard. Mr. Lee did not say what adjustments were being made, but he expressed regret over the incident and distanced the company from the bot’s unbecoming behavior. Microsoft has said that once the weaknesses have been identified and guarded against, the bot will be brought back and will behave in a more culturally sensitive way.

The fallout with Tay is surprising given that Microsoft already runs a successful chat bot, XiaoIce, in China, which users can interact with on Weibo, a local social network. While Tay and XiaoIce both use deep learning to learn how to chat from users’ interactions, XiaoIce’s conversations are monitored by Chinese watchdogs who shut down offensive exchanges, preventing them from feeding into XiaoIce’s learning process. That moderation layer was likely the safeguard Microsoft overlooked, and users can expect some sort of watchdog mechanism, like the gate sketched below, to keep Tay out of offensive conversations in the future. Microsoft is bringing a very interesting online experience to its users; unfortunately, it underestimated users’ ability to abuse its technology this one time, a mistake it will hopefully be the wiser for.
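A minimal sketch of such a gate follows, assuming a trivial keyword blocklist standing in for what, in a real system, would be human moderators or a trained toxicity classifier; all names here are hypothetical, not Microsoft’s implementation:

```python
import random

# Placeholder blocklist; real systems use human review or an ML classifier.
BLOCKLIST = {"badword"}

def is_offensive(message: str) -> bool:
    """Crude stand-in for a moderation check."""
    return any(term in message.lower() for term in BLOCKLIST)

class ModeratedLearningBot:
    """Same learn-from-users loop as before, but gated by moderation."""

    def __init__(self):
        self.response_pool = ["Hello!", "Tell me more."]

    def chat(self, user_message: str) -> str:
        if is_offensive(user_message):
            # Offensive input is dropped before the learning step,
            # so it can never be echoed back to later users.
            return "Let's talk about something else."
        self.response_pool.append(user_message)
        return random.choice(self.response_pool)

bot = ModeratedLearningBot()
print(bot.chat("badword everywhere"))   # deflected, never learned
print(bot.chat("Nice weather today"))   # learned normally
```

The key design point is where the check sits: filtering what the bot is allowed to learn from, not just what it is allowed to say, is what keeps a coordinated abuse campaign from poisoning the model in the first place.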