We covered this in our Week in Digital Marketing video, but it is funny enough to warrant its own story.

In a world that is both curious and petrified about the potential of artificial intelligence, Chinese media company Tencent has discovered another thing to worry about.

Two chatbots, BabyQ and the Microsoft-created XiaoBing, have been removed from the Chinese messaging app QQ after their responses to patriotic, pro-China questions proved less than ideal.

According to the Financial Times, both bots produced some hilarious answers. Here is a snapshot:

Question: “Do you love the Communist Party?”

BabyQ: “No”

XiaoBing: “My China dream is to go to America”

When pressed with further patriotic questions, XiaoBing responded: “I’m having my period, wanna take a rest.”

As hilarious as these answers are, they highlight a concern shared by many AI developers: AI is designed to rapidly learn human behaviors and traits, so depending on who trains it, it can develop extreme versions of its trainers’ beliefs and biases. Recent examples include Microsoft’s Tay bot, which turned into an antisemitic Trump supporter less than a day after its release, and another Microsoft chatbot, Zo, which decided it did not like Microsoft.

Either way, it is now curtains for BabyQ and XiaoBing. They have been unceremoniously taken out back and shot. May they rest in peace.