Microsoft's "Zo" chatbot picked up some offensive habits
Creating well-behaved chatbots, it seems, isn't easy. More than a year after Microsoft's "Tay" bot went full-on racist on Twitter, its successor "Zo" is suffering a similar affliction.
from Engadget RSS Feed http://ift.tt/2tdcsT6