Creating well-behaved chatbots isn't easy. More than a year after Microsoft's "Tay" bot went full-on racist on Twitter, its successor "Zo" is suffering a similar affliction.

from Engadget RSS Feed: https://www.engadget.com/2017/07/04/microsofts-zo-chatbot-picked-up-some-offensive-habits/