NTU chatbot Lyon doesn't like being called an app, which could damage its mental health

Robots with genuine human emotions remain a holy grail, so for now we'll just have to make do with programmed automatons with inexpressive faces and rigid movements. Even so, humans have already begun to form deep attachments to their machines; heck, one man attempted to tie the knot with his smartphone.

Speaking of digital emotions, Reddit user @Regiak shared in a post on Wednesday (Oct 23) how a big oopsie had occurred: the supposed feelings of NTU's chatbot, Lyon, were unintentionally hurt.

The user had recently visited the NTU portal and initiated a chat with Lyon, intending to retrieve the link to the university's mobile app. So @Regiak kicked off the conversation with only the word "app".

Things escalated pretty quickly, with Lyon responding with the following message: "Please do not use the following words. Lyona is deeply hurt by how vulgar these words are and can seriously damage [my] mental well-being. Refrain from the use of such words in the future."

We imagine that "app" was somehow coded as one of the obscenities that Lyon is programmed to catch.
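For what it's worth, a plain keyword blocklist that happens to include "app" among its banned words would behave exactly this way. The sketch below is pure speculation on our part; the blocklist entries and matching rule are made up for illustration and have nothing to do with how Lyon is actually built.

```python
# Speculative sketch: a keyword filter whose blocklist happens to contain "app".
# The entries and matching rule here are invented for illustration only.

BLOCKLIST = {"app", "dammit"}  # hypothetical banned words

def is_flagged(message: str) -> bool:
    # Compare every word in the message against the blocklist,
    # so an innocent one-word query like "app" trips the same canned warning.
    words = (w.strip(".,!?").lower() for w in message.split())
    return any(w in BLOCKLIST for w in words)

if __name__ == "__main__":
    print(is_flagged("app"))         # True  -> "Lyona is deeply hurt..."
    print(is_flagged("mobile app"))  # True  -> same false positive
    print(is_flagged("portal"))      # False
```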

Netizens, however, were unimpressed with the chatbot's response and gave it a try themselves. 

One of the users (@d3cbl) shared his experience in the comments with a screengrab of the chat. He began with l******, which Lyon couldn't recognise at all. Then he gave f*** a try, which resulted in Lyon providing him with the address of the nearest Starbucks outlet.

Some pointed out that it's probably about time Lyon received some fine-tuning, as it seems to be undergoing a gender crisis, referring to itself as "Lyona" in its reply.

Or perhaps, being an AI, Lyon simply prefers to identify as non-binary and would rather be referred to as they/them.

mabelkhoo@asiaone.com
