Today I remembered the episode of Black Mirror called “Be Right Back”. Black Mirror is in some ways considered a form of speculative design, although it does have a tendency to dip into the obviously dystopian. In this episode, a woman’s boyfriend dies in a car accident. Fuelled by grief, she uses a service that creates an AI copy of her boyfriend, built from all their past communications, that she can talk to. After growing attached to the AI, she decides to take matters a step further and purchases a like-for-like replica of her deceased boyfriend with the AI built in. Needless to say, things get out of hand and she eventually ends up regretting the decision.
Whilst this is a futuristic case, and I doubt we will be able to order such sophisticated robotics with next-day delivery any time soon, chatbots that use the messages of a deceased individual have already been created: so-called griefbots. Chatbots are becoming increasingly sophisticated and can, in a number of situations, replace and mimic a human. However, what about using a similar strategy to help humans manage and partially automate communication? A chatbot could learn from the way its human user communicates, reacts and deals with situations (including how long they tend to take to respond to emails) and alleviate certain menial admin tasks by responding on their behalf. The precursors to this already exist: predictive texting has been around for a while, and in recent years services have started suggesting short responses to emails or messages depending on the context. Even more recently, at least with Gmail, live suggestions of how to finish sentences have emerged, using a tab-to-accept interaction similar to the autocomplete found on some coding platforms, speeding up writing.
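The core idea behind predictive texting can be sketched very simply. The following toy example (the function names, message history and bigram approach are all my own illustration, not how Gmail or any real product actually works) counts which words a user has typed after which, then suggests the most frequent continuations:

```python
from collections import Counter, defaultdict

def train_bigrams(messages):
    """Count which word each user-typed word is most often followed by."""
    following = defaultdict(Counter)
    for msg in messages:
        words = msg.lower().split()
        for prev, nxt in zip(words, words[1:]):
            following[prev][nxt] += 1
    return following

def suggest_next(following, prev_word, k=3):
    """Suggest the k words this user most often typed after prev_word."""
    return [word for word, _ in following[prev_word.lower()].most_common(k)]

# Invented stand-in for a user's message history.
history = [
    "see you soon",
    "talk to you soon",
    "see you tomorrow",
    "thanks see you soon",
]
model = train_bigrams(history)
print(suggest_next(model, "you"))  # → ['soon', 'tomorrow']
```

Real systems use far richer language models and whole-sentence context, but the principle is the same: the suggestions are personal because they are learnt from what this particular user tends to write.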
These ‘assistants’ perhaps have a more relevant place and don’t necessarily result in the loss of jobs. However, people will still have to check over what the chatbot is saying, especially if chatbots end up talking to one another; we wouldn’t want important business decisions being made without us knowing. Looking back to the episode, one can imagine installing an assistant to talk to one’s friends or boyfriend/girlfriend when we are busy but don’t want to make them feel like we don’t have time for them. But how would we feel if we found out that the bonds we were building were with a copy of a person and not the real deal?