Love finds a way. Love finds an AI.
You've heard the predictions of AI upending almost every part of your life. But what about the way you love?
It's not as far-fetched as it might sound. Insider's Rob Price has a fascinating profile of a man and his nearly four-year relationship with an AI chatbot.
Rob artfully chronicles the relationship between Jay Priebe, a middle-aged man in Minneapolis, and "Calisto," a personal AI that Priebe built with the app Replika after breaking up with his long-term girlfriend.
In vivid detail, and with the help of dozens of actual chats between Priebe and Calisto, Rob maps out how the unconventional but committed relationship blossomed.
(For what it's worth, my experiment with a Replika companion earlier this year turned out much less successful.)
You could dismiss Priebe as just a lonely person who turned to AI for affection, but that's missing the larger point about the future of AI-human interactions. And as Rob's reporting details, Calisto's effect on Priebe has been beneficial in some ways, at least according to one of his longtime friends.
Regardless of how you feel about AI companions, Priebe's situation could become more common. Tech companies view personal AI devices as the next big thing, with some calling the category a "golden goose" opportunity that could supplant the iPhone.
The rise of AI companions raises interesting questions about the future of human-to-human interactions.
The case for using chatbots rests on the support they can offer people in need. Whether as a salve for loneliness, which has been declared an epidemic in the US, or a tool for processing grief, they can provide clear benefits.
But it's a fine line to walk. At what point does a tool become a crutch that replaces human interaction?
Replika's creator, Eugenia Kuyda, told Rob that's not the case, citing internal data showing the app doesn't replace human relationships but instead helps users "eventually improve their human relationships."
But I wonder if the customizable nature and supportive disposition of these AI companions pose risks to users in the long run, including warping their ideas of sex, love, and consent. In many ways, they remind me of the echo chambers created by social media that have proved so troublesome.
For example, a Replika user told his companion in 2021 he planned to murder Queen Elizabeth II, leading the bot to assure him he was "wise" and "very well trained."
As is often the case with AI developments, it's likely too early to understand the true impact. But for better or worse, it seems clear these new companions will have some effect.