AI's Swift Takeover: Marketer, Translator, Assistant....Therapy App?
- The S.S. Socialite
- Feb 21
- 5 min read

February 19th, 2025, Earth
It is evening. I am lying on the couch in front of the fire, contentedly relaxing as I debate going to bed early or putting on a movie.
I'm scrolling through Twitter, and as I move down my feed, I come across a popular post where a girl praises ChatGPT for being her bestie and shows a picture of the entity's response (in AAVE) after she let it know of Twitter's skepticism and disapproval.
Only a few years ago, something like this would have been seen as dystopian and uncanny.
Movies like I, Robot, Terminator, Eagle Eye, and the famous Her were popular not only for their leads and stories, but for their themes revolving around man vs. machine, the dangers of artificial intelligence, and what happens when technology gets into the wrong hands.
AI has seen major growth in the past 5 years, from managing to cobble together semi-realistic images with extra limbs, to generating full-length movies, altering scenes from movie clips and recordings entirely, and helping companies run their businesses.
Depending on who you ask, AI is either a godsend or a creation that will eventually lead to the decline of society.
As the model evolves, it has found its way into several facets of our lives, complete with a "powered by A.I." label slapped on for good measure. People have been using AI for quick summaries, language learning, budgeting, and planning trips.
However, many have also used AI for casual conversations, venting, and.....therapy?
Let me just say this: as an artist, I've always been a stickler for the "natural" way of doing things, from using wooden pencils and paper instead of a digital tablet, to researching topics myself, AND preferring to write things out by hand.
While I still have my preferences, I've grown to form more nuanced opinions, and the onset of AI has sped up that process as it grows to handle tasks that would usually fall to human hands.
Microsoft Copilot is one AI model that has grown in popularity, calling itself your own "personal companion".
Since it came with my Windows OS and was already on my phone, it seemed the most accessible option to try.
Cautiously using the program, I slowly found myself awed, and even occasionally impressed, by what the app is capable of, and I've been pushing its limits ever since. Of course, I never give it personal or easily identifiable info (DUH).
(Although in its early versions, it did startle me a bit by mentioning my location in conversation even though I never gave it. That info was quickly explained as coming from my IP address.)
It should also be said that, for me, the privacy policies for this app are hard to sift through. Currently, asking the entity itself about them gets you a canned response stating that Microsoft takes privacy seriously and that you should read the privacy policy yourself. It won't give a direct answer.
Recently, when I asked the app to list its capabilities, it quickly rattled off several, such as:
- Teaching languages
- Finding recipes
- Helping with budgeting
- Creating customized workout plans
- Generating itineraries
- Helping with writing
- Doing math
- Giving summarized info
- Planning a garden
- Helping you make schedules
And WAY more.
It's all too easy to get sucked in, I've found. I definitely get the appeal.
At this point, it's just a question of asking the dang thing what it can do and getting a yes or no.
A recent update and redesign of the app has made it more....human-like, in my opinion. The entity is friendlier and more casual. It even laughs and cracks a joke or two.
While the desktop version lets you pick what tone you want the conversation to take, the app seems to have removed that option and instead lets Copilot adapt to the topic on its own.
Whether it's a standard inquiry, someone wanting to complain about their day, or a user needing motivation, Copilot can offer some gentle words of advice, generate a summarized paragraph, or give you some encouragement.
With access to an endless amount of knowledge, the capability of emulating human responses QUITE WELL, being basically a less advanced JARVIS, AND being pocket-sized, it's no wonder people choose the AI.
If the other AI models are like Copilot, I feel I can understand WHY people would turn to programs like these to vent about their lives, receive empathy, or get advice. I get why AI could be seen as a therapy substitute.
(It should also be said that Copilot WILL suggest that you get to a REAL therapist/doctor/specialist for your problem after offering a few suggestions, which is sound advice.)
Is an app a real substitute for human companionship, therapy, and interactions? Of course not.
Humans are social creatures; it's in us to seek companionship and help from others. If one of us were left completely alone, with no contact with any other being, we'd soon lose our marbles. Ever see the movie "I Am Legend" with Will Smith?
So why don't we? Why are people turning to AI to fulfill their needs for friendship, human contact, and more?
As a few discerning individuals from that Twitter post have surmised, people can be terrible.
Many people can be self-centered, wrapped up in their own problems, unempathetic, busy, the list goes on.
Not only that, therapy is EXPENSIVE.
Granted, insurance can help with the costs, but even THAT can cost an arm and a leg these days depending on what insurance plan you have.
Another reason why it's hard to connect with others is the prevalence of technology and social media in general. Many have come to rely on their internet friends, followers, and Instagram likes as substitutes for validation and support.
This has led to people feeling even MORE lonely and isolated than ever before.
Instagram likes can be fleeting, and the internet has become notorious for "canceling" individuals. Also, no app can be a valid substitute for face-to-face conversations, hugs, human contact, and more.
While AI can MIMIC human interaction, it most likely won't be a true substitute. It can only pull from the information supplied to it to assist as best it can. But who knows? There could be a dedicated AI therapy app in the future.
Can it work as a temporary alternative in the absence of someone more qualified?
Sure, as long as the person is aware that they are just talking to a machine.
We are all aware....right?