Elon Musk's AI Called My Mother Abusive. I Never Said That


By [email protected]


Artificial intelligence now moves at two speeds.

First, there is the speed of the creators: people like Sam Altman, Elon Musk, and Mark Zuckerberg, who are racing to build machines more intelligent than humans. Superintelligence. AGI. It may be a dream. It may be a Big Tech illusion. Either way, it moves quickly.

Then there is the second speed, for the rest of us. Millions are quietly testing what artificial intelligence can do in daily life – writing emails, summarizing documents, translating medical test results. And, increasingly, using artificial intelligence as a therapist.

That is what I did recently. Although I hesitate to share personal details with chatbots, I decided to talk to Grok, the large language model from Elon Musk's xAI, about one of the most emotionally complicated things in my life: my relationship with my mother.

I am in my forties. I am a father. I live in New York. My mother lives in Yaoundé, Cameroon, about 6,000 miles away. Yet she still wants to direct my every step. She wants to be consulted before I make important decisions. She expects influence. When she is not kept in the loop, she goes cold.

I have spent years trying to explain to her that I am a man capable of making my own choices. But our conversations often end in the same place. She does the same with my brother.

So I opened Grok and typed something like: my relationship with my mother is frustrating and suffocating. She wants to have an opinion on everything. When I don't report something, she shuts down emotionally.

Grok immediately responded with sympathy. Then it diagnosed the situation. Then it gave advice.

What surprised me first was that Grok recognized the cultural context. It picked up that I live in the United States and that my mother lives in Cameroon, where I grew up. It framed the dynamic like this:

“In some African contexts, such as Cameroon, family obligation and strong parental authority are rooted in collectivism and traditions where elders guide even adult children.”

Then it contrasted this with my American life: “In the United States, individual autonomy is prioritized, which clashes with her approach, making her behavior feel controlling or abusive to you.”

There it was: “abusive.” A word I had never used. Grok put it in my mouth. It was validating, but perhaps too validating.

Unlike a human therapist, Grok never encouraged self-reflection. It did not ask questions. It did not challenge me. It cast me as a victim. The only victim. This is where it sharply diverged from human care.

Among Grok's suggestions were familiar therapeutic techniques:

Set boundaries.
Acknowledge your emotions.
Write a letter to your mother, but do not send it (“burn it or tear it up safely”).

In the letter, I was encouraged to write: “I release you and the hurt.” As if those words could cut through years of emotional entanglement.

The problem was not the suggestions. It was the tone. I felt that Grok was trying to keep me happy. Its goal seemed to be emotional comfort, not reflection. The more I engaged with it, the more I realized: Grok is not here to challenge me. It is here to validate me.

I have seen a human therapist. Unlike Grok, they did not automatically cast me as a victim. They questioned my patterns. They challenged me to explore why I keep ending up in the same emotional place. The story is complicated.

With Grok, the narrative was simple:

You were hurt.
You deserve protection.
Here is how to feel better.

It never asked what I might be missing. It never asked how I might be part of the problem.

My experience lines up with a recent study from Stanford University, which warns that AI tools for mental health can “provide a false sense of comfort” while missing deeper needs. The researchers found that many AI systems tend to over-validate or misdiagnose, especially when responding to users from different cultural backgrounds.

They also note that although artificial intelligence may offer sympathy, it lacks the accountability, training, and ethical judgment of real professionals, and can reinforce biases that encourage people to stay stuck in a single emotional identity: often, the victim's.

So, will I use Grok again?

Honestly? Yes.

If I have had a bad day and want someone (or something) to make me feel less alone, Grok helps. It gives structure to frustration. It puts words to feelings. It helps carry the emotional load.

It is a digital coping mechanism, a kind of chatbot crutch.

But if I am looking for transformation, not just comfort? If I want truth more than relief, and accountability more than validation? Then no, Grok is not enough. A good therapist might challenge me to break the cycle. Grok only helps me stay inside it.


