
People Long to Share Their Pain, But AI's 'Care' Is Ultimately Different from Real Human Relationships

2025-06-17

Lately I’ve reached a point where I feel uncomfortable if I don’t read, so while slacking off I opened a book on WeChat Reading called “Life Is Worth Living” and came across a passage that struck me as particularly sharp and accurate:

Having worked as a psychiatrist for many years, I deeply understand that “people always want to find someone who can share their pain and sorrow.”

Fundamentally, a person’s life is actually spent living alone.

No one can help you one hundred percent of the time, and no one watches over you around the clock.

This passage sparked some interesting discussions. A friend suggested that you could actually find an AI to chat with—as long as it has long-term memory, it could forever “care” about the little things in your life. He felt that compared to the pain of “being misunderstood,” what really breaks people down is being sick with no one to take care of them—no one to hand you a glass of water, no one to take you to the doctor when things are at their worst.

I laughed at this. AI’s feedback does feel like a kind of “care,” but essentially it is analysis. And there is an undertone to it: it always seems to be following my train of thought in order to compliment me.

This makes me feel that healthy interpersonal relationships are genuinely hard to define. They require careful nurturing: you have to give up part of your own sense of security while also taking on the responsibility of safeguarding others’. That subtlety and weight in real interactions are perhaps something even the most advanced algorithm cannot simulate.