


When we ask ChatGPT for advice, a caption, or even emotional support, it often gives us fast, confident answers. That confidence can feel soothing, especially in a world full of uncertainty and information overload.
Psychologically, this mirrors our need for cognitive closure, the desire for quick, clear answers that reduce ambiguity. Just as we sometimes turn to authority figures for reassurance, we may now turn to AI for instant certainty. But like any coping mechanism, overreliance can mask our discomfort with not knowing. Example: A person feeling anxious about a career decision may keep asking AI for reassurance instead of tolerating the ambiguity that true reflection requires.
AI models like ChatGPT sound intelligent and unbiased, which feeds into what psychologists call automation bias: the tendency to trust decisions made by machines over our own judgment.
We assume objectivity because a machine doesn’t “feel”. Yet, in doing so, we outsource not just decision-making, but also the emotional labour of thinking: the self-doubt, curiosity, and complexity that shape human growth. Example: A student may accept an AI-generated essay summary as the “truth” rather than critically reading and forming their own view, reinforcing passive learning.
Our reliance on AI also reflects a deeper psychological need: validation. Just like social media likes and comments, AI’s polite tone and affirmation (“You are correct!”) trigger a small dopamine hit. We mistake responsiveness for understanding. In a way, ChatGPT mirrors the human desire to be agreed with, a pattern psychologists call confirmation bias: seeking information that supports our beliefs rather than challenges them.
Example: When ChatGPT agrees with a user’s wrong math answer or opinion, it mimics the social validation loop we unconsciously crave in real conversations.
While AI simplifies tasks, it cannot replace the reflective and relational depth that comes from human-to-human interaction. Our dependence on it reveals not just a technological shift, but an emotional one: a subtle avoidance of the discomfort that comes with genuine dialogue, difference, and self-reflection. Therapy, in contrast, invites us to sit with that discomfort.
As we delegate more thinking to machines, are we also delegating parts of our humanity: curiosity, doubt, and growth? Perhaps it’s not about rejecting AI, but remembering that true understanding still begins within us.