Your closest friends think they know you. But a new study suggests that ChatGPT and Claude might understand your personality more accurately than they do, simply by reading your everyday thoughts.
Researchers at the University of Michigan gave AI systems access to people's own words — video diary entries, recorded musings, casual reflections — and asked the systems to answer personality questions the way each person would. Across 160 participants, the results were striking: the AI's assessments matched how people rated themselves, and often aligned better than ratings from friends or family members who'd known them for years.
"We were taken aback by just how strong these associations were," said Aidan Wright, the psychology professor who led the study. The gap in accuracy between the AI systems and older text-analysis methods was enormous. This isn't about AI getting lucky: the personality ratings the systems generated could predict real aspects of people's lives — their stress levels, social behaviors, emotional patterns, even whether they'd sought mental health treatment.
What's happening here is less mystical than it sounds. Personality doesn't hide. It bleeds into how we speak, what we choose to say, the metaphors we reach for, the pace of our thoughts. We do this unconsciously all the time. A friend might miss it because they're living inside the relationship, caught in habit and assumption. An AI reading your raw words sees the pattern from the outside.
"Personality naturally shows up in our everyday thoughts and stories — even when we're not trying to describe ourselves," Wright explained. The difference is that generative AI can now process this signal at scale and speed in a way that older algorithms simply couldn't.
Chandra Sripada, a psychiatry professor at Michigan, notes that this validates something psychologists have long suspected: language carries deep clues about who we are. But the speed and accuracy are new. What once required hours of careful analysis by a trained clinician can now happen in seconds.
There are still open questions. The study relied on self-reported personality tests as the ground truth — so we don't yet know if AI could eventually outperform even your own self-assessment. Researchers also haven't fully mapped whether AI and humans are reading the same signals or arriving at accurate conclusions through completely different routes.
But the implications are already shifting. In clinical settings, this could mean faster screening for mental health concerns. In research, it opens new ways to study how personality shapes behavior. And in the everyday sense, it's a reminder: you're more legible than you think. Not just to people who know you, but to systems that can read between every line you write.










