The Daily Pennsylvanian is a student-run nonprofit.
A study by the Penn Positive Psychology Center showed how ChatGPT-4 can yield therapeutic benefits through the generation of personal narratives. Credit: Abhiram Juvvadi

A study conducted by the Positive Psychology Center explores how personal narratives generated by ChatGPT-4 can have therapeutic benefits.

The project, which has completed its first two phases, uses large language models to craft personal narratives, unlocking potential applications in both self-exploration and therapeutic treatment. In the study's third stage, research manager Abigail Blyler seeks to further assess the benefits to clients, therapists, and coaches.

Working with Martin Seligman at the Positive Psychology Center, Blyler developed a model that crafts personalized narratives from a compilation of stream-of-consciousness thoughts. With participants reporting a high degree of accuracy in the language model's responses, the approach points to AI's potential to augment and streamline therapists' work.

According to Blyler, the study was inspired by a theory from Aaron "Tim" Beck — founder of the Center for Cognitive Therapy at the Perelman School of Medicine — which linked automatic, negative thoughts to depression.

“We didn't actually intend to find anything. We're just kind of exploring and seeing what might come about, though we did have a hunch,” Blyler said.

In the initial phase, participants were asked to tap into their inner monologues by documenting 50 automatic thoughts over 48 hours. These thoughts, coupled with selected demographic information, were then entered into ChatGPT-4.

With these inputs, ChatGPT-4 produced personalized narratives for all 26 participants. Ninety-six percent rated the narratives as "completely accurate" or "mostly accurate," Blyler said.

Nearly 96% of participants indicated that they learned something surprising about themselves through these narratives, suggesting the potential application of this approach in coaching and therapy.

In the study's second phase, Blyler explored the practical application of these narratives within therapeutic contexts. Researchers directed ChatGPT-4 to function as a virtual coach, relying on five personal narratives produced during phase one to form a detailed treatment outline and a tailored treatment roadmap.

In future iterations of the study, Blyler hopes to enable therapists and clients to focus on their core objectives and goals without the burden of repetitive storytelling sessions. With the success of the initial results, Blyler plans to collaborate with a coaching company to assess her research further.

Blyler's vision revolves around enhancing clients' well-being and streamlining the therapeutic process by using AI as a complementary resource for therapists and coaches rather than a substitute.

“I think it's worth addressing the elephant in the room,” Blyler said. “There's a lot of fear and concern around AI — from job replacement to privacy and ethical concerns.”

Through her studies, Blyler has observed that AI — even on a small scale — can challenge ingrained self-perceptions. It offers fresh perspectives by presenting individuals with narratives highlighting their positive attributes, sometimes revealing aspects of themselves they hadn't fully appreciated. This opens up new avenues for personal growth and self-reflection.

“Questions have arisen about whether this is a sort of Barnum effect — a horoscope that we are seeing across different patients,” Blyler said.

Despite those concerns, Blyler believes that AI-generated narratives exhibit a remarkable level of nuance and variation tailored to each individual. While the narratives may have a general quality that allows participants to relate to them, Blyler said that the crucial focus lies not in the specifics of the narratives but in their real-world impact in helping people gain self-insight.

Blyler said that positive psychology is uniquely poised to tackle these questions, especially in the context of AI. When narratives are co-created, AI becomes a valuable aid for narrative therapy in the hands of therapists, coaches, and individuals.

“AI can wield a double-edged sword; incoherent or contaminated narratives can diminish well-being, affecting self-esteem negatively, while positive ones contribute to a healthy relationship with one's self," Blyler said.