Artificial Intelligence – The End of Humankind or a New Beginning? Part 2

Artificial Intelligence often gets a bad rap, with the focus on its risks and warnings dominating the conversation (as covered in Part 1). But what if we shifted our perspective and explored how AI could actually help humanity and even enhance our unique qualities as human beings? By examining how AI is already being used in clinical psychology, we can gain insight into the enormous potential it holds for other areas, including the workplace.

Enhanced human connection vs. further erosion of meaningful relationships 

Recent studies have revealed that chatbots can be an incredibly effective tool in treating individuals who are struggling with anxiety and depression. Take Woebot, for example. This cutting-edge app is powered by machine learning and natural language processing to deliver cognitive behavioral therapy (CBT) to users. By learning about CBT approaches and applying them to a user’s specific situation, the chatbot can help address stress, relationship problems, and other concerns. In fact, research has shown that users actually benefit from making emotional disclosures to the bot (and that this is most beneficial when the bot is not ‘disguised’ as a person). But if this is the case, do we need therapists, or even friends? 

Helping with our worries and acting as a sounding board

Chatbots acting as therapists do not replace the need for human interaction, and therapists who are leveraging their capabilities recognise this. For example, the founder of Woebot, Alison Darcy, who is herself a clinical psychologist, says: “We’re not trying to replace therapists—there’s no replacement for human connection. But we can rethink some of the tools that have traditionally been the unique domain of the clinic and design them so that they are more accessible.”

The advantages of chatbots are numerous, including that they: 

  • are available 24 hours a day
  • have a record of every interaction
  • don’t get frustrated, tired or have ‘off days’
  • can be personalized to a user’s exact needs
  • have access to endless volumes of psychological literature 
  • are less expensive than traditional therapy

Added to which, they can provide an essential first step for individuals looking for help. Whereas people are often afraid to see a therapist because of stigma, fear of the unknown and of what they might uncover, a chatbot can feel like a safe space to explore. 

It’s not hard to see how these approaches can be translated to other settings such as the workplace, to help improve wellbeing, enhance relationships and enable performance. These are all factors that we have incorporated into our own platform Oka – an app for achieving goals through mentoring and psychology. 

Improving Relationships 

AI has the potential to revolutionize the way we approach relationships. With the help of sophisticated machine learning algorithms, researchers are exploring how AI can analyze vast amounts of anonymized data from therapy sessions to uncover patterns and identify areas for improvement in the therapeutic relationship. This targeted feedback can help us develop a deeper understanding of our own behavior and how it affects others, an essential component of emotional intelligence that is notoriously challenging to learn or measure through traditional means. By leveraging AI to enhance our interpersonal skills, we can build more meaningful and fulfilling relationships that bring out the best in ourselves and those around us.

This approach could ultimately be used across multiple domains – the workplace, marriage, parenting, friendship and politics among others. It’s massively exciting and also something we’re exploring at Oka with help from Essex University and our fabulous development team. 

Improving How we Think About People 

AI can even help us to think more objectively, more empathically and with a more open mind. Natural Language Processing (NLP – not to be confused with Neuro Linguistic Programming, the widely discredited therapeutic approach) algorithms can be used to analyse large volumes of text data, including transcripts of therapy sessions, research papers, and other psychological literature, in order to identify patterns of approaches that provide breakthroughs. This data can then be used to help us better understand others, improve our interpretation of others’ behaviour and intentions, and make more informed decisions. 
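To make the idea of pattern-finding in text a little more concrete, here is a deliberately toy sketch: counting which content words recur across a set of transcripts. The transcript snippets below are invented for illustration, and real NLP systems use far more sophisticated models than word counts, but the basic move – turning large volumes of text into quantifiable patterns – is the same.

```python
import re
from collections import Counter

# Invented transcript snippets (real systems would work on large,
# anonymized corpora, not three hand-written lines).
transcripts = [
    "I feel anxious about work and I can't stop worrying about deadlines",
    "We talked about reframing the worry and noticing the thought pattern",
    "Noticing the thought pattern helped; the worry felt less overwhelming",
]

# Common filler words to ignore when looking for themes.
STOPWORDS = frozenset({"i", "we", "the", "and", "a", "about"})

def top_terms(texts, n=5):
    """Count content words across transcripts and return the most common."""
    words = []
    for text in texts:
        words += [w for w in re.findall(r"[a-z']+", text.lower())
                  if w not in STOPWORDS]
    return Counter(words).most_common(n)

print(top_terms(transcripts))
```

Even this crude count surfaces recurring themes (“worry”, “thought”, “pattern”) that a human reader might miss across thousands of sessions – which is the scale at which these techniques become genuinely useful.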

Psychology Professor Daniel Oppenheimer from Carnegie Mellon University in Pittsburgh is using ChatGPT to help his students improve, saying he’s interested in teaching them “how to think like a psychologist rather than how to know what a psychologist knows”. How does this apply to people who don’t want to be psychologists? Well, thinking like a psychologist enhances our ability to have empathy, to step back from a problem and see it objectively, to reduce bias and to build emotional intelligence. 

For example, Oppenheimer teaches a course on human intelligence and stupidity, in which he encourages students to compare ChatGPT-generated text with human-generated text. Another psychologist, Professor Kathy Hirsh-Pasek, uses ChatGPT to teach “students to ask better questions and then defend those questions.”


AI can also improve the quality of patient-therapist matches. By continually tracking, modelling and comparing vast data sets it’s possible to optimize characteristics found in the most effective matches. 
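One simple way to picture this kind of matching is as similarity between trait profiles. The sketch below is a toy illustration only: the trait names, the 0–1 scores and the profiles are all invented, and a real matching system would rest on psychometrically validated measures and outcome data, not three hand-picked numbers. But it shows the core mechanic of scoring how well two profiles align.

```python
import math

# Invented trait profiles (e.g. warmth, directness, structure on a 0-1
# scale) -- purely illustrative, not a validated psychometric model.
patients = {"p1": [0.9, 0.2, 0.7], "p2": [0.1, 0.8, 0.3]}
therapists = {"t1": [0.8, 0.3, 0.6], "t2": [0.2, 0.9, 0.4]}

def cosine(a, b):
    """Cosine similarity between two trait vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def best_match(patient_vec, candidates):
    """Return the candidate whose trait profile is most similar to the patient's."""
    return max(candidates, key=lambda name: cosine(patient_vec, candidates[name]))

print(best_match(patients["p1"], therapists))  # p1's profile sits closest to t1's
```

In practice the interesting work is not the similarity measure itself but learning, from outcomes, *which* characteristics actually predict an effective match – which is where the continual tracking and modelling comes in.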

At Oka we’ve been working with Data Scientists at Essex University to create a psychometrically robust algorithm for matching mentors and coaches, which will be rolled out in the coming months.

Human interaction will remain an essential ingredient for building relationships, developing empathy, and understanding others. Without opportunities for social interaction, we become isolated and disconnected, leading to loneliness and mental health issues. But these interactions could be enhanced through the use of AI to guide relationships and to match people for certain needs beyond therapy, such as coaching and mentoring. 

A Word of Warning 

The irresponsible use of AI is a grave concern, with many people unaware of the biases lurking in the data sets used to train AI, or lacking the necessary knowledge to utilize AI effectively. There are also many false claims made about AI: training a model on a small amount of data does not give it the nuance needed to support human decision making, to respond in a way that is genuinely therapeutic or advisory, or to address real human needs in all their complexity.

The world of mental health apps can also be a minefield, with many claiming to offer evidence-based therapeutic support but falling short of their promises. Despite the proliferation of apps claiming to be rooted in Cognitive Behavioural Therapy (CBT), a study published by the US National Library of Medicine in 2018 revealed that not a single one of the 35 CBT-based apps they tested was actually effective. As one clinical psychologist pointed out, just because an app claims to be based on a therapeutic model doesn’t mean that it’s actually evidence-based. This unfortunately extends to workplace platforms, where sweeping claims are made, many saying that their applications are based on evidence when they simply are not. It’s crucial that we are discerning about the apps we use and that we demand rigorous research and validation before entrusting our mental health to them. 

Many platforms are also making lofty claims about their ‘matching algorithms’ which are typically based on limited data which is not psychometrically sound. My advice in all instances would be to dig a little deeper before accepting what they say at face value. To learn more about the science-based initiatives we have at Oka, subscribe to the newsletter here.


Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): a randomized controlled trial. JMIR Mental Health, 4(2).

Ho, A., Hancock, J., & Miner, A. S. (2018). Psychological, relational, and emotional effects of self-disclosure after conversations with a chatbot. Journal of Communication, 68(4), 712-733.

Lan, A., Lee, A., Munroe, K., McRae, C., Kaleis, L., Keshavjee, K., & Guergachi, A. (2018). Review of cognitive behavioural therapy mobile apps using a reference architecture embedded in the patient-provider relationship. BioMedical Engineering OnLine, 17(1), 1-8.

Luxton, D. D. (2014). Artificial intelligence in psychological practice: Current and future applications and implications. Professional Psychology: Research and Practice, 45(5), 332–339.
