When we think about artificial intelligence (AI), we’re often reminded of its dark side. Futuristic robots taking over the planet, technological warfare that doesn’t play by the rules of humanity – that kind of thing. But what if AI could be used for good? And what if we don’t have to wait?
That’s exactly what Prabod Rathnayaka, a PhD candidate at La Trobe’s Centre for Data Analytics and Cognition (CDAC), thought. He was reading an article online by the Australian Red Cross about the role of technology in the future of humanity. It sparked an idea.
“I was already working on developing AI algorithms for chatbots,” he says. “I thought, how can I use my skills to contribute to humanity, through ‘public interest technology’? The two trains of thought came together.”
Prabod decided to develop a human-centric Artificial Intelligence chatbot to help fellow students who are experiencing mental health issues such as anxiety and depression.
The chatbot is named “Bunji”, in acknowledgement of the Traditional Custodians of the land on which we reside. Bunji is a term for “a mate, a close friend, a kinsman” in the languages of the Warlpiri people and other groups in the Northern Territory and northern Queensland. Prabod and his team worked with the University’s Indigenous Strategy and Education team and consulted with Traditional Owners to decide on the name.
Using AI for good: How Bunji works
Bunji is based on a psychological treatment called Behavioural Activation (BA). Supported by a large evidence base of clinical trials and studies, BA is highly effective in making life meaningful and pleasurable, by breaking the vicious cycle of inactivity that precipitates mental health issues.
BA aims to reverse the effects of a lack of positive reinforcement by increasing engagement in activities associated with pleasure or positivity, and decreasing engagement in activities that sustain depression or increase the risk of it.
However, in its traditional clinical format, BA is a rigid treatment that holds little appeal for continuous use by most of society, let alone students.
This is where the human-centric AI algorithms and Natural Language Understanding (NLU) models developed by Prabod and the CDAC team are transformative: they deliver an interactive and engaging BA experience through Bunji.
Throughout the BA process, Bunji’s human-centric AI initiates a conversational check-in, detects the person’s emotions and responds empathetically, while proactively motivating them to schedule activities, complete them and reflect on the outcomes. Based on the outcomes of completed activities, Bunji then personalises future recommendations towards activities that person has found positive and enjoyable. The emotions expressed during each session with Bunji are available to the user in an intuitive visual format: over time, across activities and in summary. Bunji also provides a gratitude journal for recording and reflecting on positive events, as well as a curated selection of inspirational quotes and biographies that reinforce the practice of BA.
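The check-in, emotion-detection and personalisation loop described above can be sketched in a few lines of Python. Everything here is an illustrative assumption, not Bunji’s actual code: the real system uses trained NLU models rather than keyword matching, and the activity list, function names and 1–5 rating scale are invented for this example.

```python
# Illustrative sketch only: Bunji's real system uses NLU models and
# self-structuring AI. The keywords, activities and 1-5 ratings below
# are assumptions made for this example.

ACTIVITIES = ["short walk", "call a friend", "journal for 10 minutes"]

def detect_emotion(message: str) -> str:
    """Stand-in for an emotion classifier (simple keyword matching)."""
    lowered = message.lower()
    if any(word in lowered for word in ("sad", "tired", "anxious", "stressed")):
        return "negative"
    if any(word in lowered for word in ("happy", "great", "good")):
        return "positive"
    return "neutral"

def personalise(history: list[tuple[str, int]]) -> list[str]:
    """Rank activities by mean past enjoyment rating (1-5); untried ones rank last."""
    ratings: dict[str, list[int]] = {}
    for activity, rating in history:
        ratings.setdefault(activity, []).append(rating)

    def mean_rating(activity: str) -> float:
        scores = ratings.get(activity, [0])
        return sum(scores) / len(scores)

    return sorted(ACTIVITIES, key=mean_rating, reverse=True)

def check_in(message: str, history: list[tuple[str, int]]) -> tuple[str, str]:
    """One BA check-in: detect the emotion, then suggest the best-rated activity."""
    emotion = detect_emotion(message)
    suggestion = personalise(history)[0]
    return emotion, f"Thanks for sharing. How about trying: {suggestion}?"
```

Calling `check_in("I feel sad and tired", [("short walk", 5), ("call a friend", 2)])` would classify the message as negative and suggest the activity the user has rated highest so far, which is the personalisation idea in miniature.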
Privacy and confidentiality are paramount to Bunji, so all conversations are anonymous and transient. Bunji conforms to the AI ethics guidelines of privacy, transparency, robustness, and accountability.
Prabod emphasises that Bunji does not replace existing healthcare services but aims to be a constant companion that provides conversational support and helps to break the vicious cycle of inactivity through the practice of BA.
The design and development of Bunji was funded by a grant from the Transforming Human Societies RFA at La Trobe University.
Developing the app
Prabod developed the app alongside his PhD supervisors, Professor Damminda Alahakoon, Associate Professor Daswin De Silva and Professor Richard Gray.
“Richard’s expertise is in psychosocial interventions for people with mental health problems, and he has access to psychologists and mental health experts at Orygen Youth Health,” says Prabod.
“The type of AI that I develop in my research centre is brain-inspired and aims to mimic the human thought/emotions process. We call it self-structuring AI. So, when we have AI that thinks and feels like humans then it is quite effective and useful to address the Big Data challenges in mental health, healthcare or any other domain.”
Prabod hopes through Bunji, his AI knowledge and skills can be used for the good of humanity.
“I’d be ecstatic if this app can make even a small difference in the life of other students. This would be a brilliant outcome for myself and our team.
“AI has a bad rap these days, but used in a safe, transparent and trustworthy manner, AI can deliver immense social and personal value.”
Help make Bunji better
Prabod and his team would like your help to make Bunji better. Register your expression of interest to participate in this study.
You will be asked to download and interact with Bunji, then provide feedback on the experience through questionnaires and focus group discussions on how we can make Bunji better.
You can support Bunji and Prabod’s research by downloading the app using the following links.