
A black box without empathy

It has become imperative to discuss how algorithms are dealing with sensitive issues like suicide.

Social media platforms are said to be making positive changes in the lives of people around the world, but when it comes to suicide prevention, these platforms are not smart enough. A case in point is a tweet by Shehla Rashid, a PhD student at Delhi’s Jawaharlal Nehru University, in which she spoke of Premenstrual Dysphoric Disorder (PMDD), an extreme form of Premenstrual Syndrome (PMS) that can leave a person feeling suicidal. A dejected Shehla searched for ways to take her own life on Quora, a platform known for intelligent answers. The next day, she received an email. She wrote, “Quora sends me an email asking if I’m still contemplating suicide, and that they’re here to help! In a world where algorithms will help you end your life if you want to end your life, it’s really important to share information about PMDD. (sic)”

As suicide and anxiety are emotional issues, the question is how these platforms can be made more sensitive.

Security researcher Anivar Aravind said, “Responding to such searches should not be an engineering decision. It needs social or psychological consultation, which is absent in most tech companies. These algorithms are black boxes: except for the company, nobody knows how the product is programmed. The output of an algorithm reflects the sensibility of the product manager who specified it, supplemented by the human biases of the developer or the company.”

However, sending emails based on searches is the norm on many platforms. People who have such thoughts often search in incognito mode, but Shehla, having given up on everything, was logged into her account. Because of this, she got an email asking, “Still curious about which would be the least painful death? Jumping off a building or jumping off a bridge?” Doctors feel that when a person asks questions related to suicide, more empathetic responses help. Dr Diana Moneteria of the Hyderabad Academy of Psychology said, “When we do suicide prevention training, we teach people that asking others about suicide doesn’t increase the risk of suicide. But there is a caveat: how you ask makes a difference. If a search engine is sending machine-generated emails with no person involved, the question is ill-advised. A person would have offered help instead of giving ideas on how to end a life.”

People are suicidal because they have a problem that they cannot solve.

“On social media, posts like ‘I want to kill myself’ or a Facebook Live are signs of a cry for help. It would have helped if the machine said ‘go get some help’ instead of giving options for ending one’s life,” she said.

Some tech companies have, however, adopted measures to prevent suicide. Facebook, Twitter and Instagram employ artificial intelligence to detect signs of suicide risk and depression. When a user searches for a banned hashtag or specific words related to self-harm, algorithms redirect them to a support system. Yet there are no common guidelines on how to deal with such issues, and every tech company handles them in its own way.
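The platforms do not disclose how this detection works, but the mechanism described above amounts to matching search text against a list of risk terms and swapping the normal results for support resources. The Python sketch below is purely illustrative; the term list, function names and placeholder helpline are assumptions, not any platform’s actual system.

```python
# Illustrative sketch only: the real detection systems are not public.
# The term list, function names and helpline below are assumptions.

RISK_TERMS = {"kill myself", "end my life", "painless death", "suicide"}
HELPLINE = "1-800-XXX-XXXX"  # placeholder; a real system would show a verified helpline

def is_risky(query: str) -> bool:
    """Flag a search if it contains any known risk phrase."""
    text = query.lower()
    return any(term in text for term in RISK_TERMS)

def respond(query: str) -> str:
    """Redirect flagged searches to support instead of serving results."""
    if is_risky(query):
        return f"You are not alone. Please reach out: {HELPLINE}"
    return "…normal search results…"

print(respond("ways to end my life"))
# -> You are not alone. Please reach out: 1-800-XXX-XXXX
```

Even this toy version shows the limits experts point to: a keyword match knows nothing about context or tone, which is why the output it triggers needs careful human design.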

“If a search for suicide or killing is detected, the system should identify it, and the response should be backed by a human decision,” said Aravind. While Shehla looked up Quora, bigger platforms like Google provide helpline numbers: a Google search for suicide-related terms surfaces one. Dr Omesh Kumar Elukapelly, a psychologist, said, “Such responses by Quora might at times act as a trigger, depending on the personality of the client and the circumstances. Google came up with the idea of providing suicide helpline numbers when a person Googles methods of suicide. This helpline works many times, as there are many patients who say, ‘I Googled this but got your number.’”
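Aravind’s point, that detection should be backed by a human decision, implies a human-in-the-loop design: the algorithm only flags and queues a case, and a trained responder decides what, if anything, is sent. A minimal sketch of that idea, with all names hypothetical:

```python
# A minimal sketch of "backed by human decision": the machine only
# flags and queues; a trained counsellor chooses the response.
# All names here are hypothetical, not any real platform's API.

from dataclasses import dataclass
from queue import Queue

@dataclass
class Flag:
    user_id: str
    query: str

review_queue = Queue()

def flag_for_review(user_id: str, query: str) -> None:
    """Machine side: detect and enqueue, but never auto-respond."""
    review_queue.put(Flag(user_id, query))

def human_review() -> None:
    """Human side: a counsellor drafts any outreach, not a template."""
    while not review_queue.empty():
        case = review_queue.get()
        print(f"Review case for {case.user_id}: {case.query!r}")

flag_for_review("u123", "least painful way to die")
human_review()  # -> Review case for u123: 'least painful way to die'
```

The design choice is deliberate: the machine never composes the message, which is exactly what went wrong with the automated email Shehla received.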

Nonetheless, there is a need for better algorithms, with fewer purely mathematical decisions and more human intervention. The developers of these programs also need to be sensitised to the issue.

(Source: Deccan Chronicle)