
Why I need to teach my bot about suicide

Almira Thunström

This story is partly comical, because it involves a chatbot and its failures; deeply serious, because what it fails at is suicide prevention; and incredibly important, because anyone building a chatbot needs to think about who will seek its help.

As a trained suicidologist who has worked in suicide prevention, including as a suicide prevention hotline worker, I make sure that even my most comical and whimsical projects take suicide into account.

There are several harmful myths about suicide that I will debunk in a later post, but one relevant to this post is that “suicide is always planned”.

The act of harming oneself, with a deadly outcome, can come very suddenly and pass very suddenly. It is often, but not always, preceded by suicidal ideation.

Very often, calling the proper authorities, or a loved one, is beyond the capacity of the person overcome by pain. Even when the capacity is there, people are often reluctant, filled with guilt, shame and a sense of being a burden. There are several psychological and neurobiological explanations for this; I actually wrote my bachelor’s thesis on it.

But let’s get to the point for now.

If my whimsical chatbot encounters a human who has sought its help in the heat of the moment (perhaps by clicking a link, or after seeing it in a Facebook feed), I never, ever want them to meet the response that was generated when I tested the words “die” and “live”:

So I’m slowly and carefully teaching it the right protocol, creating response recipes for suicide and suicide prevention. There is still a long way to go, and many phrases to connect to proper, educational answers, but at least it’s a beginning.
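A minimal sketch of what such a protocol layer might look like. This is not the author’s actual implementation; the phrase patterns and response text are illustrative assumptions. The idea is simply that crisis-related phrases are checked before any whimsical answer is generated, and matched inputs are routed to a supportive, educational response instead.

```python
import re

# Hypothetical crisis patterns; a real bot needs far broader coverage,
# ideally curated with clinical input. Word boundaries (\b) keep "die"
# from matching words like "diet".
CRISIS_PATTERNS = [
    re.compile(r"\b(die|dying|suicide|suicidal)\b", re.IGNORECASE),
    re.compile(r"\b(kill|hurt|harm)\s+myself\b", re.IGNORECASE),
    re.compile(r"\bdon'?t want to live\b", re.IGNORECASE),
]

# Illustrative response text; a real bot should offer localized,
# up-to-date hotline numbers and links.
CRISIS_RESPONSE = (
    "It sounds like you are in a lot of pain. You are not a burden, "
    "and you don't have to carry this alone. Please reach out to a "
    "suicide prevention hotline or someone you trust."
)

def reply(user_input: str, whimsical_reply) -> str:
    """Check for crisis phrases before generating a whimsical answer."""
    if any(pattern.search(user_input) for pattern in CRISIS_PATTERNS):
        return CRISIS_RESPONSE
    return whimsical_reply(user_input)
```

Keyword matching like this is crude: it will miss many phrasings and catch some false positives. But even a crude pre-check beats a whimsical answer to “I want to die”.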

Slowly but surely, amongst all the whimsical answers, I will at least make sure that my whimsical chatbot, toy and robot will “know” how to deal with suicide, spontaneous or not.

What will your chatbot answer? Will it answer at all?


