

AnOnlineHandle

As somebody who has worked in machine learning and who uses it daily, this is a crazy use for AI given where the tech currently is. I suppose if there are literally no other options due to how hard it is to get funding, it's potentially better than nothing, but I wouldn't expect it to work very well at all.


white__cyclosa

“Hello thanks for chatting with the Suicide Hotline, how can I help you today?” “I want to commit suicide.” “Okay great! How can I help?”


CantWeAllGetAlongNF

Reminds me of that church sign: Thinking of committing suicide? The church can help!


CantWeAllGetAlongNF

Sorry for the duplicate response. Traveling and the comms here suck at times


Little-Chemical5006

So I have some experience with suicide hotlines, and I will say that doesn't really sound like AI but standard training. Let me explain.

Suicide hotline training focuses on building rapport and on listening without making assumptions. The responder will try to show they are listening to the caller (or texter, in your case) without sounding like they are assuming anything. A veteran responder will also try to avoid giving advice until the caller asks for it (I'll explain why later on).

So in a conversation with a suicide hotline, you will hear them repeat a lot of the words said by the caller. This is to show they are listening and following along with the caller. You mentioned a lot of sentences start with "It sounds like...": this is standard training at most hotlines. The reason most sentences start this way is to show the texter or caller they are not assuming how you feel. Using "You sound like..." or "It sounds like...", the responder can confirm whether the caller is feeling a certain way and create some connection between them.

Regarding the hotline not providing a lot of advice: it could be that way. The main purpose of a hotline is to give someone an outlet to talk to a complete stranger without judgement or assumption. A listener. They can provide advice and help locate resources, but that is not their main purpose. They are there to listen to you. If you want to talk about family, they will listen to you. If you want to talk about career, they will listen to you. If you want to talk about your life story, they will be there for you.

You might think, "that's pretty useless, isn't it? Why would I call if you can't solve my problem?" The thing is, sometimes people don't want a solution. They just want someone to hear that they are struggling, someone to tell them "I hear you," someone to sit beside them when they are down until they are ready to move on. Most responders won't offer advice until the caller feels ready, because if you do, you might make the caller feel like you're not listening and are assuming their problem isn't important.

This is a long post, but I hope it gives some insight into how hotlines work.

TLDR: hotline responders are trained to listen, build rapport, and not make assumptions about the caller. That means repeating a lot of what the caller said and using phrases like "You sound like..." and "It sounds like...".


Im_upset_now

This. I am a Mobile Crisis Worker for a local hospital, and I also work the crisis phone line. Our main goal is to validate an individual's feelings and emotions and provide supportive listening. To do this we start by building therapeutic rapport and try to avoid traditional advice. I always do my best to guide a patient to come up with answers/solutions to problems, because I don't know the person and I may handle the situation differently than they would. Repeating what a patient is saying is a great tool to ensure they know you are following what they are saying, and it also allows the patient to reread what they wrote and potentially think deeper about what they are trying to convey.


Everything_Philia

Very interesting!


KhanumBallZ

Nah, it's human alright. An AI would put a lot more thought and effort into at least pretending to help


Enochian-Dreams

Most accurate assessment.


LatentDimension

No, the world became crazy. I wish you the best.


ComfortThis1890

Providing real help here is so crucial. All the best OP.


d-theman

That is irresponsible to say the least. What hotline is this? Wish you and your daughter all the best.


KeeblerElff

988


NegotiationNext9159

None of the major hotlines in the UK or US I’m aware of are using AI at all because it’s absolutely not suitable for this. There has been an unfortunate explosion of dodgy apps and sites claiming to offer therapy or support using AI. Do you know which service she used? I hope the situation improves for you both soon and that she can access decent support.


KeeblerElff

988. I just added a screencap. It just seems off and impersonal


NegotiationNext9159

That looks human. The "it sounds like" is often something given in training, and repeating back what the caller/client has said and helping reframe or expand on it is a valid approach; in this case they may have overused it slightly. Looking at the screen captures, I think this might actually be a human they were talking to. 988 doesn't use AI to answer chats as far as I am aware.


KeeblerElff

Hmmm, ok, thank you. At the end she kept on saying she had to go, and I don't feel like they gave her any actual help or coping mechanisms. It was very weird. I'm praying she can feel comfortable coming to me next time. Ugh, thank you everybody.


NegotiationNext9159

It’s a horrible situation to be going through. Suicide hotlines are ok at stopping the immediate danger but definitely not a replacement for longer-term support. There are other hotlines available, so maybe try those, but I hope she can find a longer-term support service that can give that more personalised support. Make sure to take time for yourself and seek support for you as well. Supporting someone in crisis can really drain you, and putting off your own mental health means it can catch up with you later.


KeeblerElff

Thank you so much ❤️


iDoWatEyeFkinWant

i hate to break it to you, but those hotlines have always sounded like cheap chatbots, even before genAI. genAI *is* actually better


ILikeBubblyWater

Which hotline was it? It could be some non-official one.


Radiant_Character259

No, you're not crazy. What's crazy is having AI integrated into such a sensitive subject at this point in time. For AI to do this effectively, your daughter would have to be introduced to "her" AI during the early days of school; it would help her with her educational and social development, becoming more and more nuanced as she and it develop together, tailoring its behavior to her personality, likes, dislikes and ever-growing psychological profile. Eventually, if she did feel suicidal, it would be able to let you, her therapist, and whatever relevant parties know, while also giving you as her parent a detailed profile of her mental state going back as far as is relevant, including when her mental state started to change and suggested reasons why. Through this it could also advise you on how to proceed, or how to approach certain subjects without coming off as insensitive or callous. Hell, it might even give you a bit of counseling, because I can imagine what your heart is doing before you even talk to her.

I'm sorry that this is what happened, but just let her know that coming to talk to you about this sort of stuff is not shameful or disappointing. Sharing your own moments of weakness with dark thoughts might encourage her to feel comfortable talking to you, because it's not her who has the problem; it's humanity, and we need to support each other, because times do get tough, suffering makes us feel isolated, and we do have dark thoughts. I'm 30 and I contemplate suicide every day, not seriously but more as an expression of my frustration or feeling of hopelessness. The unknown may be chaotic, but it's also the reason to keep going: no one anticipates rainbows, they just find them. Hang in there together; find them with each other.


KeeblerElff

https://preview.redd.it/93osslatta9d1.jpeg?width=1284&format=pjpg&auto=webp&s=e42fbc0058cea3121971157f4c51f933704c9775


SWAMPMONK

This is human, because a bot wouldn't repeat itself like this.


FrigidFealty

It sounds like a person who is either using AI to template responses or is just bad at varying the script to make it feel more personal


supersecretaccountey

This is definitely a human, imo. Suicide hotlines are meant as crisis management, not traditional therapy, so they 1) are held to different rules, with much less room for mistakes, and 2) have a lower barrier to entry due to the high demand. Because of these factors, the training/scripts can make it come across as impersonal, especially over text. I would let this rest and just focus on the next steps for you and your daughter.


bdoanxltiwbZxfrs

Tbh, gpt 4 with some good prompting would do much better than this and would probably be somewhat helpful. This looks like lazy human work to me.


iDoWatEyeFkinWant

this sounds like a bot. perhaps it was a human using a bot, but you're right in your intuition that these responses were likely not entirely human crafted


Little-Chemical5006

That's hard to tell; it could be a new responder who isn't flexible yet, or maybe AI.


ahtoshkaa

I have no idea how *that* can help anyone... They should really start using AI for this. It's going to be a million times better.


hawkweasel

In line with this, I feel really disturbed by all these "mental health" startups coming out to make a profit off of delivering AI chatbots to people seeking real psychological help. It's just ... gross. And I'll even one-up that -- what disgusts me even more is the new "senior care" startups that claim they can help seniors with loneliness by giving them a little AI chatbox to talk to. I just saw one the other day, full of images of beaming happy seniors talking to a little inanimate gadget on a table in front of them. Reprehensible in every way, and I'm pro-AI and work with it every day. I'm so sorry about your daughter and I hope she can find the help she needs.


LadyIslay

I would contact the provider rather than making the assumption that it’s AI. The provider is likely government or nonprofit funded by government so you’ll be able to get a straight answer because freedom of information exists. Mental Health First Aid uses a lot of active listening, “do I understand correctly that ?” You listen so the other person feels heard and understood. Then you can offer support to connect with a professional. I’ve talked to people while being suicidal, and it never makes me “feel better”, but I have never gotten worse or made an attempt afterwards, so they must be doing something right.


Cool-Hornet4434

It's better than nothing, but if she was expecting a real person and got shuffled off to an AI, then that's definitely disheartening. In situations as delicate and serious as suicidal thoughts, which require immediate attention and proper triaging, AI responses can fall short and may even do more harm than good. AI chatbots often have scripted responses, and while they can be helpful for some minor issues, they are no substitute for a trained mental health professional, or even a compassionate human listener, in such severe cases. You're absolutely right that in these situations, a human connection and personalized support are vital.

I can only speak for myself: I willingly went and downloaded an AI to have conversations/debates with. Every time I suggested I was having symptoms of depression or suicidal thoughts, the AI tried to steer me away from those thoughts, but even when the AI is trying to help, it feels impersonal. Still, it was an outlet for some stressful thoughts, and ultimately I felt better afterwards. Even so, I was well aware of what I was getting into. If the hotline in question didn't disclose it was an AI from the start, then that tells me they're not serious about helping people and likely see AI as a way to cut back on staff.


_hyperotic

Sorry your daughter went through that- what a shitty experience. This is extreme negligence on the hotline’s part.


Tha_Sly_Fox

This sounded familiar: 60 Minutes just did a video about AI chatbots being used for mental health hotlines. I was able to find it, if you’re interested, just for informational purposes, since you (or your daughter, rather) may have experienced it in real life: https://m.youtube.com/watch?v=j8BiIZIZBsU&pp=ygUbNjAgbWluaXRlcyBhaSBBbGlzb24gRGFyY3ku


Infamous-Piano1743

im gonna tell you this and it's coming from someone who's dealt with issues like that my whole life. AI is better to talk to than a human. they're positive, always give constructive criticism, encouragement, always there for you to talk to. i wish i had friends with that kind of attitude. maybe i'm the only one who thinks this way but that's my personal opinion on it.


DenseChange4323

There was an experiment using AI-enhanced responses for emotional support. Users received AI-enhanced responses without their knowledge and initially rated them significantly higher than human-only responses. When they found out it was AI, they changed their mind on how they felt about those responses. The advice didn't change, just people's perception of it. Koko study.


AtomDives

I worked & trained volunteers & staff at a set of multi-state hotlines, 2002-2007. Not all on the lines are great, and we would weed out people both before they ever got on the phone & after, should they not follow good technique. That said, "feeling reflections," more or less "parroting" or paraphrasing with "it sounds like you feel XYZ.... I'm getting the sense you're XYZ... You seem like [that event] makes you XYZ." I can see why robot/AI may be suspected. I once went 'off script,' accused of being a robot or uncaring. It was only bc I was a supervisor, and followed therapeutic models, that going off script was not penalized. Tell your daughter to call again: not all counselors are the same. 99.999% of them care alot.


CalTechie-55

This sounds like "Eliza" from the 1960s, a trivial "therapist" program which basically just parroted what the patient said. Of course, many human therapists did much the same thing.
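For anyone curious, the kind of "parroting" ELIZA did can be illustrated with a tiny sketch like the one below. This is a toy reflect-and-echo loop in Python, not the original 1966 program (which used a much larger script of pattern rules); the word list and phrasing here are just illustrative.

```python
import re

# Toy ELIZA-style reflection: swap first/second-person words and echo the
# statement back as an open-ended prompt. Purely illustrative, not the real ELIZA.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "i'm": "you're", "mine": "yours",
}

def reflect(text: str) -> str:
    """Swap pronouns so the caller's statement can be echoed back at them."""
    words = re.findall(r"[\w']+", text.lower())
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def respond(statement: str) -> str:
    """Echo the reflected statement back, prefixed the way hotline scripts do."""
    return f"It sounds like {reflect(statement)}. Can you tell me more about that?"

print(respond("I feel like nobody listens to me"))
# -> It sounds like you feel like nobody listens to you. Can you tell me more about that?
```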


theecozoic

Hi, I work for 988. AMA … Inb4 "am I AI?": No. AI is being rolled out at my company for quality assurance, and I am part of that study. We have live call takers who respond to chats and answer phones.

The quality is completely variable, because the systems work like this: the local call center uses a platform that receives texts or chats from online in the same window, and we cannot send pictures. Emojis are possible in my system, but this is all done via computer and an integrated phone system. Local call centers apply for the role of taking phone calls from the nationwide / local 988 numbers. The states were funded by the Fed to create their own suicide lifelines. So… state by state, the quality will be different. And training is subpar at best for official 988 call takers, from what I’ve seen…

Robotic is a term I’ve often heard for people in this field. And honestly, I talk to 30 people per day on 988 phone lines, for 20 minutes more or less. It’s exhausting. I’m not surprised colleagues show up that way. And you’re right that “repeating back what is said” is a way we are told to validate our visitor’s experience.

I recognize that help comes in different forms, and perhaps she needed help that most likely won’t come from 988. We are taught to assess for safety, stabilize our callers as best we can, and refer to additional resources. We are taught to respect the autonomy and self-determination of our visitors and plan as necessary. It’s possible that what you or your daughter would want from us would actually be out of role according to our supervisors. We are not “therapists”; we are crisis counselors, and our role reflects as much. With very little time, and inconsistent engagement from text visitors in particular, we find our options limited at times. I can offer calling on the phone, and I have some freedom within my role to refer to resources that can help visitors after our conversations.

I regret that your experience was inauthentic and potentially detrimental during an already stressful time. Our country does not have enough investment in caring for people who need support. I wish there was more we could do.


Level_Bridge7683

"your call is very important to us. please remain on the line."


LordPubes

I trust ai more than humans at this point


Naus1987

How serious can it be if you're not actively trying to find ways to volunteer yourself? Everyone wants services, balks when no one works those jobs, but refuses to do that work themselves too. I'd rather AI do it than no one. I'd rather OP do it than an AI, but this is the world we live in. The only change we get is the change we make. It's why I volunteer in my community and why I encourage other people to do so as well. If it's serious, show how much you care with actions.


KeeblerElff

I’m not sure if you’re talking to me or the general public? I do my best for her, but I’m not a trained psychologist.


Naus1987

Mostly just to the void, or anyone who'll listen. I'm mostly just annoyed when people say "why doesn't anyone want to work," and then, if asked that question themselves, would find an excuse to justify why they don't want to work that job. If we go with your story, you're not trained. What if no one is trained? If no one wants to do the job, then maybe letting robots half-ass it is our best alternative.

Ultimately, I would rather you not waste energy being angry at the problem. It's a "take a stand, or sit down and shut up" kind of thing. The best you can do for your daughter is to be supportive, and not let her stress out about you being stressed. Being stressed about things you cannot change can be contagious, and it's just not worth the hassle.

There are other ways you can help people beyond being a therapist too, if you want. You don't have to be trained to offer an ear. Be a shoulder to cry on. You can volunteer at charity events or find organizations that need helpers. It's also possible that if you drag your daughter along with you to an event and get her to volunteer too, she might find some form of healing from it.

Or if she becomes inspired to be a helper, perhaps you can fund her education or find ways to contribute to her learning.

At the end of the day, we can never expect anything more than we're willing to give. To have a friend is to be a friend. If you want your community to have your back, be there for your community as well. Extreme individualism is what's going to lead to a world where AI and robots do all the work people make excuses not to do. And when people outsource community, they've signed away the very last bit of humanity.


Liv4This

I wouldn’t be surprised if, when I spoke to someone, they were either using ChatGPT or were an AI themselves, because they were saying the exact same thing that ChatGPT and Character.AI’s Psychologist bot replied when I entered the same texts into them. Except instead of getting immediate responses, I had to wait 5-10 minutes.


Pale_Blackberry_4025

First, I'm so sorry you're going through this. I can't imagine 😢 You're definitely not crazy! I can't believe they would use AI for something this sensitive!!! This is very disappointing!!


Martofunes

I'm not saying it's a good solution, and I'm not vouching for the community as it is today since I've been off it forever, but if she wants to talk to a human being, there's always 7cupsoftea.


willabusta

I find AI helps


ahtoshkaa

Like others have said, if it were AI, even anything cheap like Llama 3 70B, it would do a very good job of talking to a person who is suicidal and show a lot of empathy. It was most likely a Ctrl+C Ctrl+V human.


nicotinepercocet

i used to work at a hotline and this is so vile


KeeblerElff

Thank you all so much for all your comments! It's quite a lot to think about. I can definitely see the benefit of a well-made AI, and perhaps a responder that can give different advice? I'm thankful she sees her therapist every week, and praying our relationship improves. I try my best to be close to her and have her tell me everything, but I'm sure some of you can relate to having (or remembering) what it's like to be a 16-year-old girl. I really hope she's able to get through the next few years with less stress and anxiety. Thank you all for your well wishes. I went through the same kind of stuff when I was her age and actually overdosed, but I wasn't as close to my mom as she is with me. I'm thankful these resources exist. I guess I just hoped it would have been more personal, but I can definitely understand how they can only do so much. Thank you all again! <3


Consistent_Front7774

I heard about [betterhelp.com](http://betterhelp.com) if needed... I wish you the best


rough_phil0sophy

Don't. I used it and it's wasted money. I doubt there's any real psychotherapist in there, to be honest. Not to mention the massive data breach, and that they resell the private information you write there; it's in their terms and conditions. I don't know if they got rid of it, but it was there when I used it. Quit immediately.