“men choose to freely train ai with their life stories to secure technofascist state” might be a better headline
Want to pay my therapy bill? And pay me for the hours of work I missed?
I don’t use it to ask for mental health advice but it’s nice to have “someone” to talk to that at least pretends to be interested in what I have to say. I used to have these conversations with myself inside my head. AI at least sometimes brings up a new perspective or says something novel.
Inb4 “just get friends dude”
Can confirm. My dad’s getting a little too into his AI on his phone. He’s got deep emotional problems and is an alcoholic, but I don’t think his bot is going to do him much good. That said, men’s egos make it hard to open up.
To be fair, most humans either don’t wanna hear it or want to be paid a fuckton of money to hear it. It may not be the best option, and this is by no means a defense of it, but it is an option that is widely available so I understand it.
Professionals want to pay off their student debts and live a comfortable middle class life.
Support groups need public spaces to meet and public communications to organize, preferably without being drowned out by hecklers or swamped with ads for BetterHelp or religious recruiters or scammers.
Public funding for all these things has been clawed back. Low-budget substitutes have been rolled out in place of more professional services and spaces. And social predators - from charismatic demagogues to military recruiters - abound, seeing real economic advantage in the absence of a functional mental health system.
On top of it all, we’ve got a severe social stigma against men showing any kind of physical or emotional weakness.
💯 It’s my experience that humans just don’t want to hear it or deal with it. And for people who don’t trust the mental health industry (many for good reason incl. prior abuse), it’s the only option left other than reading self-help books, websites, etc.
AI and robots will have to take care of a lot of lonely or abandoned individuals for sure, since nobody is really interested in what others are doing or going through.
That is why there’s a job for that. But I get you: talking to AI is free and very accessible compared to seeing your local therapist, where the act of booking is itself a huge barrier to step past, and then lastly there’s the money.
No shit.
Other humans don’t want to hear about men’s mental health issues, because men are supposed to be stoic and strong and infallible, and if we aren’t achieving that, we’ve failed at being men.
But AIs don’t judge, and they don’t cost anything either. I’m hardly surprised.
You’re missing the point.
Something or someone that agrees with you and rarely challenges or disagrees with you is not something or someone that can help improve the situation and minimize recurrence.
It only feels better momentarily.
Like a drug. That costs money. See where this is going?
I don’t personally speak with AI for reassurance, and I don’t think it’s a good idea to do so.
In fact, I recently commented here on a post about a teen who committed suicide at least partly due to ChatGPT - specifically pointing out the danger of depending on a machine for fake empathy when you need to be talking to a real person.
I appreciate I didn’t make that side of my position clear in the comment here in this thread, and that’s because it wasn’t the aspect I really wanted to highlight.
My point isn’t that speaking to an AI is a good idea - it isn’t - it’s that this is something a lot of people will obviously end up doing, and that it is men especially who are liable to succumb to this the worst because of the way society expects men to behave.
Men and teen boys especially struggle voicing their mental problems to others, either professionally or in their personal life. So it’s no surprise they will leap at a “solution” that is free and keeps what they say private from anyone they know. But it’s not a solution, it’s a disaster in disguise.
The thing that needs fixing here is the way mental health is stigmatised, which prevents people from speaking freely and getting the support they need. That’s not a new problem, it’s the same problem as ever, and what the AI epidemic is doing is simply shining a new spotlight on it.
you’re both right. these are the prongs on the spear that’s about to mentally eviscerate a lot of people. the other one being the lack of available healthcare but everyone already knew that.
I’m fine.
What could possibly go wrong?
As a man in my 40’s who sought mental help, it’s actually pretty important. But no one should trust AI to fill in for a psychiatrist.
I run my own LLM “AI” server at home because I want to try out various aspects and scenarios without having big tech snoop over my shoulder and be able to use any model I want.
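For anyone curious what “running your own LLM server” looks like in practice, here’s a minimal sketch of talking to a self-hosted model over the OpenAI-compatible chat API that local servers such as llama.cpp’s `llama-server`, Ollama, and vLLM all expose. The URL, port, model name, and system prompt below are placeholders for your own setup, not anything specific:

```python
# Hedged sketch: query a locally hosted LLM via an OpenAI-compatible
# /v1/chat/completions endpoint. Nothing here leaves your machine.
import json
import urllib.request

# Hypothetical local endpoint - adjust host/port to your own server.
LOCAL_URL = "http://localhost:8080/v1/chat/completions"


def build_payload(model: str, user_message: str, temperature: float = 0.7) -> dict:
    """Assemble a chat-completions request body for a local server."""
    return {
        "model": model,
        "temperature": temperature,
        "messages": [
            # Placeholder system prompt - tune this to the scenario you want to try.
            {"role": "system", "content": "You are a careful, non-judgmental listener."},
            {"role": "user", "content": user_message},
        ],
    }


def ask_local_llm(payload: dict, url: str = LOCAL_URL) -> str:
    """POST the payload to the local server and return the reply text."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible servers put the reply here.
    return body["choices"][0]["message"]["content"]
```

Point it at whichever backend you have running and swap models freely - that’s the whole appeal of keeping it local.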
I can perfectly well see people getting good, working therapy from an LLM. But I think it would depend on the user taking the LLM seriously, and anybody with sufficient experience with LLMs simply doesn’t.
So the people this could help are the people who shouldn’t be allowed near an “AI” interface…
Let’s see what this LLM says when I run this question 20,000 times from a clean prompt, then compare it against the same question posed more directly run another 20,000 times. Then I can pick the answer I like better and run that against a different LLM and…
So what you’re saying is that this is NOT what I am supposed to do?
I can understand it. A local LLM is not only going to be more private than anything ever spoken aloud to another person, but there’s also the giant benefit that you don’t have to worry about the effect it will have on the other person. I know my past trauma would be painful to even listen to; I can’t imagine what some folks carry around with them.
Part and parcel of the privacy is that you don’t have to deal with the judgement or shaming from others. It would be easy to get drawn into the LLM’s constant affirmation as well.
It would be nice to have my own privately hosted therapist trained on all the mental health knowledge known to mankind.
I’m sure someone has trained a model for that.
Here’s a list on Hugging Face; not sure how good any of these are, though. huggingface.co
Having something to talk to is a massive improvement over bottling it all up.
AI is very beneficial to people who can’t afford the cost or are otherwise unable/unwilling to speak with a professional.
Unsurprising; I imagine they’re still holding back somewhat.