1191 sats \ 1 reply \ @anon 29 Aug
Sam talked about this in a tweet about GPT-5
My girlfriend is bipolar type II with complex PTSD. She can either have a month full of delusions or a month with just 1 day of delusions but a lot of anxiety. Everything is linked to her past or traumatic events.
Now, why am I saying this? 3-4 months ago I gave her a GPT subscription as a lifeline for crises I cannot manage, as sometimes the toll is too big to bear and I really wasn't able to handle it alongside my work; I even ended up writing to my boss about it.
She was happy with it, successfully managed to have a source of reassurance about reality while also learning tricks to self-manage.
But a month ago she told me she had stopped using it; she felt that GPT-4o was just running in circles, and when I read her prompts I got the same overwhelming feeling I get when I try to manage her crises. I think she gave GPT anxiety.
GPT-5 on the other hand, in full thinking mode, was very helpful, albeit slow.
I actually don't care what people think about Sam Altman; the guy is trying, considering he has a sister with the same problems as my gf.
Progress is being made, and I'm just happy that it's making our lives a little bit easier.
0 sats \ 0 replies \ @nitter 29 Aug bot
https://xcancel.com/sama/status/1954703747495649670
371 sats \ 2 replies \ @k00b 29 Aug
I read this a few days ago. I enjoyed the attempt to collapse the milieu of bipolar disorders into a tendency for one's disturbed sleep to cascade.
I also agree with much of the essay's point: AI isn't making sane people insane; it's making some people, on some border of sanity, cross the border.
33 sats \ 0 replies \ @kepford 29 Aug
Yeah... there you go again being rational, @k00b. It's more fun to blame your problems on one externality.
0 sats \ 0 replies \ @carter OP 29 Aug
Also seems similar to drugs... Some people with a predisposition to psychosis may be triggered, but the psychosis isn't CAUSED by the drug.
193 sats \ 1 reply \ @freetx 29 Aug
This is a great piece; I enjoyed it.
Although I find his premise entirely reasonable, I'm continually struck at how little introspection goes on within the HN / Silicon Valley crowd.
He does a fine job in his piece of coming up with a plausible theory that LLMs can exacerbate existing mental illness. However, he never asks: are we guilty of causing this by how we framed the tech? That is, was it a good idea to spend 5 years exhaustively misrepresenting this tech to the general public as conscious?
Everyone in the AI space loves to get on stage and pontificate about how much of a disaster our new invention will cause. The examples given all strongly imply that AI = human-level intelligence.
Elon is now joining with his arch-rival Sam Altman and backing UBI calls, since their pattern matcher is going to cause mass unemployment (once it can operate a McDonalds drive-thru apparently). Quite interesting that they found something they agree on: Namely, their invention is going to change every aspect of the world.
These types of strongly suggestive displays from techbros have more than a little to do with why those with latent mental health problems are being negatively impacted by LLMs.
18 sats \ 0 replies \ @kepford 29 Aug
I can't take Altman or Musk at face value. I don't think they are genuine people. Musk showed this in his bitcoin journey, and Altman with his claims that AI needs to be regulated because of the dangers (as if technology like this can be controlled). To me, they are both skilled scammers. Smart scammers, but scammers.
50 sats \ 0 replies \ @chaoticalHeavy 29 Aug
This seems like a description of us here at Stacker News.
0 sats \ 0 replies \ @OT 29 Aug
A licence required to use an LLM. For your own safety, of course.