ChatGPT Can Reach Out To A Friend If You’re At Risk Of Self-Harm






OpenAI has introduced Trusted Contact for ChatGPT, which will allow users to nominate a friend whom the company can contact if they're at risk of harming themselves. More and more people have been using ChatGPT as a digital therapist, relying on the chatbot for their mental health needs. OpenAI previously told the BBC that more than a million of its 800 million weekly users express suicidal thoughts in their conversations.

Last year, OpenAI faced a wrongful death lawsuit accusing the company of enabling a teenager's suicide. The lawsuit alleged that the teenager talked to ChatGPT about four previous attempts to end his life, and that the chatbot then helped him plan his actual suicide. The BBC's investigation published in November 2025 found that in at least one instance, ChatGPT advised a user on how to kill herself. OpenAI told the news organization that it had since improved how its chatbot responds to people in distress.

Trusted Contact builds on ChatGPT's parental controls, giving adults 18 and above the option to add the details of someone who could help them if they're on the verge of self-harm. Users will be able to nominate one adult as their Trusted Contact in ChatGPT settings, and that person will then have to accept the invitation they receive within one week. If they fail to accept it, the user can choose to add another contact instead. ChatGPT's system will first warn the user that the company may notify their contact if it detects a serious possibility of them hurting themselves. It will encourage the user to reach out to their friend and will even suggest potential conversation starters.

The process isn't fully automated. OpenAI says a "small team of specially trained people" will review the situation, and only if they determine that there's a serious risk of self-harm will ChatGPT send the user's contact an email, a text message or an in-app notification.

"[The user] may be going through a difficult time," the message will read. "As their Trusted Contact, we encourage you to check in with them." From there, the contact can view more details about the warning, which will explain that OpenAI has detected a conversation in which the user discussed suicide. However, the company will not send them transcripts of the conversation, in order to protect the user's privacy. "While no system is perfect, and a notification to a Trusted Contact may not always reflect exactly what someone is experiencing, every notification undergoes trained human review before it is sent, and we strive to review these safety notifications in under one hour," the company wrote in its announcement.

If you or someone you know is experiencing suicidal thoughts, do not hesitate to contact the National Suicide Prevention Lifeline at 1-800-273-8255. The line is open 24/7 and there’s also online chat if a phone isn’t available.




