Val Odiembo volunteers at her former high school a few times a month, teaching teens about consent and healthy relationships. Now a sophomore at Rhode Island College, 19-year-old Odiembo isn’t much older than the students she’s teaching – which she thinks makes it easier for the high schoolers to come to her with their questions. But she knows she isn’t the only source they’re consulting.
“A lot of them confide in AI,” she said. A recent UK study found that one in 10 young adults has consulted AI for sexual health information, and a 2025 Pew Research Center report showed that one in five teens has had a romantic relationship with a chatbot.
Odiembo says sometimes students turn to AI to ask for advice about talking to a crush; other times, it’s for reassurance that they didn’t cross a line in a relationship. As a peer educator trained by a youth-led non-profit focused on ending sexual violence, Odiembo says that worries her.
Artificial intelligence will “just tell you what you want to hear”, she said, citing research that AI chatbots tend to affirm users’ actions even when they are harmful. “I think we’re also losing human connection when we confide in AI instead of the people closest to us.”
So when Safe Before Anyone Else (SafeBae) – a survivor-founded, youth-led non-profit working to end sexual violence among teens – began developing an online tool for young people questioning whether they may have caused harm in a relationship, Odiembo was thrilled. “I think it is the best thing SafeBae has done since sliced bread,” said Odiembo, who also works as the organization’s youth programs manager.
In mid-March, SafeBae launched Vibe Check, a free and anonymous automated alternative to AI and online forums, designed to talk young people through understanding whether they violated a partner’s consent, apologizing (when appropriate) and making sure they don’t repeat such behaviors. As of late April, the site had drawn more than 3,500 unique visitors.
Vibe Check “is intentionally not AI”, said Shael Norris, co-founder and executive director of SafeBae. “It was built by our team based on over a decade of direct work with young people.”
Vibe Check users can click through a range of questions to get support working through their feelings (“The person I was with is mad and I’m worried I did something wrong”) and reflecting on what happened in the moment (“They seemed upset or distant after”). Depending on the scenario, Vibe Check offers mini-lessons on the nervous system’s freeze response, consent laws around alcohol and grounding exercises (“If someone became distant, upset, or quiet afterward, that can be a sign they didn’t feel okay about what happened. You don’t need them to ‘prove’ anything for their feelings to matter. Accountability here can look like listening, apologizing without pressure and respecting whatever boundaries they set.”).
SafeBae’s director of strategic initiatives, Drew Davis, was starting college six years ago when he noticed the growing phenomenon of people turning to Reddit forums with “really difficult and vulnerable questions”. At first, he studied forums for people recovering from eating disorders, but as he dug deeper, he found a well of forums for people who were questioning whether they had perpetrated a sexual assault.
“It keeps me up at night how bad these responses” in the forums were, said Davis: “Either you’re an awful person, there is no such thing as accidentally causing harm, you definitely did, to the extent that you should kill yourself – or you did absolutely nothing wrong, you’re perfect, women suck.”
In recent years, Davis said, he had only seen those two approaches grow more entrenched. In the wake of the Epstein files “and our frustration with how really powerful men haven’t been held to account”, he said, we’ve seen communities wanting to hold younger and less powerful men absolutely accountable for their actions. At the same time, “we have the manosphere, where people on these Reddit threads are literally connecting and creating communities on Telegram or on Discord, where it is about hating women and joining together”.
As adolescents report record levels of mental health distress, Davis found himself concerned by growing levels of suicidal ideation among girls who had survived sexual assaults and boys who were increasingly isolated from their peers. He wanted to create a tool that could give people who may have caused harm in their relationships “an off-ramp to rejoin society, be in relation with other people and move towards accountability and repair for genuine mistakes” and “take the onus off of survivors for having to lead these efforts for repair and apology”.
At first, he imagined developing a tool as an alternative to Reddit or online quizzes, but as AI technology continued to develop, he wondered whether his tool still needed to fill that space. “AI chatbots can be sycophantic,” telling users what they want to hear, he said. As he began seeking feedback on Vibe Check, he said SafeBae’s youth board of directors appreciated that it offered “gentle, caring, compassionate pushback and redirection”.
Researchers are still working to understand the effectiveness of AI for teaching sex-ed topics like consent. While AI tools show “real promise in making sexual and reproductive health education more accessible, private and user-friendly”, they perform better “when communicating about more straightforward topics, such as contraception” than “when discussing more complex or sensitive topics, such as abortion or sexual pleasure”, said Scarlett Bergam, a graduating student at the George Washington University school of medicine and health sciences and lead author of a recent review of AI sex-ed tools.
When Vibe Check launched, 17-year-old Apollo Knapp clicked through the tool with his high school classmates, trying out made-up scenarios. As a member of SafeBae’s youth board of directors and a peer educator working with middle school students, he was impressed that it was “so comprehensive”. He hopes he can point preteens to it before they come across any AI chatbots.
“If humans are messing up consent this much, I don’t even want to see what a robot’s going to do with it,” he said.
