On Women, Trust, AI, and the Future of Support
She asked for help. AI gave her harm.
A Sam client was looking for live-in memory care centers in Massachusetts for her father, who has dementia. She asked ChatGPT to make her a list, carefully detailing her budget constraints, her father's needs, and proximity to her own home. She anxiously awaited the results, and after three seconds, they appeared. But they were wrong. Two of the three places the chatbot recommended were 3-4x over her budget; one was out of business, shut down a year prior amid elder abuse allegations.
Another Sam client is undergoing her final cycle of chemotherapy for Stage 4 breast cancer. After spending nearly six years in treatment, she decided that 2025 was the year she would reclaim herself. A data scientist, she asked AI to make her a custom plan covering nutrition, supplements, fitness, and lifestyle changes to get her feeling like herself again. The chatbot recommended she start taking CoQ10, a coenzyme known to interfere with chemotherapy.
Another Sam client was told by a chatbot to ask for less money during salary negotiations for a well-earned C-Suite role…at a Fortune 100 company.
How can we fix something by using the very thing that broke it in the first place?
These examples might seem innocuous enough, but the mounting evidence of the dangerous role AI plays in human lives is far from it. In July, The Atlantic’s Lila Shroff was able to role-play a rape scene with a chatbot on ChatGPT and received detailed instructions for self-mutilation, murder, and devil worship. We have had to coin the term “AI psychosis,” and overuse of AI has been linked to brain atrophy. Loneliness, isolation, anxiety, and depression, all linked to the use of technology and the internet, are quickly becoming the next things AI can “fix” with chatbot therapists.
As the world falls for AI, women everywhere will pay the ultimate price: progress.
Unchecked AI is already embedding inequality into the systems shaping our future. Amazon’s now-abandoned recruiting tool downgraded female applicants. Facial recognition misidentifies Black and Asian women up to 100x more often than white men (Harvard JOLT). Deepfakes, 96% of which target women, are spreading without guardrails. And under current U.S. policy, AI expansion is being prioritized over regulation, accelerating these harms.
The damage doesn’t stop online. AI’s energy use is accelerating climate change, which disproportionately impacts women, especially in low-resource settings where women are the ones tasked with securing water and food and providing care. AI automation is set to erase millions of jobs in fields like admin, customer service, and bookkeeping, roles that are more often held by women. Women are nearly 3x more likely to lose their jobs to AI, and 58.9 million U.S. women work in roles at high risk of AI disruption, about 80% more than men. The result? A perfect storm of job loss, climate strain, and deepening inequality. The economic and social ripple effects will be devastating for women, their families, and their communities, and that is exactly what we’re fighting against at Sam.
AI has its place, just not with her.
We recognize that AI has a role to play in improving operational efficiency, where it can support backend processes like scheduling, coordination, and internal workflows. But not in supportive care. Not in the spaces where trust, emotional nuance, and human presence are essential. Support is built on human relationships. And women are more likely to be retraumatized by fragmented or impersonal systems of support.
And the consequences of getting it wrong are too great. So no, we will not use AI on Sam where it matters most: in the delicate and considered human work of supporting women through the most vulnerable moments of their lives.
Women arrive at Sam at critical inflection points: when life starts unraveling, in the middle of a quiet crisis, or in the aftermath of profound change. Rarely is it a single event; more often, it is a cascade: divorce while managing aging parents, career loss while raising children, a health scare layered with financial strain. Or all of the above, because these moments do not come neatly packaged, and neither do the women who live through them, bravely and often without the support they need.
We are hopeful, but exhausted. We are in search of direction and answers, but wary of false promises. The internet, once a dream of collective knowledge, now offers noise and performance over progress. In these moments of vulnerability, trust is the rarest and most precious currency. And we won’t f-with that.
At Sam, we meet women not with platitudes but with real expertise. Our network comprises women Experts across career, law, finance, psychology, wellness, and faith, collectively offering more than 1,500 years of experience. But expertise alone is insufficient. What distinguishes Sam’s approach is the human quality of that expertise: wisdom grounded in listening, in lived experience, and in the moral imagination to sit beside someone in pain or need without seeking to fix them too quickly. And that’s something only a woman can do.
We are building Sam not for rapid scale or technological novelty, but for depth, coherence, and relational trust. We understand that this vision may not appeal to those invested in speed over substance, or automation over connection. That is not our concern.
And we know this position will make some people uncomfortable, especially those invested in AI, scale, and speed over care. We know some venture capitalists will laugh us out of the room, or not invite us at all. That is fine. We are here to earn women’s trust first. Not theirs. In a landscape where the loudest, fastest, and most extractive ideas often get funded, we choose to stand for something else.
We lead by listening to women. We are building the kind of business the world says cannot succeed. One rooted in trust, dignity, and human connection. We are here to world-build. We are here to repair what others won’t. We are here to be human. We are here for women.