Welcome to WeLivedIt.AI
Innovating how we make inclusive online spaces
The Current State of Online Spaces
73%
of women journalists exposed to online hate speech reported experiencing anxiety, stress, and fear.
UNESCO study, 2021
84%
of LGBTQ+ adults experience significant online harassment, with 52% experiencing severe harassment.
GLAAD's Social Media Safety Index, 2023
78%
of employees say reading toxic messages in work communication impacts their mental well-being.
Microsoft's Work Trend Index, 2023
How it works
Community Values First
Your community defines what matters. Create your organization and set values based on your unique needs; the LLM tailors how it moderates according to those values.
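As a rough illustration of how community-defined values could steer an LLM, the sketch below composes a moderation system prompt from a list of values. The function name, prompt wording, and JSON response format are all assumptions for illustration, not the actual WeLivedIt.AI implementation.

```python
# Hypothetical sketch: turning community-defined values into an LLM
# system prompt. Names and wording are illustrative only.

def build_moderation_prompt(community_values: list[str]) -> str:
    """Compose a system prompt that tailors moderation to a community's values."""
    value_lines = "\n".join(f"- {v}" for v in community_values)
    return (
        "You are a content moderator for an online community.\n"
        "Flag content that conflicts with these community values:\n"
        f"{value_lines}\n"
        'Respond with a JSON object: {"flagged": bool, "reason": str}.'
    )

prompt = build_moderation_prompt(
    ["Respect for all genders", "No slurs or hate speech"]
)
```

The prompt would then be sent alongside each piece of content, so the same model behaves differently for communities with different values.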
Smart Detection
Our text scanner checks documents for hate speech before you read them, giving you the option not to engage with content that could seriously impact your mental health.
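The "scan first, then choose whether to read" flow might look like the sketch below. The classifier here is a stand-in keyword check purely for illustration; in practice a trained model or LLM would score each passage.

```python
# Illustrative sketch of scanning a document and returning a summary
# instead of the raw text, so the reader can opt in to the full content.
# The keyword match is a placeholder for a real classifier.

def scan_document(text: str, flagged_terms: set[str]) -> dict:
    """Return a scan summary; the raw text is withheld when content is flagged."""
    hits = [term for term in flagged_terms if term in text.lower()]
    return {
        "contains_hate_speech": bool(hits),
        "matched_terms": hits,
        # The reader sees only this preview and can choose to reveal the rest.
        "preview": text[:40] + "..." if hits else text,
    }

result = scan_document(
    "An example message with a slur-term inside.", {"slur-term"}
)
```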
Learn & Improve Together
- Community members can submit content that impacted them negatively, with additional context.
- The community can discuss these experiences and vote on what should be added to the model as training data.
- Models improve based on real community feedback, not just algorithms.
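The submit-discuss-vote loop above can be sketched as a small data pipeline. Field names and the vote threshold are assumptions made for this example.

```python
# Minimal sketch of the community feedback loop: members submit content
# with context, the community votes, and items that pass a vote threshold
# become labelled training examples. All names here are illustrative.

from dataclasses import dataclass

@dataclass
class Submission:
    content: str
    context: str
    votes_for: int = 0
    votes_against: int = 0

def approved_training_data(
    submissions: list[Submission], min_votes: int = 3
) -> list[dict]:
    """Collect submissions the community has approved as training examples."""
    return [
        {"text": s.content, "context": s.context, "label": "harmful"}
        for s in submissions
        if s.votes_for >= min_votes and s.votes_for > s.votes_against
    ]
```

Because every example carries the submitter's context and a community vote, the resulting training set reflects lived experience rather than an opaque algorithmic decision.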
Share What Works
When your community invests time in developing an effective model that moderates according to its values, the trained model is recorded on the blockchain, making it available to other communities with similar values. For example, if a women's tech community creates an effective model for inclusive language, another organization working with marginalized genders in tech can immediately adopt and build upon it. Each community that uses a shared model helps improve it, creating a growing network of trusted, community-tested solutions.
Ready to Create a More Inclusive Online Space?
Join us in building a better online community experience for everyone.