Support The Ethics Engine
Teaching systems to stay accountable and people to stay informed.
Public intelligence for the age of AI.
Your support keeps public-interest reporting, analysis, and education open to everyone.
Our Why
We started FairByDesign at a time when automated systems were already shaping credit, work, health, and speech, long before most people understood how they worked or who they served. Behind every automated decision lies a network of incentives, interests, and algorithms that steer outcomes without public consent.
Our work proceeds from a simple premise: ethical AI is not a future debate; it is a present responsibility. Governance cannot live only in policy papers or technical standards; it must become public knowledge anyone can use.
This page continues that mission: to document, explain, and teach the structures of accountability behind AI and automation.
What We Do
- Publish open-access explainers and briefs on AI governance and accountability
- Host conversations that connect policy, risk, and engineering to everyday decisions
- Run workshops that turn frameworks into practice for teams and communities
- Develop visual field guides and practical checklists for secure, privacy-by-design systems
Cofounders
Richard Ralston, Cofounder
Technical Governance & Implementation
I came to this work from a simple frustration: accountability is talked about everywhere, but almost nowhere is it taught in a way that people can actually use. I’ve spent years inside the technical and operational layers of systems, watching good intentions fall apart because no one translated the “why” into a working “how.” That’s why I’m here. I believe that if people understand the mechanics of a system — what triggers it, what constrains it, what blinds it — they can make better decisions, build better technology, and shape better outcomes. Through FairByDesign and The Ethics Engine, I focus on helping teams move from theory to practice, one clear step at a time.
Viktoria Bakos, Cofounder
Governance Strategy, Societal Impact & Public Understanding
My background sits at the intersection of governance, cybersecurity, and lived experience. FairByDesign emerged from a simple but urgent need: people deserve to understand the systems deciding parts of their lives. I focus on translating AI governance into practical, accessible knowledge — aligning policy, risk, and engineering with the realities people face every day. I work on the structural side: how incentives, regulations, data practices, and design choices create real-world impacts. My goal is to make accountability operational and understandable. Through The Ethics Engine, I create frameworks and explanations that help the public see how AI systems work, where they fail, and how we can build better ones.
Our Team
A distributed group of researchers, editors, technologists, and educators with a shared commitment: make responsible AI a public standard, not a niche specialty.
- Research & analysis across policy, risk, and engineering
- Editorial quality: accuracy, accessibility, neutrality
- Education: workshops, visuals, practical guides
Why Your Support Matters
Independent work requires independence of funding. Support sustains open content, improves accessibility (captions, translations), and expands community workshops.
Every contribution keeps the doors open for people who need trustworthy information now.
Why Now
AI is already embedded in decisions that affect rights, safety, and opportunity. Standards and rules are evolving quickly while public understanding lags. People deserve clear language, practical steps, and a place to ask hard questions.
That is the role of this project: public-interest reporting and education that keep systems accountable and people informed.
Open-Access Briefs
Weekly explainers that connect policy, risk, and engineering with everyday language and real cases.
Workshops & Tools
Live Q&A workshops and practical templates that turn frameworks into decisions and actions.
Media & Dialogue
Podcast and video discussions that bridge expert knowledge with public concerns and accountability.
Patron Tiers
Ally
- Help keep content open and independent
- Early access to selected essays and episodes
- Monthly mission notes from the team
- Optional name listing as Ally
Advocate
- Everything in Ally
- Behind-the-scenes project updates
- Invites to occasional feedback sessions
- Name included in the FairByDesign Circle
Best fit for people who want to engage, not just observe.
Co-Creator
- Everything in Advocate
- Priority for roundtable and co-content invitations
- Recognition (with your approval) as a key supporter
- Opportunities to help shape future projects
Most patrons choose Co-Creator when they want to actively support the work and be part of its direction.
Join Us — Keep Accountability Human
Your support powers independent reporting, education, and community learning on AI governance, privacy, and security.