F’xa – Your AI-powered feminist guide to AI Bias


EY & Feminist Internet.

The Brief

Create a feminist, playful and engaging user experience that will teach attendees at EY’s Innovation Realized conference about bias in AI.

What we did

With ten days to bring something to life, we took a rapid-prototyping approach with the Feminist Internet team to meet the aims of the brief.

We explored conversational interfaces, with a particular focus on voice; however, given the short timeframe and technical limitations, we opted for an interactive web-based conversational experience.

We decided to build F’xa, an AI-powered chatbot that teaches people about AI bias and suggests ways to limit the impact of bias in AI systems.

AI bias is gaining increasing attention in mainstream media, and the AI Now Institute’s recent report highlights the urgency of tackling the problem from an interdisciplinary perspective. The Feminist Internet team felt that now is a great time to introduce F’xa to the market and to the public.

Our work was heavily informed by the Feminist Internet’s Personal Intelligent Assistant standards and Josie Young’s Feminist Chatbot Design Process. These guidelines help designers ensure they don’t knowingly or unknowingly perpetuate gender inequality when building chatbots.

Here are some examples of design decisions that were informed by these guidelines:

    • F’xa never says ‘I’. It is challenging to avoid this little word when designing conversations, but F’xa’s designers did so in recognition of the complex emotional attachments people can form to bots that are designed to feel very human. Not using ‘I’ keeps people aware that they are not talking to someone or something with a consciousness of its own.
    • F’xa gives definitions of AI and feminism from people of different races, genders, gender identities and ways of thinking, recognising that such definitions are culturally situated. As Feminist Internet’s motto goes – ‘there is no feminism, only possible feminisms’.
    • F’xa uses a range of skin tones in its emojis, to acknowledge its voice as plural rather than singular.
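To make the guidelines above concrete, here is a minimal sketch of how such design rules might be enforced in code. This is purely illustrative: the function and variable names are our own assumptions, not part of the actual F’xa codebase.

```python
import random
import re

# Hypothetical checks inspired by the design decisions described above;
# nothing here comes from the real F'xa implementation.

# Matches first-person pronouns, which F'xa's designers chose to avoid.
FIRST_PERSON = re.compile(r"\b(I|I'm|I've|I'd|I'll|me|my|mine)\b")

def avoids_first_person(message: str) -> bool:
    """Return True if a bot message contains no first-person pronouns."""
    return FIRST_PERSON.search(message) is None

# Unicode skin-tone modifiers (Fitzpatrick types 1-2 through 6).
SKIN_TONES = ["\U0001F3FB", "\U0001F3FC", "\U0001F3FD", "\U0001F3FE", "\U0001F3FF"]

def tone_varied(base_emoji: str, rng: random.Random) -> str:
    """Append a randomly chosen skin-tone modifier to a modifiable emoji."""
    return base_emoji + rng.choice(SKIN_TONES)
```

A check like `avoids_first_person` could run over every scripted reply before it ships, keeping the “no ‘I’” rule from slipping in as conversations are edited.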

F’xa looks at three areas where AI bias can appear – search engines, hiring algorithms and voice assistants.

It takes people on a journey through any or all of these pathways, and finishes with suggestions about what they can do about AI bias, as well as offering a chance to reflect on which calls to action resonated most.
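The journey described above can be pictured as a small conversation graph. The sketch below is an assumption about how such pathways could be modelled; the node names and structure are illustrative, not F’xa’s actual code.

```python
# Hypothetical model of F'xa's three pathways as a directed graph.
# Every pathway converges on a "what you can do" step, then a reflection.
PATHWAYS = {
    "start": ["search_engines", "hiring_algorithms", "voice_assistants"],
    "search_engines": ["what_you_can_do"],
    "hiring_algorithms": ["what_you_can_do"],
    "voice_assistants": ["what_you_can_do"],
    "what_you_can_do": ["reflect"],
    "reflect": [],  # end of the conversation
}

def walk(choice: str) -> list[str]:
    """Follow a chosen pathway from the start node to the reflection step."""
    route = ["start", choice]
    node = choice
    while PATHWAYS[node]:
        node = PATHWAYS[node][0]
        route.append(node)
    return route
```

Converging every branch on the same closing steps is what lets the experience end consistently with calls to action and a moment of reflection, whichever topic a person picks.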


The F’xa project was well received by attendees of EY’s Innovation Realized conference, and it may appear at other EY events during the year.

F’xa has also begun to attract press attention, with features in the Evening Standard, Dazed, Vox, The Next Web and It’s Nice That.

To have a conversation with F’xa – go to https://f-xa.co/.

For more information go to http://about.f-xa.co/.

