In a recent interview with Human Resources Online, Violetta Shishkina, CEO of Fe/male Switch and co-founder of CADChain, elaborated on the topic, saying:
I grew up in Russia, where a woman is supposed to get married and spend her days in the kitchen or working a lifestyle job. Now, in the Netherlands, I am a CEO of a legal tech startup, yet most people are shocked to hear that I didn’t come here to get married but to join a start-up incubator. Bias is inherent to all individuals and the best way to fight it is with a systematic approach. Acknowledging that we are biased is the first step to fixing the issue.
Being an avid entrepreneur, I am always looking for scalable solutions to problems: in this case, I simply decided to create a nonprofit game for future entrepreneurs that will not only increase the number of female entrepreneurs, but will also help collect data for a bias score algorithm. Anyone will be able to verify how biased their written speech is.
Violetta is a renowned and respected female leader in the tech industry, with experience that is second to none. She holds an MBA and four additional master's degrees. She is a professor of economics, an educator, and a PR specialist, certified in data science and AI, with a background in neuroscience and psychology.
How the Bias Score Algorithm helps to give data ownership back to women
Long story short, the Fe/male Switch and CADChain team has come up with the idea of launching the first initiative that addresses gender inequality within textual data sources and gives ownership to the women who create the underlying datasets or algorithms.
Enter our Bias Score, or BS for short: a tool that lets anyone check how biased their articles and datasets are.
Although many companies have been experimenting with similar AI developments, most datasets are built from a male perspective or derived solely from data generated by men. The result is misinformation and a lack of transparency: how different genders experience events, texts, speech, and other forms of data is seldom taken into account.
Our algorithm evaluates articles, essays, educational content, and business pitches, generating a gender bias score and indicating what needs to be fixed to improve the text.
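The actual Bias Score algorithm is not public, so as a rough illustration of the idea, here is a minimal sketch of a naive lexicon-based scorer. The word lists and the scoring formula are assumptions for demonstration only; a production system would be far more sophisticated.

```python
# Illustrative sketch only: a naive lexicon-based gender-bias scorer.
# The real Bias Score algorithm is not public; the lexicons and the
# formula below are hypothetical.
import re
from collections import Counter

# Hypothetical lexicons of gendered terms.
MASCULINE = {"he", "him", "his", "man", "men", "chairman", "businessman"}
FEMININE = {"she", "her", "hers", "woman", "women", "chairwoman", "businesswoman"}

def bias_score(text: str) -> float:
    """Return a score in [-1, 1]: negative leans masculine, positive leans
    feminine, and 0 means balanced (or no gendered terms found)."""
    words = Counter(re.findall(r"[a-z']+", text.lower()))
    m = sum(words[w] for w in MASCULINE)
    f = sum(words[w] for w in FEMININE)
    total = m + f
    return 0.0 if total == 0 else (f - m) / total

print(bias_score("He said the chairman agreed with him."))  # -1.0 (fully masculine)
```

A real scorer would also flag the offending phrases and suggest rewrites, as the article describes, rather than only emitting a number.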
The second problem we address with this algorithm is data ownership. In a world where large corporations dictate how they use your data and do not compensate you, we are losing control over our data.
To tackle this issue, we gamify the creation, annotation, and validation of data and reward contributors for these tasks. Our algorithm can serve as a stamp of approval for a company's data usage and provide transparent insight into its bias scores.
The result of this project will be twofold: technology that tracks the ownership – while respecting privacy – of female-owned datasets, and an algorithm that scores text on how biased it is. We are cutting the BS out of data.
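The ownership-tracking technology is not described in detail, but one common privacy-respecting approach is to record a content fingerprint rather than the data itself. The sketch below illustrates that idea; the record format and field names are assumptions, not the project's actual design.

```python
# Illustrative sketch only: recording dataset ownership via a content
# fingerprint, so ownership can be proven later without storing or
# revealing the underlying data. The record format is hypothetical.
import hashlib

def ownership_record(dataset_bytes: bytes, owner_id: str) -> dict:
    """Hash the dataset and bind the digest to its owner."""
    fingerprint = hashlib.sha256(dataset_bytes).hexdigest()
    return {"owner": owner_id, "fingerprint": fingerprint}

record = ownership_record(b"name,pitch\nAda,legal tech", "creator-42")
print(record["fingerprint"][:12])  # short preview of the digest
```

Because SHA-256 is one-way, publishing the fingerprint proves a specific dataset existed under a given owner without exposing any of its contents.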