We apply data analytics and machine learning to create a Bias Score (BS) algorithm. It addresses gender inequality in textual data sources and gives ownership to the women who create the underlying datasets or algorithms.
Misinformation and a lack of transparency are fuelled by ignoring how different genders experience events, texts, speech, and other forms of data. Most datasets are produced from a male perspective, embedding gender bias in almost every form of written content.
Our algorithm will serve as a stamp of approval for a company’s data usage, providing transparent, privacy-respecting insights into its Bias Score.
We are a team of experts in journalism, intellectual property, law, data science, AI, ML, blockchain, psychology, business, and linguistics. Our team is gender-balanced and includes non-binary members, and we openly value neurodiversity.
Bias Score is part of the MediaFutures project.
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 951962.