For thousands of years, medicine has operated under the assumption that white, male bodies represent humanity as a whole. The problem is perhaps not so much whom to consider or not consider, but that, as a result of this recurring disregard for female bodies, we have a great deal of data and almost all of it is biased. What is the result of a lifetime of collecting only the data that seemed to matter? The gender data gap is a direct consequence of the assumption that what is masculine is universal.
Talking about products whose design process was biased by a lack of information is already a matter of exclusion; talking about designing advances in the field of health on top of such a large bias is not mere exclusion, it is denying health and physical integrity to many people.
Without female researchers, there will be no data
When Tania Boler set out to create the Elvie pelvic floor trainer in 2013, there was almost no data on the variability of vaginas. “There was almost nothing, especially when you compare it to the thousands of penis studies.” That absence of data made it hard to demonstrate that her idea was worth funding.
The book Invisible Women, which we have already discussed on several occasions, exposes how healthcare “systematically discriminates against women, leaving them chronically misunderstood, mistreated and misdiagnosed.”
Again, the risk I see lies in the path medicine is taking hand in hand with artificial intelligence. What answers can an online diagnostic tool built on biased data give us? Stories to keep you awake at night.
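To make the point concrete, here is a minimal, purely hypothetical sketch (assuming Python with NumPy and scikit-learn; the “symptom score” and the sex difference in how the condition presents are invented for illustration, not taken from any real study). A model trained on a dataset where women are barely represented learns the male pattern and misses the female cases:

```python
# Toy illustration only: a classifier trained on data in which female
# patients are heavily under-represented, evaluated separately by sex.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)

def make_patients(n, sex):
    # Hypothetical "symptom score": the same condition is assumed to
    # present differently in men and women.
    has_condition = rng.integers(0, 2, n)
    shift = 2.0 if sex == "male" else -1.0
    score = has_condition * shift + rng.normal(0, 1, n)
    return score.reshape(-1, 1), has_condition

# Biased training set: 95% male, 5% female.
X_m, y_m = make_patients(950, "male")
X_f, y_f = make_patients(50, "female")
model = LogisticRegression().fit(np.vstack([X_m, X_f]),
                                 np.concatenate([y_m, y_f]))

# Evaluate on balanced, unseen patients.
X_mt, y_mt = make_patients(1000, "male")
X_ft, y_ft = make_patients(1000, "female")
print("recall, male patients:  ", recall_score(y_mt, model.predict(X_mt)))
print("recall, female patients:", recall_score(y_ft, model.predict(X_ft)))
# The model catches most male cases and misses most female ones:
# biased data in, biased answers out.
```

The numbers are made up, but the mechanism is exactly the one at stake: if the data fed into these systems comes almost entirely from male bodies, the tool will answer well for men and fail quietly for everyone else.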
Artificial intelligence in healthcare: solution or problem?
When Apple ostentatiously launched its health monitoring system in 2014, it boasted of offering a “comprehensive” health tracker. It could monitor blood pressure, count daily steps, track blood alcohol level, and even log copper and molybdenum intake. Yet among absolutely everything included in the system, nobody saw it as important to track the menstrual cycle.
We could call it an oversight if women’s needs were not forgotten so consistently. When Apple launched Siri, it didn’t forget to make her voice sound compliant and feminine, but it did forget to make the assistant able to help us if we said “I’ve been raped.” It could, however, help us in the event of a heart attack. I suppose this kind of extreme emergency, one that could be handled very efficiently just by asking a robot to call the emergency services, was not taken into account because, once again, no one thought of anyone other than himself.