We are all on a spectrum
I love that humanity is a spectrum. From sexuality to gender identity, we all fall along a range that allows us to be our authentic selves.
The most beautiful way I have seen this reflected by society is by the Indigenous peoples of Canada, who refer to LGBTQ+ people as two-spirit. The term describes a person who identifies as having both a masculine and a feminine spirit. Although it is an umbrella term, I like that it simply reflects how we all naturally exist with different energies.
Modern societies and medicine have only recently started to recognise the full range of sexuality and gender identity. Artificial intelligence (AI), on the other hand, is lagging even further behind. I'd like to explore why, and explain the healthcare limitations we are trying to overcome.
Training AI requires specific data
To use AI in healthcare, we first have to train the system. Healthcare AI learns much as a doctor does: by being shown a wealth of epidemiological data (epi-data) about the effects of disease, lifestyle, age, and symptoms, mirroring the way a doctor receives medical training and on-the-job experience.
This means that an AI model can only be as good as the data that’s used to train it. The challenge becomes even more complex when training AI to help treat transgender patients, as there are many different types of treatment and medications (such as hormone therapies) to take into account.
Unfortunately, for much of recent history, the full gender spectrum has not been recognised in mainstream society, and the medical community hasn't yet collected much data on it. This means that the majority of available data is cisgender and binary, and medical literature on transgender and non-binary people is limited. To train an AI doctor to serve transgender and non-binary people well, we need a diversity of data that isn't yet available.
One alternative is using neutral AI models to help diagnose diseases. This would mean not collecting information about a patient's sex assigned at birth or gender identity. These are two distinct concepts: sex assigned at birth refers to biological characteristics, while gender identity refers to social and cultural ones.
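To make that distinction concrete, a patient record that respects it would store the two attributes separately, with neither inferred from the other. This is a hypothetical sketch (the class and field names are illustrative, not our actual data model):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PatientProfile:
    """Illustrative patient record: two distinct, independently optional fields."""
    sex_assigned_at_birth: Optional[str] = None  # biological characteristics, e.g. "female"
    gender_identity: Optional[str] = None        # social/cultural identity, e.g. "non-binary"

# Neither field implies the other, and a patient may choose to leave either blank.
patient = PatientProfile(sex_assigned_at_birth="female", gender_identity="non-binary")
```

A "neutral" model would ignore both fields; a sex-aware model would read `sex_assigned_at_birth` but should never guess it from `gender_identity`.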
Unfortunately, when neutral AI models have been tested, they have proved less accurate. This is because risk factors and disease prevalence differ substantially depending on whether a person is biologically male or female. So although neutral AI models are available, the resulting information is not good enough to support the provision of safe healthcare for transgender and non-binary people.
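A simple example of why a single "neutral" rule can fall short is anaemia screening: the WHO haemoglobin thresholds differ by sex (below 13 g/dL for adult males, below 12 g/dL for adult females). The toy sketch below is illustrative only, not our diagnostic engine, and the compromise threshold of 12.5 g/dL is an assumption chosen to show the gap:

```python
# WHO anaemia thresholds (haemoglobin, g/dL) differ by sex assigned at birth.
ANAEMIA_THRESHOLD = {"male": 13.0, "female": 12.0}

def is_anaemic_sex_aware(haemoglobin: float, sex: str) -> bool:
    """Compare haemoglobin against the threshold for the patient's sex."""
    return haemoglobin < ANAEMIA_THRESHOLD[sex]

def is_anaemic_neutral(haemoglobin: float, threshold: float = 12.5) -> bool:
    """A 'neutral' rule must pick one compromise threshold for everyone."""
    return haemoglobin < threshold

# A male patient with haemoglobin of 12.5 g/dL is anaemic by WHO criteria...
print(is_anaemic_sex_aware(12.5, "male"))   # → True
# ...but the sex-blind rule misses the case entirely.
print(is_anaemic_neutral(12.5))             # → False
```

The same pattern repeats across medicine wherever reference ranges or risk factors are sex-specific, which is why removing the feature degrades accuracy rather than making the model fairer.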
Our way forward until the data catches up
The healthcare limitations of AI are a really important issue to us, and something we want to be honest and respectful about. As more data becomes available, our team is working to play its part in ensuring that AI can recognise and fully support the healthcare needs of transgender and non-binary people.
In the interim, we are looking at the most appropriate ways of adapting our service for LGBTQ+ patients without compromising on safety, starting with the app experience. When we ask a question about sex assigned at birth or gender identity, we are adding optional explanations of why we ask. And if you enter a part of the app that uses AI that is not clinically validated for you, we explain its limitations and channel you directly to our clinicians so that your healthcare needs can still be addressed.
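The routing step described above can be sketched as a simple decision. The function and return values here are hypothetical, not our production code:

```python
def route_patient(feature_uses_ai: bool, ai_validated_for_patient: bool) -> str:
    """Decide whether a patient stays in-app or is routed to a clinician.

    Hypothetical sketch: if the AI behind a feature is not clinically
    validated for this patient, explain the limitation and connect them
    directly to a human clinician instead.
    """
    if feature_uses_ai and not ai_validated_for_patient:
        return "explain limitations, then connect to a clinician"
    return "continue in-app"

print(route_patient(feature_uses_ai=True, ai_validated_for_patient=False))
# → explain limitations, then connect to a clinician
```

The key design choice is that safety is the default: uncertainty about validation routes to a human, never to the model.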
Even cutting-edge technology doesn't always keep up with social progress. As more research is done and more data becomes available to both healthcare professionals and data scientists, we at Babylon will continue working to find new ways for our technology to meet the needs of all patients.
Try Babylon today
Babylon offers high-quality, comprehensive healthcare, 24/7. Let us help:
Learn more about our services