Apps designed for female health monitoring are exposing users to unnecessary privacy and safety risks through their poor data handling practices, according to new research from UCL and King's College London.
The study, presented at the ACM Conference on Human Factors in Computing Systems (CHI) 2024 on 14 May, is the most extensive evaluation of the privacy practices of female health apps to date. The authors found that these apps, which handle medical and fertility data such as menstrual cycle information, are coercing users into entering sensitive information that could put them at risk.
The team analyzed the privacy policies and data safety labels of 20 of the most popular female health apps available in the UK and USA Google Play stores, which are used by hundreds of millions of people. The analysis revealed that in many instances, user data could be subject to access from law enforcement or security authorities.
Only one app that the researchers reviewed explicitly addressed the sensitivity of menstrual data with regard to law enforcement in its privacy policy and made efforts to safeguard users against legal threats.
In contrast, many of the pregnancy-tracking apps required users to indicate whether they had previously miscarried or had an abortion, and some apps lacked data deletion functions, or made it difficult to remove data once entered.
Experts warn this combination of poor data management practices could pose serious physical safety risks for users in countries where abortion is a criminal offence.
Female health apps collect sensitive data about users' menstrual cycle, sex lives, and pregnancy status, as well as personally identifiable information such as names and email addresses.
"Requiring users to disclose sensitive or potentially criminalizing information as a pre-condition to deleting data is an extremely poor privacy practice with dire safety implications. It removes any form of meaningful consent offered to users.
The consequences of leaking sensitive data like this could result in workplace monitoring and discrimination, health insurance discrimination, intimate partner violence, and criminal blackmail; all of which are risks that intersect with gendered forms of oppression, particularly in countries like the USA where abortion is illegal in 14 states."
Dr Ruba Abu-Salma, lead investigator of the study from King's College London
The research revealed stark contradictions between privacy policy wording and in-app features, as well as flawed user consent mechanisms, and covert gathering of sensitive data with rife third-party sharing.
Key findings included:
- 35% of the apps claimed not to share personal data with third parties in their data safety sections but contradicted this statement in their privacy policies by describing some level of third-party sharing.
- 50% provided explicit assurance that users' health data would not be shared with advertisers but were ambiguous about whether this also included data collected through using the app.
- 45% of privacy policies outlined a lack of responsibility for the practices of any third parties, despite also claiming to vet them.
Many of the apps in the study were also found to link users' sexual and reproductive data to their Google searches or website visits, which researchers warn could pose a risk of de-anonymisation for the user and could also lead to assumptions about their fertility status.
Lisa Malki, first author of the paper and former research assistant at King's College London, who is now a PhD student at UCL Computer Science, said: "There is a tendency by app developers to treat period and fertility data as 'another piece of data' as opposed to uniquely sensitive data which has the potential to stigmatise or criminalise users. Increasingly risky political climates warrant a greater degree of stewardship over the safety of users, and innovation around how we might overcome the dominant model of 'notice and consent' which currently places a disproportionate privacy burden on users.
"It is critical that developers start to acknowledge unique privacy and safety risks to users and adopt practices which promote a humanistic and safety-conscious approach to developing health technologies."
To help developers improve the privacy policies and practices of female health apps, the researchers have developed a resource that can be adapted and used to manually and automatically evaluate female health app privacy policies in future work.
The team are also calling for critical discussions on how these types of apps – together with other wider categories of health apps such as fitness and mental health apps – deal with sensitive data.
Dr Mark Warner, an author of the paper from UCL Computer Science, said: "It is important to remember how essential these apps are in helping women manage different aspects of their health, and so asking them to delete these apps is not a responsible solution. The responsibility is on app developers to ensure they are designing these apps in a way that considers and respects the unique sensitivities of both the data being directly collected from users, and the data being generated through inferences made from that data."
Journal reference:
Malki, L. M., et al. (2024). Exploring Privacy Practices of Female mHealth Apps in a Post-Roe World. CHI '24: Proceedings of the CHI Conference on Human Factors in Computing Systems. doi.org/10.1145/3613904.3642521.