Our mental health is shaped by many interacting social, cultural, environmental and commercial factors, at both an individual and a collective level. For instance, the housing we live in, the relationships we have, and our level of education all play a role in our mental health.
Our mental health is largely shaped by the circumstances in which we are born, grow, and age, and these circumstances are in turn often heavily influenced by the commercial environment in which we live.1
In this context, the actions of corporations and the strategies they adopt to promote their products or services have a significant impact on our health. It is therefore important that the power of corporations is subject to appropriate checks and balances, so that their behaviours align more closely with the public good.2 This is particularly important to the concept of mental health security – protection from threats to our mental health.
Health security is understood both collectively – reducing the vulnerability of communities to harms to health – and individually – access to safe and effective products, services and technologies.3 In other words, public education alone will not be enough to improve people’s health; we must also take steps to protect people from harmful factors over which they have no personal control.
One industry growing at a significant rate year-on-year is the global app industry, projected to be worth an estimated $6.3 trillion by 2021.4 A significant subset of this industry is image-editing apps, which people can download from application stores (App Store, Google Play) to edit their photos. Many of these apps are free to download but encourage in-app purchases to unlock extra features.
We consider it vital to take action to understand how these apps influence people’s body image and their mental health. This industry is growing largely unchecked, in a space where the potential to negatively affect people’s lives is significant.
Taking action
Based on this accumulating evidence, we propose the following actions:
Body positivity and kindness activists, interested organisations and individuals should engage with the #EverydayLookism campaign
Negative comments about other people’s bodies matter. When we shame bodies, we shame people. These are lookist comments. We no longer put up with sexist comments; we don’t need to keep putting up with lookist comments. Sharing your lookism stories shows how common lookism is, calls it out, and says it’s not OK. Visit the website to find out more and use #EverydayLookism and #BeBodyKind on social media.
Google Play and App Store should update their guidelines for developers to explicitly include ‘mental health’ in the range of harms that are unacceptable
We recognise there will be issues regarding liability. However, guidance on the definition of mental health can take this into account, and it can be co-produced with the Mental Health Foundation and other experts. We recommend that, at a minimum, the guidelines are updated to state clearly that apps should not promote images that are outright lookist, shaming, or triggering of past trauma or eating disorders.
Google Play and App Store should make it mandatory that all body and face image-editing apps are rated PEGI 12/16 and 13+ respectively, to ensure that children and young people below the minimum legal age for having a social media account (13 years old) are not using these apps.
All in-app purchases for additional features should be restricted to people over the age of 18, to curb predatory promotion. Currently, only a handful of these apps are restricted in this way; most have no age restrictions, often allowing children as young as five to download and use them.
Research should focus on understanding the features of image-editing apps that are most harmful to body satisfaction and mental health.
Research can also take an ethical perspective, defining more clearly the line between image-editing apps or features that are acceptable and those that are unacceptable because of the high risk they pose to mental health.
Researchers and experts who design services should consider developing new social media literacy training for children and young people.
All training and other programmes should employ a co-production approach, involving children and young people in their development, as well as parents and carers. Relatively little applied research looks at parents in relation to body image and the modelling of positive behaviours. Given that parents have a significant influence on the way children view their bodies, parents need to be included more in the discussion about image-editing apps.
Everyone should be more aware that if they see an advert in a magazine, on television or online that they think presents an unhealthy body image as aspirational, they can complain to the Advertising Standards Authority.
This includes online or other predatory advertising in relation to image-editing apps. Advertisements that promote these apps to more vulnerable groups, for instance young people belonging to BAME communities, warrant greater scrutiny and investigation.
Body image and mental health
Body image issues can affect all of us at any age and have a direct impact on our mental health. However, there is still a lack of much-needed research and understanding in this area.