Bumble bans body-shaming comments
28th January 2021

Unfortunately, prejudices can play out on dating apps.

Whether evident in profile bios or in exchanges with other users, some people have been subjected to fatphobic, racist, ableist and other derogatory language.

To put a stop to this, dating app Bumble has vowed to ban any users who make body-shaming comments.

The women-first app has updated its terms and conditions to explicitly ban unsolicited and derogatory comments made about someone’s appearance, body shape, size or health.

This includes language that can be deemed fatphobic, ableist, racist, colourist, homophobic or transphobic. 

From today, Bumble profiles that include body-shaming language will be moderated, as well as any body-shaming comments that are made through the app’s chat function. 

The app uses automated safeguards to detect comments and images that go against its guidelines and terms and conditions, which are then escalated to a human moderator to review. 

People who use body-shaming language in their profile or through the app’s chat will first receive a warning for their inappropriate behaviour.

If there are repeated incidents or particularly harmful comments, Bumble will permanently remove the person from the app. 

The app is also offering people the chance to learn more about why their comments are offensive.

Moderators will share resources that are intended to help the individual learn how to change their behaviour to be less harmful to others in the future.

Bumble is encouraging its community to report bad behaviour through its in-app reporting tool to help enforce these guidelines.

The move comes after new research from Bumble showed how people in the UK experience body-shaming and the impact it has on their lives.

The stark findings showed that nearly one in four (23%) Brits have been body-shamed on a dating app or on social media.

Half of the people surveyed said that someone they have dated has made an unsolicited comment about their body either in person or online.

Naomi Walkland from Bumble said: ‘We have always been clear on our mission to create a kinder, more respectful and more equal space on the internet. Key to this has always been our zero-tolerance policy for racist and hate-driven speech, abusive behaviour and harassment.’

In 2019, Bumble introduced Private Detector, a feature that uses AI to automatically detect and blur unsolicited nude images. It then alerts the recipient who can choose to view, delete or report the image. 
