WHATSAPP has announced a new feature designed to boost user privacy, rivalling similar tools already offered on Facebook and Instagram.

Users will now be able to share photos and videos that disappear from chats after being viewed once.

WhatsApp is part of Mark Zuckerberg’s social media empire, which includes Facebook and Instagram, and is used by millions worldwide.

The global messaging service will begin rolling out the new ‘View Once’ feature this week.

The update will allow users to share photos and videos that will disappear once opened by the recipient.

The company says the tool is a safe way to share sensitive information without it being saved or repeatedly viewable.

A number of other social platforms, most notably Snapchat and Instagram, already allow users to send messages which disappear after being viewed.

Facebook has also introduced ‘Stories’, which works on a similar premise.

“While taking photos or videos on our phones has become such a big part of our lives, not everything we share needs to become a permanent digital record,” the messaging firm said in a blog post.


“On many phones, simply taking a photo means it will take up space in your camera roll forever.

“That’s why today we’re rolling out new View Once photos and videos that disappear from the chat after they’ve been opened, giving users even more control over their privacy.

“For example, you might send a View Once photo of some new clothes you’re trying on at a store, a quick reaction to a moment in time, or something sensitive like a wifi password.”

The new feature has been criticised by some, however, with children’s safety charity the NSPCC warning it could put young people at even greater risk.

“NSPCC research shows 10% of child sex crimes on Facebook-owned apps take place on WhatsApp but, because they can’t see the content, it accounts for less than 2% of the company’s child abuse reports to police,” said senior child safety online policy officer Alison Trew.

“This View Once feature could put children at even greater risk by giving offenders another tool to avoid detection and erase evidence, when efforts to combat child sexual abuse are already hindered by end-to-end encryption.

“Facebook must spell out how they’ve mitigated these risks, but this is seemingly another example of tech firms failing to consider the protection of children when balancing the rights of users to privacy and safety.

“Regulation must put this right by embedding the principle of safety by design in companies to ensure they consider how changes to products and services will affect their response to the child abuse threat.”