Meta delays Facebook and Instagram encryption plans until 2023

Plans to roll out end-to-end encryption on Facebook and Instagram have been delayed amid a row over child safety.

Meta – as Facebook’s parent company is now called – said messaging encryption on the apps would now come in 2023.

End-to-end encryption means only the sender and receiver can read messages; not even law enforcement or Meta can.

However, child protection groups and politicians have warned that it could hamper police investigating child abuse.

The National Society for the Prevention of Cruelty to Children (NSPCC) has claimed that private messaging “is the front line of child sexual abuse”.

UK Home Secretary Priti Patel has also criticized the technology, saying earlier this year that it could “severely hamper” law enforcement in pursuing criminal activity, including online child abuse.

Privacy v Protection

End-to-end encryption secures data as it moves between phones and other devices by scrambling or encrypting it.

Typically, the only method to read the message is to gain physical access to the device that sent or received it, which must be unlocked.
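The principle can be illustrated with a toy key exchange. The sketch below is a minimal Python illustration only, using small Diffie-Hellman-style parameters and a hash-derived keystream; real messengers such as WhatsApp use the far stronger Signal protocol (X25519 key agreement with authenticated ciphers), not anything resembling this demo:

```python
import hashlib
import secrets

# Toy sketch only -- real end-to-end encryption (e.g. WhatsApp) uses the
# Signal protocol; this just shows why a relay server cannot read messages.

P = 2**127 - 1  # a Mersenne prime, standing in for real DH parameters
G = 3

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data against a SHA-256-derived keystream (demo cipher, not secure)."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ s for b, s in zip(data, stream))

# Each device keeps a private key and publishes only the public value.
alice_priv = secrets.randbelow(P - 2) + 1
bob_priv = secrets.randbelow(P - 2) + 1
alice_pub = pow(G, alice_priv, P)  # the server relays these in the clear
bob_pub = pow(G, bob_priv, P)

# Both ends derive the same shared secret; the relaying server never sees it.
shared_a = pow(bob_pub, alice_priv, P)
shared_b = pow(alice_pub, bob_priv, P)
assert shared_a == shared_b

key = hashlib.sha256(shared_a.to_bytes(16, "big")).digest()
ciphertext = keystream_xor(key, b"hello from Alice")  # all the server observes
plaintext = keystream_xor(key, ciphertext)            # only Bob can recover this
assert plaintext == b"hello from Alice"
```

Because only the public values ever travel over the network, a server relaying the messages sees only ciphertext, which is the property at the heart of the policy debate.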

The popular messaging application WhatsApp, which is also owned by Meta, uses the technology by default; however, the other apps made by the business do not.

The issue is also a concern for Ofcom, the communications regulator tasked with enforcing the Online Safety Bill, which is expected to become law around 2023 and imposes a duty of care on tech companies to protect children from harmful content and prevent abuse from occurring on their platforms. Ofcom’s chief executive, Melanie Dawes, told the Times on Saturday that social media companies should ban adults from directly messaging children or face criminal sanctions.

To obtain a breakdown of the platforms used to perpetrate sexual offenses against minors last year, the NSPCC issued Freedom of Information requests to 46 police forces in England, Wales, and Scotland.

The results showed:

Police received more than 9,470 reports of internet child sex offenses and child sex abuse incidents.

52% of these happened on apps owned by Facebook.

More than one-third of the incidents happened on Instagram, 13% on Facebook and Messenger, and very few on WhatsApp.

This has raised concerns that Meta’s intentions to add encryption to popular messaging apps like Facebook Messenger and Instagram would prevent the bulk of offenders from being found.

About The Safety Concerns:

The head of safety at Facebook and Instagram’s parent company, Meta, announced that the encryption process would take place in 2023. The company had previously said the change would happen in 2022 at the earliest.

“We’re taking our time to get this right and we don’t plan to finish the global rollout of end-to-end encryption by default across all our messaging services until sometime in 2023,” Antigone Davis wrote in the Sunday Telegraph. “As a company that connects billions of people around the world and has built industry-leading technology, we’re determined to protect people’s private communications and keep people safe online.”

According to the NSPCC, encrypting messages by default could make it easier for child abuse imagery or online grooming to proliferate.

However, proponents contend that encryption safeguards users’ privacy and thwarts snooping by both governments and dishonest hackers.

Mark Zuckerberg, the CEO of Meta, provided such justifications when he revealed Facebook’s encryption intentions in 2019.

‘Getting it Right’

Ms. Davis said the delay until 2023 is the result of the firm taking its time “to get this right.”

She also listed several additional precautionary steps the business has already taken, such as:

“Proactive detection technology” that looks for unusual patterns of behavior, such as a person who frequently creates new profiles or messages large numbers of strangers

Putting users under the age of 18 into private or “friends only” profiles by default, and preventing adults from messaging them if they are not already connected

Providing in-app advice on how to avoid inappropriate contact with young people

Andy Burrows, the NSPCC’s director of online child safety policy, welcomed Meta’s postponement.

“They shouldn’t move through with these measures until they can show they have the technologies in place to assure children will not be at an increased risk of abuse,” he said.

“Facebook must now demonstrate that they are serious about the child safety risks and not just playing for time while they weather difficult headlines, more than 18 months after an NSPCC-led global coalition of 130 child protection organizations raised the alarm over the danger of end-to-end encryption,” he added.
