
Facebook Messenger Apps Under Review, Following Cambridge Analytica Data Disaster

By Tinuiti Team

Following the Facebook & Cambridge Analytica data scandal (which continues to grow more complex by the day), Facebook announced they will be reviewing their platform policies, including their Messenger / chatbot program.

UPDATE (5/1/2018): Facebook has announced that app review has reopened. You can read Facebook’s update here.

Credit: www.newsroom.fb.com

On March 21st, Mark Zuckerberg issued a statement to millions of his Facebook followers regarding Cambridge Analytica, the firm now under investigation for misusing the private data of millions of Facebook users:

“I want to share an update on the Cambridge Analytica situation — including the steps we’ve already taken and our next steps to address this important issue.”

“We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you. I’ve been working to understand exactly what happened and how to make sure this doesn’t happen again.”

“The good news is that the most important actions to prevent this from happening again today we have already taken years ago. But we also made mistakes, there’s more to do, and we need to step up and do it.”

You can read Zuckerberg’s entire post here.

Facebook Messenger Apps & Chatbots Under Review

As it relates to Messenger, Facebook is currently pausing app review, which means no new bots or experiences will be added to the platform while they revisit their current policies.

In a recent announcement, Ime Archibong, VP of Platform Partnerships at Facebook, said:

“To maintain the trust people place in Facebook when they share information, we are making some updates to the way our platform works. Last week, we announced a number of changes that impact the Facebook developer community.”

“These are critical steps that involve reviewing developers’ actions for evidence of misuse, implementing additional measures to protect data, and giving people more control of their information.”

Ime Archibong, VP Platform Partnerships, Facebook

As Ime mentioned in his post, in the coming days Facebook will:

1) Conduct an in-depth review of their platform: Facebook will investigate all apps that had access to large amounts of information before they changed their platform in 2014 to reduce data access, and will conduct a full audit of any app with suspicious activity.

2) Inform people if an app is removed for data misuse: If Facebook finds developers that misused personally identifiable information, they will ban them from their platform. Moving forward, if they remove an app for misusing data, they will notify everyone who used it.

3) Encourage people to manage the apps they use: Facebook already shows people which apps their accounts are connected to and lets them control what data they’ve permitted those apps to use. In the coming month, they are going to make these choices more prominent and easier to manage.

4) Require heightened terms for business-to-business applications: All developers that build applications for other businesses will need to comply with rigorous policies and terms, which Facebook will share in the coming weeks.

5) Reward people who find vulnerabilities: Facebook’s bug bounty program will expand so that people can also report if they find misuses of data by app developers. They are beginning work on this and will have more details as they finalize the program updates in the coming weeks.

How will the pause of Facebook Messenger Apps impact brands?

Although this essentially pauses any new Messenger launches for brands and businesses not already connected to the chatbot platform, Sarah Sanchez, Manager of Performance Social at CPC Strategy, says she is optimistic about Facebook’s future:

Sarah Sanchez, Manager, Performance Social at CPC Strategy

“While this is a short-term loss for companies, the long-term desired effect for Facebook is a more secure platform.”

“I believe the opportunity for chatbots and other apps will be reopened once they’re deemed in compliance with Facebook’s updated policy standards. In light of the Cambridge Analytica scandal, advertisers, companies, and individuals should all welcome a more secure platform where data won’t be traded freely.”

What if a brand is already using Facebook Messenger / Chatbots?

According to Facebook, all existing Messenger experiences will continue to function as is.

A follow-up announcement from ManyChat (a visual bot builder for Facebook Messenger) states:

“If you want to connect a new Facebook page to ManyChat, you’ll have to wait until Facebook updates their policies. Until they do so, you won’t be able to connect new pages to your account. This “pause” affects all bot building platforms including ManyChat. While Facebook hasn’t shared a definitive timeline for this update, we’ve made a bot to notify you when you can start building new bots again.”


Breach of Trust Timeline: Facebook & Cambridge Analytica

Like you, we are also following the whirlwind of confusion surrounding the Facebook & Cambridge Analytica scandal.

According to reports, here’s a brief timeline of events to help you better understand the current status of the social media data disaster:

2007: Facebook enables apps to better connect “friends”

Facebook enables users to log into apps and share who their friends are, as well as some information about them. (This includes a calendar to show your friends’ birthdays, maps to show where your friends live, and an address book to show their pictures.)

2013: Kogan’s quiz app harvests data from millions of users

A Cambridge University researcher named Aleksandr Kogan created a personality quiz app. It was installed by 300,000+ people who shared their data as well as some of their friends’ data. According to Zuckerberg, given the way Facebook’s platform worked at the time, “Kogan was able to access tens of millions of their friends’ data”.

2014: Facebook limits app data access

To halt abusive apps, Facebook announced that it was changing the entire platform to “dramatically limit the data apps could access.” This would prevent apps (like Kogan’s) from being able to harvest friends’ data unless those friends had authorized the app themselves.

2015: Kogan shares data with Cambridge Analytica

According to The Guardian, Kogan shared data from his app with Cambridge Analytica. This was against Facebook’s policy and Kogan’s app was banned from the platform. Facebook demanded Kogan and Cambridge Analytica “formally certify that they had deleted all improperly acquired data.” According to initial reports, they did provide these certifications.

2018: Political scandal & legal penalties

In March, Facebook learned that Cambridge Analytica may not have deleted the data as they had certified. There are now rising reports that Cambridge Analytica, a political data firm hired by President Trump’s 2016 election campaign, “gained access to private information on more than 50 million Facebook users” and that “the firm offered tools that could identify the personalities of American voters and influence their behavior.”

As of March 26, the Federal Trade Commission (FTC) has confirmed it’s investigating Facebook’s  privacy practices in response to the data scandal.

In a statement regarding reported concerns about Facebook’s privacy practices, Tom Pahl, acting director of the Federal Trade Commission’s Bureau of Consumer Protection, said:

“The FTC takes very seriously recent press reports raising substantial concerns about the privacy practices of Facebook. Today, the FTC is confirming that it has an open non-public investigation into these practices.”

To read more from the FTC statement, click here.

If Facebook is found guilty of breaching its 2011 consent agreement to “safeguard users’ personal information,” the FTC could fine the company up to $40,000 per violation, per day (Bloomberg).

Multiply that figure by the tens of millions of affected users and Facebook could easily be looking at hundreds of millions of dollars in fines, if not far more.
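As a rough, back-of-envelope illustration (not an official estimate), here is how quickly that per-violation figure compounds if each of the reported 50 million affected users were counted as a single violation for a single day:

```python
# Back-of-envelope sketch of the theoretical FTC exposure described above.
# Illustrative assumptions only: each of the ~50 million reportedly affected
# users counts as one violation, for one day, at the maximum penalty.

MAX_FINE_PER_VIOLATION_PER_DAY = 40_000   # USD, per the 2011 consent agreement
AFFECTED_USERS = 50_000_000               # users reportedly exposed

theoretical_exposure = MAX_FINE_PER_VIOLATION_PER_DAY * AFFECTED_USERS
print(f"Theoretical maximum exposure: ${theoretical_exposure:,}")
# -> Theoretical maximum exposure: $2,000,000,000,000
```

Any actual penalty would depend on how the FTC counts violations and over how many days, but the math shows why even conservative estimates of the potential fine are so large.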

For more information, email [email protected]
