It’s a company’s public relations nightmare.

On 17 March 2018, The New York Times and The Guardian reported that a political data firm, Cambridge Analytica, had harvested the data of more than 50 million Facebook users – a figure Facebook later revised to as many as 87 million – for the benefit of Donald Trump’s 2016 presidential election campaign.

The fallout from the resulting media frenzy was disastrous for Facebook. The company’s stock tumbled 8.5 percent by the following Wednesday’s close, wiping some US$50 billion off its market value – damage attributed primarily to viral coverage of the scandal across media outlets, as well as the subsequent #DeleteFacebook protest on Twitter.

It may become the seminal case study that demonstrates the dramatic reputational and share value impact that can be caused by ‘getting privacy wrong’.

But how did it go so wrong?

And what can your organisation do to get privacy right?

We’ll take on these questions and show you how to avoid finding your business involved in a similar crisis.

How and Why Was the Facebook Data Obtained?

It all began with a personality quiz. Back in 2007, a couple of Cambridge University psychologists from its Psychometrics Centre developed a Facebook app called myPersonality.

The purpose, according to a report in The Guardian, was to study the personality traits of users who consented.

Data obtained from the personality quiz included age and gender, and provided information that would connect personality traits with Facebook users’ ‘likes’.

The psychologists published their findings in 2013 in the Proceedings of the National Academy of Sciences, where they caught the attention of Christopher Wylie, a data expert at Cambridge Analytica.

Wylie would later become the whistle blower who brought the data mining scandal to light.

Cambridge Analytica is backed by the Mercer family, described in news reports as right-wing billionaires with an interest in swaying political outcomes. Wylie tried to obtain the right to use the myPersonality app but was unable to strike a deal.

Instead, he engaged Aleksandr Kogan, a Cambridge University psychologist, to create a new app, ‘This Is Your Digital Life’.

According to news reports, Kogan paid a small fee to about 270,000 people to install his app in their Facebook accounts and fill out a survey. By downloading the app, these users agreed to let data about them be harvested, such as their locations, job and educational histories, and Facebook page ‘likes’.

They were told that their information was for academic purposes.

What Went Wrong?

Facebook’s terms of service at the time allowed third-party app developers not only to harvest data about app users, but also to access information about all of those users’ Facebook friends, if the friends’ privacy settings allowed it.

These friends were not informed that their data was being taken. It is likely that most Facebook users were not even aware that their privacy settings could ever allow this type of harvesting.

Kogan ended up with information gathered from a whopping 50 million Facebook users. According to media reports, he then passed the data he’d obtained to Cambridge Analytica, which was allegedly a violation of Facebook policy.

Facebook says it prohibits data gathered in this manner from being sold or transferred for commercial purposes – which is allegedly what Kogan did by giving the data to a political consulting company.

The Guardian reported that Cambridge Analytica then combined this information with voter records and other data sources to create psychological profiles so they could target potential voters with more effective political advertising.

Facebook has since changed its terms of service to prevent the kind of friend-harvesting allegedly used by Kogan’s app, but there’s more to the story that continues to haunt the company from a PR standpoint.

The allegations against Facebook and Cambridge Analytica are now being investigated by privacy regulators in several countries to determine whether further regulatory action, including penalties such as fines, should be taken.

The predicament currently facing Facebook is likely to have a ripple effect in the broader data marketplace.

Companies need to be aware of what these repercussions may be to avoid a similar crisis.

Beware the Creep

What happened in the Cambridge Analytica scandal is a prime example of the risks of ‘function creep’.

In the privacy space, function creep describes the drift away from the original purpose of data collection – for example, where a customer gives a company data for one purpose and, at some point, that data is used for something the individual never anticipated.

Privacy regulators and advocates have been warning against the dangers of function creep for years. Now there is a high-profile case which powerfully illustrates the consumer backlash that could result if a company fails to guard against function creep.

From a business perspective, consumer suspicion which may follow in the wake of the Cambridge Analytica scandal could have the following impacts:

  • Consumers may avoid dealing with your company if you appear to collect unnecessary personal information, or if you are vague or unclear about proposed future uses of their data.
  • Consumers may become increasingly wary of future uses they did not anticipate, and more cautious about engaging with companies for that reason.
  • Media coverage of particularly suspect function creep may increase, putting a spotlight on commercial privacy issues.
  • Contractors, suppliers and other external clients may want further reassurance from you that your company’s privacy compliance obligations are being met.
  • Shareholder wariness about the impact of function creep or data harvesting allegations may increase, requiring greater vigilance from companies to avoid them.

Consumers have grown complacent about function creep, begrudgingly accepting it as a necessary risk of participating in a digital marketplace. The wake-up call provided by the Facebook/Cambridge Analytica scandal will likely result in more vigilance by consumers and companies – and possibly even stricter regulation by governments.

5 Steps To Avoid A Data Use Scandal

1 – Only collect the data you need.

Many privacy compliance and risk exposure issues can be addressed by collecting only the data you need to deliver your services or operate your business. Versions of this simple premise are already incorporated in numerous data protection regimes (including the Australian Privacy Principles), but it is still common for companies to collect data ‘just in case’. Customers are likely to look unfavourably on this approach in the wake of the recent data harvesting scandal.

2 – Clearly (and briefly) explain how you use data.

Keep your Privacy Policy simple and to the point. Be clear about why you are collecting data, how you intend to use it and who you will disclose it to. If you can’t set these points out clearly in a brief document, then you may need expert help to reassess your company’s data handling practices and privacy compliance measures (see step 5).

3 – Beware the creep.

Be mindful of function creep and avoid creeping too far from the original purpose of data collection. Revisit what data uses your customers would have anticipated at the point of collection whenever you are considering any new uses of the data. If your company is exploring new uses for the data it holds, a Privacy Impact Assessment can help with avoiding function creep and mitigating the types of risks encountered by Facebook.

4 – Have an ‘ambulance service’ for when things go wrong.

Having a plan for what your company will do in the event of a data breach, or another privacy PR issue, is critical. In addition to a Data Breach Response Plan, your company governance tools should include a robust Privacy Management Framework that you can rely on and consult when faced with data handling issues. A Privacy Management Framework includes analysing your current governance, systems and processes against privacy compliance requirements, and identifying and mitigating key areas of privacy risk.

5 – Consult with experts.

Peripheral Blue has the specific expertise to assist in this area, with team members who have strong backgrounds working in privacy and Freedom of Information for government regulators and departments, and in the private sector.

We offer a range of privacy consulting services including formulating Data Breach Response Plans, conducting Privacy Impact Assessments, and undertaking Privacy Audits.

Taking these 5 steps to build trust in your organisation’s data handling now will help your company avoid a similar PR and compliance crisis to that triggered by the Cambridge Analytica scandal.

There’s no time to lose. The Notifiable Data Breaches scheme commenced in Australia in February 2018, and the European Union’s General Data Protection Regulation comes into force in May 2018.


Call us today to find out how we can help you with privacy compliance and ensuring consumer and shareholder confidence in your data handling.

While the recent Facebook scandal highlights the commercial impact of getting privacy wrong, the flip side is that it further confirms the value in getting privacy right.

We can help you to do just that.


Products & Services


1300 774 788

© 2019 Peripheral Blue | All Rights Reserved | ABN 61855198272 | Privacy Policy | Terms & Conditions
