The Price of Profits: A Brief History of Facebook’s Privacy Practices

This piece examines the dark side of Meta (formerly Facebook): privacy practices that have harmed both users and shareholders. From alarming instances of data misuse to the promotion of social polarization, it traces the controversies behind Facebook’s tumultuous journey.

Facebook, founded in 2004 as an “online directory that connects people through social networks at colleges,” has grown to become the most widely used social media platform in the world. With over 2.96 billion monthly active users as of the third quarter of 2022, Facebook’s influence is undeniable. However, the company has faced significant setbacks and controversies that have impacted its reputation and raised concerns about its privacy practices.

Similar to companies like Google and TikTok, Facebook’s revenue strategy revolves around the collection, processing, and utilization of vast amounts of behavioral data. This data is used to generate predictions and deliver targeted advertisements across Facebook’s platforms, apps, and even third-party websites. Moreover, Facebook leverages this data to offer personalized experiences, connect users with people and organizations they care about, empower communication and self-expression, and facilitate content and product discovery (Facebook’s Terms of Service, 2022).

The data Facebook collects includes information willingly shared by its users, such as age, gender, contact details, social connections, education, employment, and uploaded photos and videos. Facebook also gathers metadata from user posts and devices, including location, date, time, and device type. By tracking users’ online behavior through cookies, Facebook collects further data on browsing activity, interactions, search history, and purchases, both on and off its platforms, even for individuals without Facebook accounts (About Meta Pixel, 2022).

Furthermore, Facebook employs artificial intelligence (AI) to infer additional data from users’ online behavior, such as likes, follows, interactions, searches, and geolocation. This analysis can reveal demographic attributes, personality traits, and even predictions related to consumption needs, political preferences, or sexual identity (Hinds et al., 2020; Wylie, 2019; Rosen, 2013). The company has also explored unconventional methods, including wireless signals and camera lens patterns, to gather more information about users (Thomas, 2018; Golbeck, 2020).

Two company patents illustrate these unconventional methods: one proposes identifying acquaintances by sensing the wireless signals of other devices around you (Thomas, 2018), the other by matching dust patterns and scratches on camera lenses (Golbeck, 2020). For years, Facebook’s facial recognition feature identified people who appeared in users’ photo albums and suggested that users “tag” them all with a click, linking their accounts to the images (Hill & Mac, 2021). However, amid a wave of criticism over the use of facial recognition to monitor citizens in China, the UK, and other countries, the feature was shut down in 2021, and the company announced it would delete the individual facial recognition templates of more than a billion people (Pesenti, 2021).

Using the data it collects, the platform’s algorithm creates a profile for each user, even for people who do not have a Facebook account (Brandom, 2018). Facebook manages this through cookies and IP addresses, building what are known as “shadow profiles.” This data is used to generate recommendations and predict user behavior: Facebook suggests people you might know, invites you to join groups you might be interested in, and displays personalized news, products, or services, even outside Facebook, based on what its algorithm infers your preferences to be.

Facebook’s algorithm is “optimized to engagement” (Evans, 2017), which means that the more you engage with a specific kind of post, the more of its kind you will see. The longer users stay, the more the company earns by serving relevant, user-targeted ads based on their habits, mood, or purchase intentions (Daza & Ilozumba, 2022).

These features can be helpful to users. Using your current location, for instance, the platform can recommend people you actually know when you type a name in the search box, rather than hundreds of namesakes from other places, and it can present relevant ads by analyzing your browsing behavior. Using personal data to customize the experience allows Facebook to offer a better service to its users and clients, thus increasing its profits.

However, serious problems came to light in 2018, when whistleblower Christopher Wylie revealed documents showing how Facebook allowed data to be extracted from about 87 million users without their consent. The firm Cambridge Analytica used this data to send targeted propaganda in an attempt to influence the 2016 US presidential election and the Brexit referendum in the UK (Confessore, 2018; Wylie, 2019). One of the company’s most significant public relations crises ensued, resulting in the departure of several top executives (Rodriguez, 2018) and the #DeleteFacebook movement on Twitter.

Then, a couple of weeks later, Facebook executive Andrew Bosworth’s 2016 “Growth at Any Cost” memo was leaked to the press. In it he wrote, “Someone may die in a terrorist attack coordinated on our tools. And still we connect people. …anything that allows us to connect more people more often is *de facto* good.” The leak laid bare the indifference of at least one of the company’s executives to the welfare of its users.

Facebook’s market value dropped by $119 billion in a single day after the scandal broke, which was, until then, the biggest ever one-day drop in a company’s market value (Neate, 2018). Although the stock price recovered within a year, the scandal had lasting repercussions for the company’s internal organization. In November 2018, Facebook announced the Oversight Board, a body of independent experts to review the company’s decisions on content and freedom of speech. Additionally, the US Federal Trade Commission imposed a $5 billion fine and, in an attempt to reduce CEO Mark Zuckerberg’s control over decisions affecting users’ privacy, demanded an independent Privacy Committee for the company, which was established in 2020.

Furthermore, in September 2021, the Wall Street Journal published “The Facebook Files,” a series based on a cache of internal documents leaked by Frances Haugen, a former Facebook employee. Haugen declared in a television interview, “The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world” (Paul & Milmo, 2021), and accused the company of putting profit before the public good.

These documents exposed, among other issues, that the company’s own research showed Instagram’s algorithm contributing to teenagers’ mental health struggles (Milmo & Paul, 2021). Moreover, the company was fully aware of the harm it was causing (Levy, 2021), yet it privileged its own interests over people’s welfare. Facebook and Instagram used their users’ data to exploit a psychological vulnerability and keep them on the platform as long as possible, despite knowing the damage this caused and having the ability to change it.

This behavior appears to be standard operating procedure for the company, as evidenced by statements made by Sean Parker, Facebook’s founding president, in a 2017 interview:

…it was all about how do we consume as much of your time and conscious attention as possible, and that means that we need to sort of give you a little dopamine hit every once in a while because someone liked or commented on a photo or a post or whatever… because you’re exploiting a vulnerability in human psychology. (Parker, 2017)

Unlike the Cambridge Analytica scandal, the ‘Facebook Files’ showed a problem of negligence whose responsibility rested squarely on the shoulders of the world’s largest social media company.

Soon after, Mark Zuckerberg announced a name change for the company and a $10 billion investment in what it billed as the successor to the mobile internet: the metaverse. The rebranding, however, did not stop the slide in the company’s value. In February 2022, Meta’s stock plummeted, wiping out more than $230 billion in a single day, the largest one-day loss of market value by any company in history (Rinehart, 2022).

The drop was driven by the company’s Q4 2021 earnings report, which showed the impact on revenue of Apple’s iOS 14.5 update, released in April 2021, which requires mobile apps to ask users for permission before collecting tracking data. Data suggest that around 80% of users have opted out (Laziuk, 2021). Meta’s CFO estimated the revenue loss from this Apple-driven change at $10 billion (Rinehart, 2022).

Meta has therefore focused its efforts on strategies to recover its share value, and even to survive. These included shutting down its cryptocurrency money-transfer project and, more recently, massive layoffs: in November 2022, the company fired 11,000 people, or 13% of its staff. A little over a year after the rebranding, in January 2023, the company’s share price was down more than 65%, erasing over half a trillion dollars of market value[1].

The scandals stemming from the media leaks and the multiple fines imposed on Facebook by the United States and the European Union for violating data protection laws have eroded people’s trust in the company. One of the reasons for its difficulties is that the leadership has prioritized maximizing shareholder profits over the well-being of stakeholders, particularly the protection of user privacy.

I conducted a comprehensive analysis of all the fines and settlements that Facebook/Meta has faced for privacy violations from its inception until 2023. My analysis revealed that the company has incurred a total of $8.8 billion in monetary penalties, in addition to non-financial sanctions, for violating privacy protection laws in both the US and the EU.

To provide a more detailed understanding of the scope and impact of these violations, I compiled them into the following table, which summarizes the specific violations, the amount of each penalty, and its proportion to the company’s annual revenue for that year. The table highlights the severity of the financial impact of these privacy violations on the company over the years.

| Year | Product | Issuing authority | Fine (USD) | % Annual Net Income | Issue |
|------|---------|-------------------|-----------:|--------------------:|-------|
| 2011 | Facebook | Data Protection Commissioner for the state of Hamburg | – | 0.0000% | Dark Patterns & Consent |
| 2011 | Facebook | Federal Trade Commission (FTC) | – | 0.0000% | Dark Patterns & Consent |
| 2012 | Facebook | US District Court for the Northern District of California | $9,500,000 | 4.1485% | Dark Patterns & Consent |
| 2017 | Facebook/WhatsApp | European Commission | $122,100,000 | 0.5522% | Acquisitions |
| 2017 | Facebook | Dutch Data Protection Authority | – | 0.0000% | Dark Patterns & Consent |
| 2017 | WhatsApp | Autorità Garante della Concorrenza e del Mercato (AGCM) | $3,240,000 | 0.0203% | Dark Patterns & Consent |
| 2018 | Facebook | Information Commissioner’s Office | $643,000 | 0.0029% | Cambridge Analytica |
| 2018 | Facebook | Belgian Court | – | 0.0000% | Dark Patterns & Consent |
| 2019 | Facebook | Federal Trade Commission (FTC) | $5,000,000,000 | 27.0490% | Cambridge Analytica |
| 2019 | Facebook | Securities and Exchange Commission (SEC) | $100,000,000 | 0.5410% | Cambridge Analytica |
| 2021 | Facebook/Giphy | Competition and Markets Authority | $69,690,000 | 0.1770% | Acquisitions |
| 2021 | Meta | Commission Nationale Informatique & Libertés (CNIL) | $67,950,000 | 0.1726% | Dark Patterns & Consent |
| 2021 | WhatsApp | Ireland’s Data Protection Commission (DPC) | $265,500,000 | 0.6744% | Dark Patterns & Consent |
| 2022 | Meta | Federal Judge in San Francisco | $725,000,000 | 3.1250% | Cambridge Analytica |
| 2022 | Instagram | Ireland’s Data Protection Commission (DPC) | $430,000,000 | 1.8534% | Children’s Rights |
| 2022 | Meta | Ireland’s Data Protection Commission (DPC) | $18,647,300 | 0.0804% | Data Breach |
| 2022 | Meta | Ireland’s Data Protection Commission (DPC) | $275,000,000 | 1.1853% | Data Breach |
| 2023 | Facebook | Ireland’s Data Protection Commission (DPC) | $220,500,000 | – | Dark Patterns & Consent |
| 2023 | Instagram | Ireland’s Data Protection Commission (DPC) | $189,000,000 | – | Dark Patterns & Consent |
| 2023 | WhatsApp | Ireland’s Data Protection Commission (DPC) | $5,940,000 | – | Dark Patterns & Consent |
| 2023 | Meta | Ireland’s Data Protection Commission (DPC) | $1,320,000,000 | 5.6897% | Illegal Data Transfer |

Fines issued to Facebook Inc. & Meta Platforms Inc. regarding privacy

To further explore the nature of the company’s violations, I identified six categories based on the issues that led to the fines. The following table shows these categories and their corresponding accumulated penalties.

| Issue | Fine or settlement (USD) |
|-------|-------------------------:|
| Acquisitions | $191,790,000.00 |
| Cambridge Analytica | $5,825,643,000.00 |
| Children’s Rights | $430,000,000.00 |
| Illegal Data Transfer | $1,320,000,000.00 |
| Dark Patterns & Consent | $761,630,000.00 |
| Data Breach | $293,647,300.00 |
| TOTAL | $8,822,710,300.00 |
Categories of privacy protection violations and penalties imposed on Facebook/Meta
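For transparency, the aggregation behind the category totals can be reproduced with a short script. This is a minimal sketch: the figures are transcribed from the per-fine table above (zero-dollar rulings are omitted, since they do not affect the sums), and the grouping logic stands in for, rather than reproduces, the original analysis.

```python
from collections import defaultdict

# (year, product, fine in USD, issue category), transcribed from the fines table.
fines = [
    (2012, "Facebook", 9_500_000, "Dark Patterns & Consent"),
    (2017, "Facebook/WhatsApp", 122_100_000, "Acquisitions"),
    (2017, "WhatsApp", 3_240_000, "Dark Patterns & Consent"),
    (2018, "Facebook", 643_000, "Cambridge Analytica"),
    (2019, "Facebook", 5_000_000_000, "Cambridge Analytica"),
    (2019, "Facebook", 100_000_000, "Cambridge Analytica"),
    (2021, "Facebook/Giphy", 69_690_000, "Acquisitions"),
    (2021, "Meta", 67_950_000, "Dark Patterns & Consent"),
    (2021, "WhatsApp", 265_500_000, "Dark Patterns & Consent"),
    (2022, "Meta", 725_000_000, "Cambridge Analytica"),
    (2022, "Instagram", 430_000_000, "Children's Rights"),
    (2022, "Meta", 18_647_300, "Data Breach"),
    (2022, "Meta", 275_000_000, "Data Breach"),
    (2023, "Facebook", 220_500_000, "Dark Patterns & Consent"),
    (2023, "Instagram", 189_000_000, "Dark Patterns & Consent"),
    (2023, "WhatsApp", 5_940_000, "Dark Patterns & Consent"),
    (2023, "Meta", 1_320_000_000, "Illegal Data Transfer"),
]

# Sum the penalties within each issue category.
totals = defaultdict(int)
for _year, _product, amount, issue in fines:
    totals[issue] += amount

# Print categories from largest to smallest, then the grand total.
for issue, amount in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{issue}: ${amount:,}")
print(f"TOTAL: ${sum(totals.values()):,}")
```

Running this reproduces the table above, including the $8,822,710,300 grand total.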

Meta has accumulated the highest number of privacy-related fines from both EU and US enforcement agencies. Furthermore, the evidence indicates that Meta has repeatedly committed the same violations without making substantial changes to its practices or policies. As Andrea Jelinek, Chair of the European Data Protection Board (EDPB), recently remarked, “Meta IE’s infringement is very serious since it concerns transfers that are systematic, repetitive and continuous.”

Meta’s privacy practices have harmed not only its users and society but also its shareholders, through lost revenue and market capitalization. Data misuse, manipulation attempts, security breaches, and the promotion of social polarization have cost the company trust and credibility, while the mental health harms associated with its platforms, particularly Instagram, have deepened the damage to users and the broader community. As a result, Meta faces challenges on multiple fronts: its questionable privacy practices have taken a toll on users’ well-being, on its financial performance, and on shareholder value.

In my forthcoming paper, “An Analysis of Facebook’s Privacy Practices Through the Lens of Compliance, Integrity, and Excellence,” I will delve deeper into this analysis. Furthermore, in my forthcoming book chapter “Metamorphosis: Facebook’s Strategic Transformation,” I examine the journey of Facebook Inc. into Meta Platforms Inc. through the lens of strategic management. Using the framework of value creation, managing imitation, and shaping the organizational perimeter proposed by Frédéric Fréry, I explore ethical considerations, stakeholder management, differentiation, competitive advantage, and strategic decision-making in diversification, outsourcing, integration, and positioning.


[1] https://g.co/finance/META:NASDAQ?window=5Y. Retrieved January 6, 2023.

References
  • About Meta Pixel. (2022). Meta Business Help Center. https://www.facebook.com/business/help/742478679120153?id=1205376682832142
  • Brandom, R. (2018, April 11). Shadow profiles are the biggest flaw in Facebook’s privacy defense. The Verge. https://www.theverge.com/2018/4/11/17225482/facebook-shadow-profiles-zuckerberg-congress-data-privacy
  • Confessore, N. (2018, April 4). Cambridge Analytica and Facebook: The Scandal and the Fallout So Far. The New York Times. https://www.nytimes.com/2018/04/04/us/politics/cambridge-analytica-scandal-fallout.html
  • Daza, M. T., & Ilozumba, U. J. (2022). A survey of AI ethics in business literature: Maps and trends between 2000 and 2021. Frontiers in Psychology, 13, 1042661. https://doi.org/10.3389/FPSYG.2022.1042661
  • Evans, J. (2017, June 4). Facebook is broken. TechCrunch. https://techcrunch.com/2017/06/04/when-you-look-into-the-news-feed-the-news-feed-looks-into-you/
  • Facebook’s Terms of Service (Revision: July 26, 2022). (2022, July 26). Facebook. https://www.facebook.com/legal/terms
  • Golbeck, J. (2020, November 13). Finding Facebook Friends Through Lens Scratches. Psychology Today. https://www.psychologytoday.com/us/blog/your-online-secrets/202011/finding-facebook-friends-through-lens-scratches
  • Hill, K., & Mac, R. (2021, November 2). Facebook Plans to Shut Down Its Facial Recognition System. The New York Times. https://www.nytimes.com/2021/11/02/technology/facebook-facial-recognition.html
  • Hinds, J., Williams, E. J., & Joinson, A. N. (2020). “It wouldn’t happen to me”: Privacy concerns and perspectives following the Cambridge Analytica scandal. International Journal of Human-Computer Studies, 143, 102498. https://doi.org/10.1016/J.IJHCS.2020.102498
  • Laziuk, E. (2021, September 6). iOS 14 Opt-in Rate – Weekly Updates Since Launch. Flurry. https://www.flurry.com/blog/ios-14-5-opt-in-rate-idfa-app-tracking-transparency-weekly/
  • Learn how Facebook shows you ads on other apps and websites. (n.d.). Facebook Help Center. Retrieved 8 January 2023, from https://www.facebook.com/help/119468292028768
  • Levy, S. (2021, October 25). Facebook Failed the People Who Tried to Improve It. WIRED. https://www.wired.com/story/facebook-papers-badge-posts-former-employees/
  • Milmo, D., & Paul, K. (2021, September 30). Facebook disputes its own research showing harmful effects of Instagram on teens’ mental health. The Guardian.
  • Neate, R. (2018, July 26). Over $119bn wiped off Facebook’s market cap after growth shock. The Guardian. https://www.theguardian.com/technology/2018/jul/26/facebook-market-cap-falls-109bn-dollars-after-growth-shock
  • Parker, S. (2017, November 12). Facebook Exploits Human Vulnerability. YouTube. https://www.youtube.com/watch?v=R7jar4KgKxs&t=71s
  • Paul, K., & Milmo, D. (2021, October 4). Facebook putting profit before public good, says whistleblower Frances Haugen. The Guardian. https://www.theguardian.com/technology/2021/oct/03/former-facebook-employee-frances-haugen-identifies-herself-as-whistleblower
  • Pesenti, J. (2021, November 2). An Update On Our Use of Face Recognition | Meta. Meta Newsroom. https://about.fb.com/news/2021/11/update-on-use-of-face-recognition/
  • Rinehart, W. (2022, February 24). The Facebook stock drop and the price of privacy. The Center for Growth and Opportunity at Utah State University. https://www.thecgo.org/benchmark/the-facebook-stock-drop-and-the-price-of-privacy/
  • Rosen, R. J. (2013). Armed With Facebook ‘Likes’ Alone, Researchers Can Tell Your Race, Gender, and Sexual Orientation. The Atlantic. https://www.theatlantic.com/technology/archive/2013/03/armed-with-facebook-likes-alone-researchers-can-tell-your-race-gender-and-sexual-orientation/273963/
  • Thomas, E. (2018). A creepy Facebook idea suggests friends by sensing other people’s phones. WIRED UK. https://www.wired.co.uk/article/facebook-phone-tracking-patent
  • Wylie, C. (2019). Mindfuck: Inside Cambridge Analytica’s Plot to Break the World (1st ed.). Random House USA.
