The Cambridge Analytica and Facebook Scandal


The legality and ethics of business are closely related, as most governments try to prohibit practices that could harm the public and individuals. However, legal yet unethical practices appear ever more often in the work of technology companies, since states cannot anticipate every nuance of emerging technologies that may harm citizens while bringing profit to a business. The use of such practices is therefore left to companies' own judgment, which constitutes and shapes their corporate social responsibility. In addition, violating laws and regulations not only leads to fines and litigation but also harms a company's image as socially responsible and destroys brand value. This report considers examples of unethical corporate activities that damaged not only the companies themselves but also had consequences for several states.

One such incident occurred during the United Kingdom's referendum on leaving the European Union and the presidential election in the US. In 2014-2016, Cambridge Analytica, a technology-driven analytics company, used the data of 50 million Facebook users to create their political profiles (Cadwalladr and Graham-Harrison, 2018). This information was then sold and used to influence the results of Brexit and the presidential election in favor of Donald Trump (Cadwalladr and Graham-Harrison, 2018). This report relies on a review of journalistic and scholarly literature and on critical analysis.

The report addresses the ethics of Cambridge Analytica and Facebook and their impact on corporate social responsibility, corporate reputation, and consumer confidence in the brand. In addition, the reactions of the companies' leaders and their marketing communication strategies and messages after the data scandal are studied and analyzed. Thus, the main purpose of the report is to examine the legal and ethical issues related to the activities of Cambridge Analytica and Facebook in order to show the importance of marketing, public relations, and corporate social responsibility for a company's brand value.

Research Method

The primary research method in this report is a literature review. Key figures, events, and facts were obtained from reliable publications such as The Guardian, The New York Times, and The Wall Street Journal. Scientific articles were found through Google Scholar and the ProQuest database. The key search phrases were “corporate social responsibility”, “code of conduct”, “public relations”, “crisis response”, “data breach”, “Facebook”, and “Cambridge Analytica”. Articles more than five years old (with one exception) and sources that study such aspects of social responsibility as environmental protection were excluded. Textbook chapters on corporate social responsibility and marketing communication were also used to find and apply relevant theories to the case. All sources were reviewed and analyzed to select arguments and form the position of this report.

The Ethical Issues of Cambridge Analytica

Cambridge Analytica was an analytical company that worked for Trump's campaign during the 2016 U.S. presidential election. This fact is no secret, as documents and contracts confirm the cooperation. However, the company's methods of work were long unknown and were revealed only after Trump's victory and the United Kingdom's vote for Brexit (Cadwalladr and Graham-Harrison, 2018). Some of the information remains confusing, as testimonies from different parties change and remain unconfirmed even after court decisions. In any case, however, the actions of Cambridge Analytica were illegal and unethical, as they influenced the political will of citizens.

Years of investigations and testimony from Cambridge Analytica employees revealed that the company provided Trump with services that helped him win the 2016 election. Cambridge Analytica used Facebook user data collected through an application called “This Is Your Digital Life” (Cadwalladr and Graham-Harrison, 2018). Users were paid to take psychological tests through the application, logged in with their Facebook accounts, and were told that their answers would be used for academic purposes (Cadwalladr and Graham-Harrison, 2018).

However, the information was used for political profiling to target ads and persuade users to vote for Trump. The application also collected information about users' friends, which allowed the company to gather data on 50 million potential voters (Cadwalladr and Graham-Harrison, 2018). This amount of data helped Trump's consultants manipulate voters' opinions and persuade them to vote for him.

At the same time, various sources and Facebook's own comments offer two versions of how Cambridge Analytica obtained the data. The event is most often called the most significant data breach in history, but since Facebook was not hacked, the leak does not fall under this classification (Wong, 2019, para. 8). In fact, Cambridge Analytica received the data with Facebook's consent but broke the rules by transferring the information to third parties (Cadwalladr and Graham-Harrison, 2018).

However, in both cases, the company's actions were unethical. Moreover, there is information that Cambridge Analytica CEO Alexander Nix approached WikiLeaks founder Julian Assange. WikiLeaks later published Hillary Clinton's private emails, which were used against her, and the organization is considered linked to Russian interference in the US presidential election; thus, Nix's connection with Assange suggests the involvement of Cambridge Analytica in this situation (Ballhaus and Bykowicz, 2017). Therefore, all the available information demonstrates that Cambridge Analytica did not consider corporate social responsibility a primary concern.

Cambridge Analytica's work for Trump's campaign was unethical for two reasons. First, by using the data, the company deliberately neglected users' privacy and demonstrated that data-driven research can be used to cause harm. Wilson (2019), analyzing the Cambridge Analytica case, applies the rules of moral responsibility for computing artifacts developed by the scientific community and shows that the company violated several of them.

The first rule states that people who create, design, and build computing artifacts are morally responsible for them and for the possibilities of their use (Wilson, 2019, p. 593). In other words, Cambridge Analytica and the developers of the application knew that the data would be used in favor of only one person while violating the political sovereignty of others. Nevertheless, they refused moral responsibility by denying their role in the scandal.

The second rule says that each party involved in the development of an artifact bears full responsibility, which does not diminish regardless of the number of parties (Wilson, 2019, p. 593). This rule means that Facebook and Cambridge Analytica cannot deny their role in the scandal by shifting responsibility to other parties. The fifth rule says that people who create, promote, and design an artifact should not deceive users about the goals and results of its use (Wilson, 2019, p. 593). Cambridge Analytica clearly violated this rule, as it lied to users about the purposes of using the collected information and about its volume. Thus, deceiving users to collect information was unethical and violated the rules of socially responsible business conduct.

Moreover, Cambridge Analytica's work for Trump was unethical because it used data to manipulate the population's political views. Users saw information on Facebook designed to persuade them to vote for Trump or Brexit, and it was often distorted or fake (TED, 2019).

Understanding customers' pain points and interests is the basis of any marketing strategy, and it is an especially important aspect of targeted advertising. As a company with experience in this area, Cambridge Analytica understood that this approach would be effective for promoting Trump and nevertheless agreed to cooperate. The most unethical aspect of this situation is that micro-targeting influenced users implicitly, which reduced their capacity for critical analysis. Thus, Cambridge Analytica should not have worked for Trump or any other politician using these methods, as doing so facilitated political manipulation that influenced election results.

Possible Response Strategy of Cambridge Analytica

Corporate social responsibility (CSR) is a vital part of any business operation, as it builds a positive image among customers and increases brand value. As Singh and Verma (2017, p. 2) point out, CSR initiatives benefit companies by increasing customer loyalty and willingness to buy the products of socially responsible brands. For this reason, Cambridge Analytica should have used CSR practices to build a positive image and reduce the negative consequences of the scandal.

CSR initiatives can take many forms, from favorable HR practices to environmental protection. Since Cambridge Analytica was a data mining and analysis company, the main way to develop its brand and image as a socially responsible organization was to be transparent about its sources and methods of information collection. For example, if Cambridge Analytica had published reports showing that it collected data from surveys and open sources and used it for the purposes stated in the data collection agreement, it would have built its brand and gained the trust of both customers and the people providing their data. In addition, social initiatives such as participating in charity, assisting local communities, and conducting business transparently would have consolidated this status.

Another necessary step for Alexander Nix was the creation and observance of a code of conduct that would not allow the company to resort to the methods of collecting and using information that caused the scandal. However, the evidence suggests that working for Trump was a deliberate violation of ethical rules for the sake of profit or the personal political motives of the company's leaders. Thus, the company was likely never going to follow such rules. Even so, pre-crisis CSR initiatives could have mitigated the impact of the breach by giving the company a socially responsible image.

Eisenegger and Schranz (2011) and Kraesgenberg, Beldad, and Hegner (2017) argue that the negative impact of a crisis can be mitigated if the company used CSR practices before the incident, because the public is more willing to believe its claims. Therefore, aside from refusing to use unethical methods, the main step Nix could have taken to reduce the impact of the scandal on the company was the application of CSR initiatives and their media coverage.

Steps aimed at overcoming the crisis after the publication of the accusations also relate to CSR, marketing communication strategy, and public relations. First, Nix had to react immediately and comment on the allegations on social media since, according to the rules of crisis response, the first hour after an incident is the most critical (Coombs, 2020, p. 103). As the next step, Cambridge Analytica had to adopt one of the corporate apologia strategies. In this case, Nix used a strategy of denial (Coombs, 2020, p. 107), stating that the information Cambridge Analytica gathered about users was not received from Facebook (Cadwalladr and Graham-Harrison, 2018, para. 15).

In other words, he insisted on the legality and ethics of the company's methods of collecting and using information. From an ethical point of view, it would have been more appropriate to apply an impression management strategy: offer an apology, accept responsibility, and assist in the disclosure of the political manipulation (Coombs, 2020, p. 108). The company could thus have focused not on its wrongdoing but on its contribution to the investigation.

The final step that could have contributed to preserving and restoring Cambridge Analytica's reputation was an increased emphasis on CSR practices. At the same time, all the changes undertaken had to be broadcast in the media and reach the audience. The public would then perceive Cambridge Analytica's steps as an admission of mistakes and a desire to improve its business practices (Kraesgenberg, Beldad, and Hegner, 2017).

This approach can be effective in restoring a company's reputation, as the examples of Nestle, Shell, and Nike demonstrate. The company could have focused on the transparency of its mechanisms for collecting information and cooperating with clients. Cambridge Analytica could also have contributed to the development of legislation strengthening the confidentiality of information and the rights of Internet users. Admittedly, these measures would not have protected Cambridge Analytica from negative public reactions and reputational damage; however, they would have mitigated the negative impact and helped the company recover.

Strategy and Future Challenges of Facebook

Facebook also occupied a central place in the Cambridge Analytica scandal. Facebook was accused of having a privacy policy that made it easy for Cambridge Analytica to harvest data, although Zuckerberg maintained that the app developers violated the rules (Cadwalladr and Graham-Harrison, 2018). Facebook's more significant problem, however, was that it did not report the information breach back in 2015 and did not ban the app or micro-targeting (Cadwalladr and Graham-Harrison, 2018, para. 27). Facebook nevertheless chose a different crisis response strategy than Cambridge Analytica: Zuckerberg apologized and promised to adjust the company's privacy and social media advertising policies (Winder, 2019). This step was the most crucial for the company's further work and development.

Nevertheless, Facebook made several mistakes in its response to the crisis, which led to negative consequences and a decrease in brand value for users. First, Zuckerberg gave his first comment on the situation only five days after the article and the first accusations were published (Wong, 2019, para. 2). In the era of Internet technologies, a company's response should come within an hour, with additional comments within 24 hours, since this is when the main stream of news coverage is formed (Coombs, 2020, p. 103). Thus, Facebook partially lost control of the situation, leaving room for guesswork, manipulation, and user worries.

As a result, thousands of accounts were deleted, and the company lost profits (Larson and Vieregger, 2019, pp. 102-103). Second, Facebook was inconsistent in its responses, which also led to a loss of user confidence. Initially, Zuckerberg said that he had banned the application Cambridge Analytica used to receive users' information and had required the company to delete the data in 2015, although it later turned out that the letter was sent in the second half of 2016 (Cadwalladr and Graham-Harrison, 2018, para. 27). Such details could have diminished user confidence, and years after the scandal, Facebook still faces legal and ethical challenges.

First, Facebook's investigation after the scandal exposed its weaknesses and forced Zuckerberg to make changes. For example, Facebook suspended tens of thousands of applications, banned some of them, and sued developers who refused to cooperate with the investigation (Rodriguez, 2019). However, information leaks and targeting remain issues that Facebook has not resolved. For instance, in April 2021, data on more than 500 million users became widely available on the Internet, yet this incident had little impact on Facebook's financial performance (Forman, 2021). Nonetheless, such problems diminish user confidence in the social network.

For example, the Cambridge Analytica scandal led to the deletion of thousands of Facebook accounts and a loss of profits in March 2018, at the peak of the scandal (Larson and Vieregger, 2019, pp. 102-103). In addition, such cases indicate the need for stricter regulation and legislation to prevent them.

Second, Facebook's need to collect data and target ads is an ethical issue with no immediate solution. Abandoning large-scale data collection could solve the privacy issue, as leaks more often happen not through hacks but through non-compliance with the rules by companies such as Cambridge Analytica. The privacy problem would likewise be eased if Facebook used an encryption system that protected users' data rather than collecting it. However, Facebook makes most of its profits by selling targeted advertising data and services (Forman, 2021). Thus, despite its desire to increase the confidentiality of users' data, the company cannot abandon its methods without losing profits.

However, the Cambridge Analytica scandal and other data mining issues may soon lead to legislation restricting the activities of companies such as Facebook. For example, the General Data Protection Regulation (GDPR), adopted in 2018, significantly restricts the use and processing of EU residents' data by any company and imposes fines of up to 4% of global annual revenue (“What is GDPR,” n.d.). The rules include limits on the purposes, periods, and ways of storing data that may be disadvantageous to many companies (“What is GDPR,” n.d.). Similar laws may also be enacted in the United States to restrict the collection of data for commercial purposes, which could cost Facebook a significant portion of its profits.
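To illustrate the scale of this fine ceiling: GDPR Article 83(5) caps top-tier administrative fines at EUR 20 million or 4% of total worldwide annual turnover, whichever is higher. A minimal sketch of that calculation, using a purely hypothetical revenue figure, is:

```python
# Illustrative sketch (not legal advice): GDPR Art. 83(5) caps top-tier
# administrative fines at the higher of EUR 20 million or 4% of a
# company's total worldwide annual turnover.
def max_gdpr_fine(global_annual_revenue_eur: float) -> float:
    """Return the statutory ceiling for a top-tier GDPR fine, in euros."""
    return max(20_000_000, 0.04 * global_annual_revenue_eur)

# Hypothetical example: a firm with EUR 70 billion in global revenue
# faces a ceiling of EUR 2.8 billion, not a flat EUR 20 million.
print(max_gdpr_fine(70e9))
```

For any company with global revenue above EUR 500 million, the 4% branch dominates, which is why the regulation is particularly costly for large platforms.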

Yet despite Facebook's privacy problems and lower user trust, the company's profit continues to grow. This situation is likely due to the privacy improvement message the company is broadcasting. For example, according to Zuckerberg's plan, the integration of messaging across Facebook, Instagram, and WhatsApp, all owned by the company, together with more reliable encryption, should give users a secure platform for their messages (Winder, 2019). This approach to marketing communication leads to an increase in the number of users and in profits (Forman, 2021). Consequently, the further development of such technology calls for the adoption of laws capable of constraining Facebook's operations.

However, the company’s current communication strategy of building a socially responsible image allows it to continue collecting data and targeted advertising without significant reputational concerns. Thus, if Facebook takes measures to avoid situations like the Cambridge Analytica scandal, it will be able to continue its work and gradually adapt to the new demands of the market and society.


Conclusion

A review and analysis of the Cambridge Analytica scandal demonstrate that corporate social responsibility and marketing communications are core business practices. Cambridge Analytica's and Facebook's reactions to the scandal were vastly different and had different consequences for the companies. Cambridge Analytica denied any guilt and took no steps to rebuild its reputation after the incidents, cementing its public image as the main offender. Facebook, by contrast, admitted its mistakes and announced its intention to correct the situation with the privacy of users' data.

Facebook's focus on change, the formation of a socially responsible image, and communication with the audience both through social networks and well-known media were vital steps. This approach allowed Facebook to maintain its brand value without abandoning the practices that made the information leakage possible. Facebook also continues to use the targeting mechanisms that Trump's political campaign applied, as they generate most of its revenue. However, the company's public image was largely unaffected thanks to Zuckerberg's response and communication strategy.

Nevertheless, this scandal opened the public's eyes to severe shortcomings in data confidentiality and use and prompted consideration of new laws. Not long after the scandal, the European Union adopted new regulations, and each new incident may lead to the adoption of similar laws in other countries. Such laws will also focus on the transparency of advertising mechanisms and policies, including targeting, to protect consumers. These changes could have a significant impact on Facebook, which profits primarily from data collection and targeted advertising services. For the moment, however, an effective communication strategy allows Facebook to adapt to new conditions gradually and avoid severe adverse changes.

Reference List

Ballhaus, R. and Bykowicz, J. (2017) ‘Data firm’s WikiLeaks outreach came as it joined Trump campaign’, The Wall Street Journal. Web.

Cadwalladr, C. and Graham-Harrison, E. (2018) ‘Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach’, The Guardian. Web.

Coombs, W.T. (2020) ‘Conceptualizing crisis communication’, in Heath, R.L. & O’Hair, H.D. (eds.) Handbook of risk and crisis communication. New York: Routledge, pp. 99-116.

Eisenegger, M. and Schranz, M. (2011) ‘Reputation management and corporate social responsibility’, in Ihlen, O., Bartlett, J.L. and May, S. (eds.) The handbook of corporate social responsibility. Malden, MA: Blackwell Publishing, pp. 128-146.

Forman, L. (2021) ‘Facebook’s data breaches don’t matter, until they do’, The Wall Street Journal. Web.

Kraesgenberg, A.-L., Beldad, A.D. and Hegner, S.M. (2017) ‘Restoring trust and enhancing purchase intention after a crisis through a corporate social responsibility program and a specific response strategy: an abstract’, in Rossi, P. (ed.) Marketing at the confluence between entertainment and analytics. Cham: Springer, p. 587.

Larson, E.C. and Vieregger, C. (2019) ‘Teaching case: strategic actions in a platform context: what should Facebook do next?’, Journal of Information Systems Education, 30(2), pp. 97-105.

Rodriguez, S. (2019) ‘Facebook has suspended tens of thousands of apps after Cambridge Analytica investigation’, CNBC. Web.

Singh, A. and Verma, P. (2017) ‘Driving brand value through CSR initiatives: an empirical study in Indian perspective’, Global Business Review, 19(1), pp. 85–98. Web.

TED (2019) Facebook’s role in Brexit — and the threat to democracy | Carole Cadwalladr. Web.

Wilson, R. (2019) ‘Cambridge Analytica, Facebook, and influence operations: a case study and anticipatory ethical analysis’, in Cruz, T. and Simoes, P. (eds.) ECCWS 2019 18th European conference on cyber warfare and security. London: Academic Conferences and Publishing International Limited, pp. 587-595.

Winder, D. (2019) ‘Facebook privacy update: Mark Zuckerberg’s response to Cambridge Analytica scandal one year on’, Forbes. Web.

Wong, J.C. (2019) ‘The Cambridge Analytica scandal changed the world – but it didn’t change Facebook’, The Guardian. Web.

BusinessEssay. (2023, January 15). The Cambridge Analytica and Facebook Scandal.