Corporate Social Responsibility is a broad and controversial topic concerning how corporations conduct their activities. Its proponents' main argument is that businesses must act in ways that not only benefit their bottom line but also positively affect the community. The specifics vary from case to case, and the precise effects of enacting socially responsible corporate policies are difficult to define. The only certainty is that adopting Corporate Social Responsibility is becoming a valuable advertising point for some companies, and grounds for boycotting others.
Perhaps one of the most significant recent online data privacy scandals is the Cambridge Analytica case. The UK-based company was accused of harvesting the Facebook data of tens of millions of users without their consent and manipulating their Facebook experience to foster specific political behavior. The case has far-reaching implications for the sanctity of people's online privacy, as well as for fundamental freedom of choice in politics, or, indeed, in anything at all. The Facebook user data acquired by Cambridge Analytica was linked to users' likes and dislikes, reactions to specific ideas, and character traits.
The company analyzed these traits and targeted users with specific messages, primarily related to the 2016 presidential election. According to Confessore (2018), the company aimed to push pro-Trump messages to boost his ratings in the US, as well as to support pro-Brexit positions in the UK. Judging by the Cambridge Analytica whistleblower's testimony, that was the company's business plan (Tillett, 2018). Rather than being a nefarious political plot, as one could easily assume, it was a user-manipulation program for hire.
It appears that, by using data collected by Facebook and adjacent analytics companies, anyone could manipulate the digital ads shown to users. The implications of the scandal for the entire digital advertising market are ominous. It is easy to imagine brands weaponizing this wealth of user data to target specific messages to specific audiences in specific media with pinpoint accuracy. Collecting such information as character traits, in-depth demographics, and preferences without the user's consent is unethical on its own. Using it to influence opinions and frame certain products or political candidates in a manipulative way is even more so. However Corporate Social Responsibility is defined, "not spying on people against their will" would plausibly fall within most definitions.
The Cambridge Analytica case features several stakeholders that stood to benefit from the ethics breach. First and foremost, Cambridge Analytica itself, as well as SCL Group, of which Cambridge Analytica was a subsidiary, stood to gain from exploiting the data, as that appeared to be their business model. Secondly, the Republican Party in America, Donald Trump, and pro-Brexit UK politicians were employing the company's services. Thirdly, the millions of Facebook users whose data Cambridge Analytica acquired were a major stakeholder, as their entire personalities were traded as a commercial product and, arguably, as an asset in a so-called culture war. Finally, Facebook itself is a stakeholder in the scandal, as it was Facebook that collected all that data for its own purposes. Cambridge Analytica, while indeed a guilty party, pales in comparison to Facebook, as the social media platform is known for exploiting its enormous user base for dubious research and social experiments.
Defining online data privacy as Corporate Social Responsibility may be problematic to some extent. Most people consider Corporate Social Responsibility to comprise positive practices that are unnecessary to the business model. Examples include charity programs, community development efforts, and increased employee benefits. By that view, ensuring data safety and online privacy while users browse a website falls into the same category as producing phones that do not explode in users' hands, or cough medicine that does not cause cancer. However, Carroll's CSR Pyramid describes several layers of social responsibility, two of which are legal responsibility and ethical responsibility (Dudovskiy, 2012).
Legally, users have the right to freedom of expression and privacy, rights that may be violated by Facebook's non-consensual data collection and by the selective suppression and amplification of political messages. While the legal side of the issue may be uncertain due to its novelty, the practice is ethically unjust and harmful, which places it squarely within the ethical responsibility layer of the CSR Pyramid.
There is a cynical point of view that businesses do not engage in Corporate Social Responsibility out of kindness. Sometimes, what leads them to adopt these practices is a PR crisis and plummeting stock prices ('Corporate social responsibility', 2009). This appears to be the case with Facebook, a company notorious for conducting unethical research on its unsuspecting public. Here, the crisis was not that rogue actors illegally procured the in-depth personal data of tens of millions of living, breathing people; the crisis was that they were caught doing it, and Cambridge Analytica lost its customers to public outrage. Adopting an ethically responsible privacy policy would merely be damage control.
For any digitally present business, adopting a policy against collecting inordinate amounts of data should be an easy decision. It should be, but it is not, because the data-driven digital advertising market is worth hundreds of billions of dollars (Enberg, 2019). Everything from customer demographics to social media engagement metrics is a valuable asset for intelligent business management. According to Orlitzky (2013), there is no definitive answer to whether Corporate Social Responsibility benefits a business, has no effect, or even harms it. Whether to draw a line on collecting specific user data, whether to spend increasingly large sums on cybersecurity, and whether to consider ethical issues at all are problems without clear solutions. The media and even users have demonized Facebook, yet it remains an incredibly successful business and the second-largest player in the digital advertising market (Enberg, 2019). At the same time, a lack of public trust can cripple a smaller company, which is what happened to Cambridge Analytica.
After being accused of an ethics breach, we sat down and had a long discussion of our priorities. We concluded that we are not Facebook: the very real people who work at our company are not comfortable treating the user base like a commodity. We concluded that we are not Cambridge Analytica either, and we will not make unethical and unlawful manipulation our business model.
We are updating our security protocols, and we will shift toward collecting as little data as bare economic necessity allows. Some privacy issues are the result of malicious actors, and some are ingrained in a company by design. We will protect anything that could be misused through the former, and we will eradicate anything that exemplifies the latter. Our customers are real people, and we will do our best to make them feel safe and secure, because we have an ethical responsibility to do so.
References
Confessore, N. (2018). Cambridge Analytica and Facebook: The scandal and the fallout so far. Web.
Corporate social responsibility. (2009). Web.
Dudovskiy, J. (2012). Carroll’s CSR Pyramid and its applications to small and medium sized businesses. Web.
Enberg, J. (2019). Global digital ad spending 2019. Web.
Orlitzky, M. (2013). Corporate social responsibility, noise, and stock market volatility. The Academy of Management Perspectives, 27(3), 238-254.
Tillett, E. (2018). Christopher Wylie: Bannon wanted “weapons” to fight a “culture war” at Cambridge Analytica. Web.