On public engagement privacy, Cambridge Analytica, and user trust
Public engagement privacy
Privacy is fundamental to meaningful civic engagement, because meaningful engagement requires trust. Participants must feel at ease knowing that organizations will respect their privacy and protect their personal information. When that trust deteriorates, participants simply stop providing input.
The data propagation practice that Facebook had in place, uncovered during the Cambridge Analytica case, highlights once more the incompatibility between meaningful online civic engagement and ad-supported platforms. While large commercial platforms have some incentive to protect their users' privacy, that incentive is at odds with business models that depend on leveraging users' personal information.
Deception, by design
It is true that users can prevent their data from being shared by their friends, and that Facebook recently committed to doing more to protect user data. Yet a recent CNET article shows that it takes users more than 24 clicks to change their account's default settings.
Facebook users are right to feel betrayed, even when Facebook hasn't done anything illegal. As a business pursuing economic incentives, and one whose value lies in the personal information of its users, it shouldn't surprise us that leveraging this data was in Facebook's best interest. One could argue that Facebook could have done more to let users know their data was being shared without their consent. One could also argue that Facebook lied by omission. The list of accusations grows with anyone's imagination.
The reality is that businesses are destined to pursue value creation at almost any cost, even when masked behind noble missions such as Bringing the World Closer Together. So, for as long as businesses rely on centralized control, on showing users advertising based on their preferences, or on any other business model that leverages their users' data, the trust with which users reward these businesses will be put to the test over and over.
The dark side of privacy regulation
As Facebook continues its shift from being a platform business to consolidating its dominance in advertising, we will see fewer cases of user data propagation. After all, an advertising aggregator's advantage lies in its ability to offer advertisers powerful targeting capabilities. Therefore, Facebook's interest in keeping this data to itself grows as revenue continues to come from its ad business.
On the surface, this is a good thing: users can again trust that Facebook will not treat their privacy carelessly. On the other hand, the data that was available to harvesters like Cambridge Analytica was the same data that enabled many startups to get off the ground, and some to grow into huge businesses. Another downside is the call for increased regulation. Facebook can afford to jump through the hoops of compliance; startups will have a much harder time bearing that cost.
Achieving public engagement privacy
So, in this world of centralization and increased regulation, online public engagement practitioners must remain alert to the tradeoffs that come with using advertising-supported tools. These tools, and those that tempt developers to build on their platforms in exchange for user information, will always be in conflict with building lasting trust with users. Practitioners must instead use tools designed to protect users' privacy and encourage meaningful public participation for the long term.
(Photo by Kaique Rocha from Pexels)