Engineering Privacy into Software

The movement toward individual privacy protection has been driven by a public awareness that was slow to appreciate the risks of unprotected data. As collective awareness grows, bills proposing the regulation of personal data are multiplying across jurisdictions in both the developed and the developing world. The implication is that businesses will face a greater risk of legal sanction and brand damage if they do not get their data handling in order. What follows is a series of proposals that can be incorporated into software design and development, offering valuable changes that will become necessary sooner rather than later.

“Data is not the new oil – it’s the new plutonium. Amazingly powerful, dangerous when it spreads, difficult to clean up, and with serious consequences when improperly used.”

–Jim Balsillie, Centre for International Governance Innovation

Emergent themes from this international patchwork of privacy legislation are mirrored in the non-binding frameworks that came before it (see Privacy by Design or the OECD Fair Information Practices). Different political systems and cultures share an appreciation for the value of privacy, but there are nuanced differences between them. For example, Europeans regard personal information as the property of the data subject, while lawmakers in the People’s Republic of China have instead drafted bills focused on a right to reputation, restricting data usage accordingly. The resulting patchwork of laws is changing the business of data processing as we know it. North America has been slow to adopt privacy protections, but we are already seeing the effects of foreign laws in our online experiences: popup notices for cookie storage appeared almost overnight when the General Data Protection Regulation (GDPR) became enforceable in May 2018.

Profiting with Privacy by Default

Taking for granted that privacy is valuable to all, let’s get to the opportunity therein – profiting with privacy by default.

1. Demonstrate your commitment

Doing so early and visibly makes it clear that your product – your brand – takes your customers’ privacy, and by extension their safety, seriously. In the cookie-notice example above, most businesses went live with GDPR notices at the latest possible moment, or shortly thereafter. Their actions implied an organizational attitude toward privacy of indifference at best and aversion at worst. This is not a good look, particularly as the concept of privacy continues to overlap with personal safety. Integrating privacy into a brand isn’t a new idea; it is already being leveraged by big consumer technology companies like Apple and BlackBerry.

2. Hand over control

At bottom, information privacy is about control. Giving users the ability to decide how much data they share can embolden them to share more. Providing options for obfuscation, generalization, and minimization of personal data at the point of collection has interesting effects on disclosure behavior. One study of user decision-making observed that users who were unwilling to share their personal data were more willing to share obfuscated data when signing up for a service offered by a fictitious vendor. Furthermore, the evidence suggests that participants offered an obfuscation option shared more unaltered data than those who were not. In short, users who felt they had greater control over a situation were more likely to take risks. Not all of a customer’s personal data is a must-have for service providers, but having that data on hand can be useful. Affording customers such options morphs the consent decision from a boolean choice into a categorical one.
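The categorical consent decision described above can be sketched in code. The disclosure levels and function names below are illustrative assumptions, not a prescribed API: a sign-up form might let the user choose, per field, whether to share a value exactly, share a generalized version, or withhold it entirely.

```python
from enum import Enum
from typing import Optional

# Illustrative disclosure levels; a real product would define its own.
class Disclosure(Enum):
    EXACT = "exact"              # share the value as entered
    GENERALIZED = "generalized"  # share a coarsened version
    WITHHELD = "withheld"        # share nothing at all

def disclose_birth_year(year: int, choice: Disclosure) -> Optional[str]:
    """Apply the user's chosen disclosure level to a birth year."""
    if choice is Disclosure.WITHHELD:
        return None
    if choice is Disclosure.GENERALIZED:
        # Coarsen to the decade, e.g. 1987 -> "1980s"
        return f"{(year // 10) * 10}s"
    return str(year)
```

Offering the middle option turns a yes/no consent prompt into a spectrum; per the study above, that added sense of control can increase the amount of unaltered data users volunteer.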

Developer wisdom can also be leveraged to create a bespoke privacy solution. As with security, it is easier and cheaper to incorporate privacy early, while concepts are still fluid and unburdened by the code and infrastructure that implement them. Getting developers to consider privacy through structured exercises is a great way to start the design conversation. When confronted with blueprints for dystopically invasive technologies, would-be coders seem to instinctively empathize with the user and contextualize the technology within present-day society. The resulting mindset is likely rooted in shared individual values, leading developers to consider privacy-promoting design options over simpler, more invasive choices.

3. Train your developers

The circumstances in which developers make decisions that favor privacy are interesting. One behavioral experiment observed that developer-subjects reliably chose to minimize data collection when asked to code an API module for a mobile app. Highly specific geolocation data was often avoided in favor of less granular data, such as postal code, provided project requirements were still met. Interestingly, the decision to limit data usage was found to be independent of experience level, although less experienced programmers tended to write code closely aligned with the samples provided in the API documentation. The latter observation suggests that making privacy-preserving code samples readily available could accelerate the adoption of such patterns.
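The kind of data minimization observed in that experiment, preferring a coarse location over a precise GPS fix, can be sketched as a small helper. This is a minimal illustration under assumed requirements, not code from the study:

```python
def coarsen_location(lat: float, lon: float, places: int = 2) -> tuple:
    """Round coordinates before storing them.

    Two decimal places of latitude/longitude is roughly 1 km of
    precision: enough for neighborhood-level features, and far less
    invasive than retaining an exact GPS fix.
    """
    return (round(lat, places), round(lon, places))

# A precise fix becomes a neighborhood-sized area.
print(coarsen_location(43.65107, -79.347015))  # (43.65, -79.35)
```

Documentation for a location API could ship a sample like this alongside the raw-coordinate call, nudging less experienced programmers toward the minimizing pattern by default.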

Data Protection Cannot Be The Exclusive Responsibility of the Consumer

Yet there are software applications produced by otherwise well-meaning groups that end up abusing personal information, causing unintended damage and loss of market share. Governments have taken notice and continue to introduce disruptive legislation (see GDPR, CCPA), with more on the horizon. Incorporating privacy into software products can be troublesome, and, in some instances, it might be better to leave the choice to the user.

While common-sense practices are to be encouraged, the adoption of individual-level data security is inconsistent at best. The responsibility for data protection cannot fall solely to the consumer. Moreover, consumers tend to react negatively to brands associated with privacy breaches, perceiving them unfavorably. Since consumers are often willing to share their information in exchange for value, trust in the provider can be expected to mitigate this negative effect.

Becoming a good steward of consumer data doesn’t need to be cost-intensive and can serve to distinguish a product within its market segment. Treating user privacy as a value-added feature – an opportunity rather than a burden – is more productive, innovative, and profitable.