A while ago I found a very small book tucked in among others about design in the museum shop of an art gallery in Brussels. It was called “Design my Privacy” by Tijmen Schep, and it turned out to be a fascinating read about how to design services and products that strive to make our environment both smart and privacy friendly.
Design has to be safe, privacy aware, ethical and socially responsible, writes Schep. If not, big data can be misused in powerful ways to commit crimes or to discriminate. In the area of crime, the most obvious example is the German population registers, which recorded each person’s religion and were used to identify and deport Jews during World War II. Another example Schep gives is the 2015 hacking of ‘Ashley Madison’, an online dating site for people seeking to cheat on their partners. Data on 30 million members was made public, wrecking innumerable relationships and driving at least two people to suicide.

The most chilling example of discrimination is the Chinese ‘good citizenship social credit score’ that will be in place by 2020. It compiles data on the basis of someone’s purchases, criminal record, financial situation, and conduct and communication on social media. Those who receive a high enough score will find it easier, for example, to obtain a loan or a visa for travel abroad. Those of you who have watched “Black Mirror” on Netflix might make the same association I did to episode 1:3, realising that the dark side of technology the TV series aims to show is no longer fiction. More subtle forms of discrimination arise when the cost of insurance is determined by your data; in the USA, for example, you can get car insurance priced on the monitoring of your driving behaviour.

In countries where citizens trust the government, privacy is not perceived as much of a concern. People tend to think they have nothing to hide from the police or other institutional powers. Yet the example of Germany during World War II illustrates a development no one could foresee, and likewise we cannot tell how data about ourselves today might be used against us tomorrow. This is why Schep outlines eight principles for better privacy design.
While these principles are mainly for designers, they also serve to help us as users become more critical about the products and services we choose.
Privacy has to come first (principle 1). Unfortunately, privacy is often considered a luxury for those who have the money to buy additional services or the knowledge to, for example, install special encryption for their email. People are starting to realise, though, at least in relation to social media, that our information exchange is actually a form of payment. Yet we do not reflect much on privacy when it comes to the devices we purchase for our homes and connect to the Internet (the so-called ‘Internet of Things’). Nor do the designers of such products always consider privacy. Secondly, be the devil’s advocate and think naughty (principle 2) in order to anticipate how your product can be misused; in fact, hire a security expert or an ‘ethical hacker’ to test your product during the design phase. Thirdly, collect as little data as possible (principle 3): don’t store it longer than necessary, don’t use it for purposes other than those determined in advance, don’t transfer, sell or lease it to commercial parties without prior permission, and store it safely. Schep praises the European Union for offering better privacy protection than the USA, although I am convinced that the leading European digital rights NGOs, if asked, would argue that there are still too many loopholes and much to be done to ensure digital privacy rights.
At the end of the day we should not forget that “‘the cloud’ is.. a pretty name for ‘someone else’s computer’ …in a particular country, and subject to that country’s particular laws on privacy and data property”, writes Schep. What he means is that it is important not to be misled by terminology such as the idea of a borderless ‘cloud’, and to think critically in order to protect your data (principle 4).

Users of online services should be able to separate their online identities, confident that their privacy and their data are protected. In our offline lives we share different information with our boss, parents, friends and partner, and we can be sure that sensitive information does not reach the wrong person; we might not want our boss to know what we tell our friends, for example, or our parents to know what we share with our partner. In the same way, we need digital systems that understand these complex multiple identity structures (principle 5). The user also needs to be able to know how a device works; otherwise it becomes difficult to understand what it is capable of and who or what it communicates with. In other words, the ‘Black Box’ should be opened for the user (principle 6). In fact, Schep argues that it is important to make the user a co-designer (principle 7), giving them the agency to learn about and discover their smart environment. Finally, technology is not neutral (principle 8): programming can be compared with legislation; it is subject to human efforts and it includes holes, exceptions and unforeseen side-effects.
In addition to Schep’s eight principles, I would include the importance of involving users and beneficiaries in the initial design of a product or service, especially to ensure an accessible design for all. Inclu-sight, led by CEO Juliette Piazza, is a great example of an initiative that offers a pool of people with disabilities who can test new products and services – a target group that developers often claim they cannot reach.
Would you have added any principles? Let me know!