Issues identified as pertinent to the discussion around privacy concerns in project design
- issues around personal data – looking to recruit students to engage with things
- open data evangelists vs privacy fascists – middle ground is hard to design around as the two communities don’t talk to each other
- how do we ‘bake in’ privacy and security into how projects are funded and designed – and give those in the field the capacity to implement
- let’s avoid “privacy by design” as terminology – it can be implemented as a checkbox exercise, without enough thought about how it applies to the work
- open data and development – how to build in data ethics?
System design
- can have naive system designs (e.g. a broker for health insurance where any company can request information – you can do that differently) – there is not enough awareness of potential solutions
- can work with large commercial partners to probe the issues around personal data and privacy, employing multidisciplinary approaches
- internet of things – interested in producing prototypes, fielding them in reasonable numbers, then doing ethnographies and then policy work; finding out how a smart object is used in the home
- could this just be commercial validation? Willingness to revisit fundamental design aspects will be pretty low – you’re not going to fundamentally challenge whether you need something
- e.g. smart meters – it turned out the sampling interval was less than one minute; by correlating the data with TV programming, an observer could infer with a reasonable degree of confidence which TV programme a household was watching
- power differential – this was surveillance for the benefit of the power company, not transparency from the power company (e.g. telling you when you could turn off an appliance to save extra money)
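The smart-meter point above can be made concrete. A hypothetical sketch follows: all data, programme names, and power-signature values are invented for illustration, but the mechanism (matching a sub-minute power trace against known programme signatures by correlation) is the kind of inference being described.

```python
# Hypothetical sketch: sub-minute smart-meter readings matched against
# known TV-programme power signatures. All values below are invented.
from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def likely_programme(meter_trace, signatures):
    """Return the programme whose power signature best matches the trace."""
    return max(signatures, key=lambda name: pearson(meter_trace, signatures[name]))

# Invented example: TV power draw (watts) sampled every 30 seconds.
signatures = {
    "news": [80, 82, 81, 80, 83, 82, 81, 80],    # roughly flat
    "film": [60, 95, 55, 100, 50, 98, 58, 96],   # dark/bright scene swings
}
household_trace = [62, 93, 57, 99, 52, 97, 60, 94]
print(likely_programme(household_trace, signatures))  # → film
```

The point is not the specific attack but that a seemingly innocuous design parameter (sampling rate) determines whether this inference is possible at all.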
Life cycles – frame for working through the issues
- how to map the life-cycle of a project to know at what points you can intervene to change things
- issues for the development sector that are different from the private/government sector – are there things others do well, or that the development sector does well?
- analytics around the impacts of a programme – how to assess impact on privacy and security
- baselines and independent criteria, e.g. certain kinds of data you don’t collect without consent – are there universalities?
- looking at harm stories and failures – specific technologies that are higher risk
- do we want a vertical approach (where each sector has its own rules) or are there general rules, e.g. baselines for when you work with mobile data or geolocation data?
Types of technologies
- most ebook readers have in their terms and conditions that they can collect your reading data for the purpose of improving the service – how quickly you read, which pages you read; the data is held by a cloud provider, so your favourite passages of e.g. the Quran are held remotely
- the notion of big data works backwards from the original statistical method – we’re collecting too much data
First issue – collect only what you need
- this is a general principle of data protection
- but there is blind faith, among people collecting the data, in anonymisation/scrubbing of databases – “just collect it all”
- the idea is that anonymisation can fix everything, so you can worry about it later – they feel there are technical tools to fix everything post facto, and they now have the capacity to collect everything
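One way to operationalise “collect only what you need” is an explicit allow-list of fields per stated purpose, applied before anything is stored. A minimal sketch, with purpose names and fields invented for illustration:

```python
# Sketch of data minimisation: an explicit allow-list per purpose,
# applied before storage. Purpose and field names are invented.
ALLOWED_FIELDS = {
    "vaccination_followup": {"participant_id", "vaccine_date", "district"},
    "sms_reminder":         {"participant_id", "phone_number"},
}

def minimise(record, purpose):
    """Keep only the fields the stated purpose requires; drop the rest."""
    allowed = ALLOWED_FIELDS[purpose]
    return {k: v for k, v in record.items() if k in allowed}

raw = {
    "participant_id": "p-103",
    "phone_number": "+254700000000",
    "vaccine_date": "2014-05-02",
    "district": "Kisumu",
    "religion": "unknown",   # needed by no purpose: never stored at all
}
print(minimise(raw, "vaccination_followup"))
# → {'participant_id': 'p-103', 'vaccine_date': '2014-05-02', 'district': 'Kisumu'}
```

The design choice is that the allow-list is the default and anything unlisted is dropped, reversing the usual “collect it all, scrub later” posture.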
Perhaps we should think – worst case scenario
- for example, what would a genocidal dictator do?
- in the design community this is gathering pace – turning to dystopian design fiction around products
- but need to translate it for an ordinary audience, make it user friendly
- often get polluted data because the data is not collected in the right way
There is a question of who makes these decisions when you design a project
- the user is often left out of the story – funders, development agency
- if you ask a user they would often say “take it all”, but are there actually things we wouldn’t collect, as responsible actors
- successful design is paternalistic
- but it’s not an either/or – methods sit together
At the design fiction end, there is a role for digital artists in provoking debates – e.g. Blast Theory – illustrating the dystopia. These methods line up at different parts of the lifecycle.
At the level of the individual system, we need to help people understand that they can achieve things with less data. How do we build incentives to say the less you collect, the better? As data collection gets cheaper, what are the other benefits we can promote? Can we put a price tag on data collection as a minimum? Shifting incentives is important.
Summary of the discussion:
- Data minimisation – very important principle – why is it so threatened?
- Collecting data is cheap – the harm is externalised.
- Practical proposals to bring that principle back.
- Using analogies that people understand.
- Using technologies e.g. to bring class actions, or to bombard agencies with freedom of information requests
- Right to be forgotten – debate between freedom of expression vs data protection people in civil society – divisive debate
Takeaways
- alternative project design strategies – dystopian design fiction, artists
- collecting best practices, documenting harm – a necessary step
- problem of the data minimisation principle not being respected – what incentives are needed (e.g. agreement with funders to have certain baselines)