Baking Privacy into ICT4D

Some issues we could talk about/questions to ask:

1. The life cycle of development projects, with regard to when collection, storage, etc. happen. Willow just finished a life cycle as an output of RDF Oakland; we could build on that.

2. What is specific about the development context? Are there particular things in development that make it different, and how do we address those? One way to approach this is to think about which prototypical challenges stand out more in development.

3. Impact assessment: when you are rolling out a development project, are you thinking about what the impact might be, in the sense of unintended consequences? What are the artefacts and legacies you leave behind that could be repurposed?

4. Are there baselines, any universal conditions and attributes, that we can use across every context? Are there red lines: certain kinds of data that we should never collect? Or is it all context-dependent?

5. Harm stories and failures: where did it go spectacularly wrong?

6. Are there particular technologies or programs that carry a higher risk and should be prioritized in funding? How do you make sure that is not just a box to tick, because privacy by design often ends up as a box-ticking exercise.

7. Where would we want funders to be? What do we want funders to do, as a matter of policy? Options: a criteria-based policy (projects need to do x, y, z); carrots (highlighting projects for their inclusion of privacy); separate funds to support injecting privacy capacity, a mechanism that could come in as appropriate for grantees to work with.

Important points to note:

  • Capacity and familiarity of funders: The Engine Room just interviewed funders. Nobody feels these questions are raised enough; broader questions on ethics and privacy are not addressed. Funders do not think of their grants as data projects but as water projects, etc. But everything is a data project!
  • The most progressive donors have the most hands-off approach. This is a tension: you are always being paternalistic in some sense, even if you say "leave it up to the people and let them decide everything."
  • Developing countries often do not have the legislation, but what do we do in the UK to fulfil the Data Protection Act, and could we bring these practices to funders? For example:
  1. The IT system goes through an audit by a security auditor.
  2. Data-processing agreements with suppliers that match the data centres used.


  • When is conditionality most appropriate, and when is a standing offer appropriate? If donors have a standing fund for responsible data support, then there is money for an audit as well.
  • Can we make interventions ("this is a good project, e.g. a safe platform") and take what we have learned?
  • A general audit of the most popular tools' risks and concerns (FrontlineSMS, Magpi, ODK). "Here are some questions you should think about before you go further" needs to be integrated into a platform alongside the functionalities you want.
  • If donors had a list of the 35 most popular tools with their responsible data risks and concerns: a cheat sheet for popular tools.
  • Due diligence on the infrastructure. This creates an incentive, and it doesn't require institutions to change policy, but programme officers (POs) can use it.
  • Open Integrity Index: ilab
  • Rebecca MacKinnon's Ranking Digital Rights: a mini offshoot in the development sector. Tools change so quickly that such a resource needs to be kept up to date. We are interested in granularity. The potential for something like that requires research.
  • A matrix for Oxfam Novib on tools, and MAVC's mapping of who uses which tools most. So within a few months we would have the most relevant tools; it might have greater uptake if it includes something about functionalities as well.
  • A community willing and able to review such a tool exists. We could try to get research funded for the Open Integrity Index to make a draft on the basis of our tools, and could follow up with the group of donors we interviewed.
  • The best option would be an individual consultant for every project to help think it through, but that may not be feasible. A set of questions and ideas would be good, or a training for all grantees/new grantees. There appears to be very little: few people do training on these issues, and it is ad hoc, with no coordination or sharing of methodologies. Only three donors have a module. There is huge room for improvement and coordination. Make the training available. Classic example: The Engine Room's grantor, OSF. We needed help managing our governance policy, and they set up assistance with their audit department. Offer the expertise.

Final thoughts:

Funders themselves don't think about these issues; they don't know what questions to ask. They are required by their policy to send …
At the moment, given the privacy risks that exist, the goal is to make people aware of the misuse of data. Are there ways other than training to do something about it? Is there anything that can be done in the funding cycle to mitigate these threats?
Whatever we offer needs to be genuinely helpful. Still, a baseline that people have to consider is better than nothing, even if they only tick the boxes.
You can't force people to see the relevance of privacy all of a sudden, but you can make them check the boxes. Funders want to be hands-off for reasons of respect and ownership, but sometimes that is a cop-out to avoid spending too much energy on certain projects.
Focus on tools for funders that motivated POs can use.
We should have these conversations with other funders around the table and ask them.

3 key points:

  • Awareness: everything is a data project.
  • Incentives: there are competing and perverse incentives in funder relationships, which turn privacy into a box-checking exercise. The most immediate activity is to provide POs with resources: a cheat sheet with the risks and dangers of certain tools, so they can get into a conversation with their grantees.
  • Sharing of information across organisations, and getting funders together in a safe environment: Funderdome.
