You are browsing the archive for 2014 November.

Challenging the ‘Open by Default’ Principle at Open Up? 2014

- November 20, 2014 in Uncategorized

The Personal Data & Privacy Working Group was invited (along with 17 others) to present our work in a demonstration booth at the well-attended Open Up? 2014 event at the Dutch Centre in London last week (November 12). It was a great opportunity for my colleagues and me to interact with folks, introduce the WG to new contacts, highlight some of the activities we have undertaken over the past few months, and give them some takeaways in the form of the Open Data and Privacy primer, etc.

However, it was also obvious that for most of the 170+ attendees, the main event, with its impressive and diverse line-up of speakers, was a huge attraction. Open Up? 2014, the second in a series organised by Omidyar Network (since it partnered with DFID to host it two years ago), saw a well-thought-out agenda where all the relevant issues were discussed: from the supposed tensions between openness and privacy, to the very important and interrelated issues of transparency and trust in how both government and the private sector collect and handle data. In the build-up to the big event, there were a couple of interesting and thought-provoking blog posts written by some of these notable speakers. In particular, I highlight here the article (titled Opening Policy, Protecting Privacy) written by Tim Hughes, where he raises some vital questions about how governments use and should use the personal data at their disposal, while Sunil Abraham attempts to resolve the dichotomy between privacy and openness in this post.

Demonstration booths at Open Up? 2014. Photo credit: Omidyar Network

The line-up did live up to its billing and delivered in terms of the diversity of opinions and country-specific experiences that were shared. While much of the discussion at Open Up? 2014 centred around this particular community of interest groups and persons, I thought it was interesting how playwright James Graham's opening session highlighted how ordinary folks (the UK cinema audience, to be exact) are taking an interest in privacy and surveillance and what these mean to them in the age of ubiquitous technology and access to information. There is a valid concern here that we often do not know what we are in fact 'sharing' via social media, services, apps, etc., and it's good to see folks outside this (quite small) community engaging with this issue.


Back to basics on what should be open

So what is the current stance on open (data) by default? It is interesting to see that we are back to basics about what data(sets) should be open and what should not. It is fair to say no consensus was reached on this. However, we did make great inroads in questioning the implications of 'open by default' for respect for human rights and privacy, especially in non-democratic contexts where loss of privacy has particularly grave consequences.

While most people are still comfortable with the idea of 'open', what they are also asking for is a level of mandatory transparency from governments and corporate bodies about what is being collected and why. In a poll conducted during the event, 86% of the audience were of the view that their government is not making all data collection activities known. One speaker also called for an inventory of the scale of surveillance mechanisms in particular. It seems, therefore, that governments and corporate bodies making more information available to individuals is vitally important, but it is only a necessary first step. It also cannot be overemphasized that the open data community needs to be more specific and emphatic about what is and is not open data, especially since other government data-sharing activities can sometimes be misconstrued as such.

There are other relevant issues that are not so easily resolved either. There is the concern that big data is increasingly under the control of only the state or a powerful few. The future implications of this are quite scary, and one of the speakers raised the point that perceived low digital literacy among an important stakeholder group (i.e. policymakers) makes it difficult for them to grasp the enormity of what this entails for any functioning democracy.

Playwright James Graham making the first presentation for the day. Photo credit: Omidyar Network

Sunil Abraham, Timothy Garton Ash, Richard Allan and Pria Chetty discussing data collection and the private sector. Photo credit: Omidyar Network

There were a couple of surprisingly frank opinions on the floor. One speaker noted that some amount of surveillance is necessary, though in 'tiny bits'; it is unclear what qualifies as tiny bits. Some others were of the view that there should be limits to the amount of (government) information that is openly available, especially in light of valid concerns about national security. Once again, where exactly we should draw the line between mandatory disclosure/transparency and security/privacy is not always apparent.

As Stephen King (a partner at Omidyar Network) said in the closing remarks, the event intended to bring stakeholders from the different sectors together, and I think that is an extremely vital thing to do as it enables us to look at these thorny issues from different perspectives.


The photos and videos from Open Up? 2014 have been made available, and more information can also be found via the event's Twitter feed (#OpenUp14).

Responsible data in development

- November 17, 2014 in Uncategorized


In mid-October, on behalf of the Personal Data & Privacy WG, I joined a group of development data experts from around the world to co-author a book on the topic of responsible data in international development. We launched the first version of the book just four days after we met, and it is now available for download here.

We used the 'booksprint' method to produce the book, which involved bringing together people from a variety of sectors to write the book from start to finish in three days. We had no preparatory work laid out beforehand, and we were lucky to have the process facilitated by Barbara Rühling, who has done this many times before, on a range of topics.

Working with people from different backgrounds brought a great richness to the text itself; too often, the conversation happening within the digital security crowd is a world away from the discussions within the international development movement. But this time, we had digital security experts collaborating intensely with development practitioners, data nerds, and privacy advocates to produce the book.

We produced the book as a first attempt to understand what 'responsible data' might mean within the context of international development programming. We decided to take a broad view of 'development', and, hopefully, some of the information within the book will also be useful for those working in related fields, such as human rights defenders or activists.

We decided to focus on 'development', however, due to the growing hype around the 'data revolution', with the UN Secretary-General's Data Revolution Group releasing its report, A World that Counts, just last week. Data and technology within the development context carry a potential for harm that is too often ignored, and we wanted to focus on this.

The authors of this book believe that responsibility and ethics are integral to the handling of development data, and that as we continue to use data in new, powerful and innovative ways, we have a moral obligation to do so responsibly and without causing or facilitating harm. At the same time, we are keenly aware that actually implementing responsible data practices involves navigating a very complex and fast-evolving minefield – one that most practitioners, fieldworkers, project designers and technologists have little expertise in. Yet.

The team behind the book was:
Kristin Antin (the engine room), Rory Byrne (Security First), Tin Geber (the engine room), Sacha van Geffen (Greenhost), Julia Hoffmann (Hivos), Malavika Jayaram (Berkman Center for Internet & Society, Harvard), Maliha Khan (Oxfam US), Tania Lee (International Rescue Committee), Zara Rahman (Open Knowledge), Crystal Simeoni (Hivos), Friedhelm Weinberg (Huridocs), Christopher Wilson (the engine room), facilitated by Barbara Rühling of Book Sprints.

The book is available for download now, under a CC-BY-SA license; please feel free to remix and reuse it.

It is also catalogued as a relevant resource on the Personal Data & Privacy WG wiki.

Reflections on #Mozfest 2014

- November 3, 2014 in Uncategorized

From 24th to 26th October, I participated in my first ever Mozilla Festival. With 11 tracks, ranging from Build and Teach the Web to Community Building, and numerous parallel sessions and activities, it was quite difficult deciding where to be at any one time. Like most of the other tech events I have attended this year, the message that the web offers tremendous potential resonated deeply in all the presentations and keynotes.

However, I was also very much interested in other discussions that interrogate the 'other sides' of this hype. Hence, I decided to participate in the sessions on how Mozilla is championing the privacy policy and advocacy movement, as well as those run by Web We Want and the Electronic Frontier Foundation (EFF). As I expected, the Source Code for Journalism sessions were also quite enlightening, and I'm glad I stopped by those as well. Some of the topics that stood out for me are:


The Web We Want wall at the event. Photo credit: Mozilla in Europe

Digital rights

The work being done in advancing this area is outstanding. Mention can be made of the Web We Want and EFF's latest campaigns, as well as the iRights platform championed by Baroness Beeban Kidron, which advocates for young people to have access to security and privacy on the web.

Though I have participated in a number of discussions where the issue of surveillance and privacy on the web was on the agenda, this was the first time I had the opportunity to listen to first-hand accounts from people with personal experience of one form or another, either by being affected themselves or by working with or supporting those who have been. Though it was obvious that quite a few of us were out of our depth with regard to the practicalities of how to support 'victims' of internet surveillance, it is good to consider that there is a whole spectrum of needs and corresponding support to give (for example financial, technical and moral support), so it was concluded that each of us can and should play a part.



Baroness Kidron delivering her keynote. Photo credit: Mozilla in Europe

Also, my sense is that quite a number of us agree with Baroness Kidron that the web has not delivered on its promise of fairness and equality for all. And while we recognise that the (promising) future of the web which advances these ideals needs to be actively campaigned for, we might not identify strictly as activists, and this can affect the stake we perceive ourselves as having in this space.

Closely related to this is the idea that promoting web literacy for all is crucial to securing privacy and security on the web. This requires a more-than-basic understanding of how surveillance works, the philosophy behind it, and the tools (mostly technological but also very much policy-related) that are available to help circumvent it. The session jointly organised by Privacy International (PI), EFF and Access, as well as the one on Do Not Track by Justin Brookman, were therefore particularly useful, in my opinion, in expanding our knowledge of what surveillance is.

Data: open and inclusive

The idea that data is not an end in itself but a tool that must be harnessed towards effective change was yet again advanced at Mozfest 2014. In particular, in one of the sessions in the Source Code for Journalism track, led by Laurenellen McCann, we explored first and foremost what open data means and looks like in different contexts. But more importantly, we discussed what it means to have community involvement in open data, and whether it is at all possible or practical to have community-driven strategies. And finally, we discussed ideas around digital inclusion and what it would take to make open data more inclusive than it currently is.

Other discussions in the space identified that, sometimes and in certain contexts, open data is just a box to be ticked. However, open data systems must be truly open (according to the Open Definition) and must also be consistently updated with the necessary datasets. That said, consideration must be given to users who come from different contexts with different needs and capacities, so inclusiveness must be balanced with openness.


Both opening and closing fairs featured many innovative ideas and products. Photo credit: Mozilla in Europe

Things to watch out for

We all have our individual takeaways from Mozfest 2014, but for me some of the things that sound exciting, and which I will be looking out for, include the Privacy 101 platform, which is to be launched by PI by the close of the year. Privacy 101 will make available all the wealth of information and resources generated so far by PI (and its partners) from various activities across the world.

I will also be watching out for Mozilla's work exploring specific potential tech solutions that promote privacy protection: solutions built into the browser which, more importantly, do not prioritize usability over privacy or vice versa. I will also be looking to support the Mozilla Advocacy platform to engineer potential community-driven policy solutions to internet tracking.

And finally, one of the items we worked on was a transparency code for data journalism. The intended output is a manifesto (of sorts) with a set of principles to guide news organisations on how to be transparent about their data projects. I made the case that at the minimum, the principles should have considerations for data privacy and inclusiveness, and I look forward to seeing how this is done.