
Interview: Sarah Jamie Lewis, Executive Director of the Open Privacy Research Society

This article is an interview with Sarah Jamie Lewis, Executive Director of the Open Privacy Research Society, a new privacy advocacy and research non-profit based in Vancouver, Canada.
Its goal is to make it easier for people, especially marginalized groups (including LGBT people), to protect their privacy and anonymity online by helping app and technology firms build privacy-by-default services, using open source software that the organization is spearheading.
We asked Sarah a few questions about the Open Privacy Research Society and the state of privacy in tech in general, and have reprinted her responses in full below.
What was the impetus for this project?
Last year I published a book, Queer Privacy: a collection of essays written by people in queer and trans communities. While all the essays are ostensibly about technology, they cover broad topics like coming out, dating, sex work, intimate partner violence, and even death and media representation.
It was a hard project to work on, but my goal was to finally start documenting how modern technology fails to protect the privacy, or uphold the consent, of marginalized people.
I’m not a fan of simply documenting though, and it’s no coincidence that Open Privacy emerged roughly a year after I finished the first cut of Queer Privacy.
I have had a year to sit and think about the kinds of technology we need to build, as well as the kind of organization we need to ensure that technology exists. And I’ve also had a year to find some amazing people to work with me and help guide that.
Can you give some examples of things that you saw and why they were problematic?
Real-name policies are one example; forced account correlation is another. For instance, it is common for LGBT people in certain parts of the world to have two Facebook accounts: a regular one for family and work, and an “out” one which they use for dating and meeting other LGBT people.
These people have to go to a lot of effort to make sure those accounts stay separate and unlinkable. One of the major risks is their “out” account showing up in the “people you may know” sidebar of a family member, potentially outing them.
We have recently seen issues even in LGBT-focused apps like Grindr, which shared location data insecurely and disclosed users’ HIV status to third-party platforms. This kind of non-consensual data sharing is something that should be impossible.
A key phrase in the Open Privacy Research Society mission statement is “We believe that moral systems enable consent.” Can you elaborate a bit on this?
We believe that it is possible to use technology to protect fundamental human rights. It is not enough just to give people the building blocks, e.g. encryption; we need to build systems that actively help them achieve their goals and protect them from harm.
We define moral systems as those that protect people by default and are built to withstand abuse by those with malicious intent. We believe in systems that distribute power and resist attempts to centralize it.
Our priority right now is making metadata-resistant communication platforms usable. We have good tools for protecting the content of communications, but we believe we can do better.
Technology shouldn’t be able to collect information on who someone is talking to, when they are talking, or where they are talking from. This kind of communications metadata is pervasive, and it enables corporations and governments to build surveillance and censorship systems.
We are working on an open protocol (Cwtch) that makes that kind of metadata collection impossible and allows us and others to build messaging apps, discussion forums, advertising boards or any other imaginable application in a way that is privacy-preserving in the truest sense of the word.
Cwtch is based on Ricochet, which uses Tor onion services to provide peer-to-peer instant messaging without third parties. No one is in a position to take data without consent, because the only data shared is between you and the person you are talking with. Everything is as private as it can be, and metadata is kept as small as technically possible.
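As a minimal sketch of that idea (not actual Cwtch or Ricochet code), the snippet below sends one line of text directly to a peer’s Tor onion service, with no server in between. It assumes a local Tor daemon with its SOCKS5 proxy listening on 127.0.0.1:9050 and the PySocks package installed; the onion address is a hypothetical placeholder, and the port (Ricochet’s conventional 9878) is an assumption here.

    # Sketch of the Ricochet-style idea: each peer runs a Tor onion
    # service and peers connect to each other's .onion address directly,
    # so no third-party server ever handles the conversation.
    # Requires: a local Tor daemon (SOCKS5 on 127.0.0.1:9050) and
    # the PySocks package (pip install PySocks).
    import socks

    PEER_ONION = "abcdefghijklmnop.onion"  # hypothetical peer address
    PEER_PORT = 9878                       # assumed service port

    def send_line(message: str) -> None:
        # Route the TCP connection through Tor's SOCKS5 proxy.
        # rdns=True makes Tor resolve the .onion name itself, so no
        # DNS lookup ever leaks who we are connecting to.
        s = socks.socksocket()
        s.set_proxy(socks.SOCKS5, "127.0.0.1", 9050, rdns=True)
        s.connect((PEER_ONION, PEER_PORT))
        try:
            s.sendall(message.encode("utf-8") + b"\n")
        finally:
            s.close()

    send_line("hello over an onion service")

The point is structural: because both endpoints talk directly over Tor, there is no intermediary in a position to log who spoke to whom, or when.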
We’ve been working on a way to extend this concept to groups (which will be version 1 of Cwtch), and then eventually to higher-level applications. The idea is that your data is controlled by you, and the only time you give it away is either directly to another person you trust, or via privacy-preserving structures that only you and other people you trust have access to.
Why has so much of our tech failed to protect marginalized communities?
Quite frankly it’s because catering to the protection of marginalized communities is not aligned with the incentives of modern surveillance capitalism. There are countless examples of social networks, messaging apps, and other tools placing marginalized people at risk through simple ignorance.
What should organizations do to build better tools and privacy controls for all their users?
I think we have to give people tools and involve them in the research necessary to produce those tools. Too often we have produced technically brilliant tools that are unusable by those with limited time to devote to learning them. We have entire movements centered around training people how to use these tools; that is unsustainable.
“Nothing about us, without us” isn’t just a catchy saying; it is a reminder that when we build technology we must involve as many voices as possible, from as many communities as possible. Only with those voices can we hope to build technology that protects the most vulnerable and the most marginalized.
Your career began in the context of government work and at a large corporation (Amazon), both of which are known for tracking citizens/consumers. What is it that caused you to not just walk away from that kind of work but choose to fight against it?
I believe that when you make a mistake, regardless of your intent, you must work to undo the damage you have caused. I helped build surveillance systems early in my career. It is something I regret doing, and I think the only way to reduce the harm done by those actions is to build new systems and structures that resist censorship and surveillance. Systems and structures that enable consent. That’s what Open Privacy is and will be.


