
Notes from a talk at WerkstattB, Berlin

These are loose notes for a talk I gave at a Women’s CryptoDinner at Thoughtworks’ WerkstattB in Berlin on February 4, 2015. I deviated considerably from the notes, but thought I would share them anyway.

I don’t believe that we’re doing enough right now to ensure that everyone out there has the ability to take responsibility for their online security and safety.

I’m guessing most people in this room understand why surveillance is terrifying and encryption is important. But just in case, I’m going to go over this for a few minutes.

Here’s the short version: We’ve reached a point in time where states and corporations are working together to spy on every one of us. What my government is doing, in cooperation with other governments, is mass, dragnet surveillance. Vast amounts of data are being collected on each and every one of us who uses the Internet or a mobile phone. And even the rare few who don’t are vulnerable too, because their contacts use and carry these tools and devices.

Now, I am pessimistic about the state of governance in the world, but nevertheless, I believe that this is an issue that we have to tackle from a number of directions:

- Policy work
- Litigation
- Education
- Technology

It’s that last one that I want to talk about right now. Technology isn’t more important than those other fights. It’s not our sole savior. The amount of funding we have for privacy and security tools is nothing compared to the amount of funding our governments have put aside for spying on us.

But technology has one thing those other fights don’t: it’s the only element of the fight against surveillance that lets each of us act alone. We can’t win the policy fight as individuals. Litigation takes an army. Education is a slow struggle. But by using privacy-enhancing technologies, each one of us can take responsibility for our own safety.

In a talk I gave with Jake Appelbaum last year, we likened this to using a condom for safer sex. The analogy is imperfect, but the idea behind it is the same: Harm reduction. In using condoms, we are minimizing the threats that sexual contact can pose. The same goes for privacy-enhancing technologies: We cannot protect ourselves perfectly, but we can minimize the threat of surveillance.

So let’s talk about harm reduction for a minute. Harm reduction is typically defined as a set of practical strategies and ideas aimed at reducing negative consequences. When we talk about HIV, this means educating the public on the risks of unprotected sex and how to mitigate those risks. But it also means being realistic about human behavior: we know people will continue to engage in sexual activity, so we must meet them where they are and offer them practical solutions to avoid infection.

This is also true for digital security, though it hasn’t always felt that way to me. In 2009, I was attending a training and spacing out a little when a guy walked up behind me, showed me a piece of paper, and asked, “Is this your password?” It was, indeed, and not only that, but it was a rather embarrassing password made from someone’s name (they were in the room) and some numbers. The password was for my Tweetdeck installation, and at the time Tweetdeck wasn’t using SSL, so my password traveled over the network in plaintext, readable by anyone sniffing the local traffic.
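To make the failure concrete, here is a minimal sketch of what a login over plain HTTP looks like as it crosses the wire. The host, path, and credentials are invented for illustration; this is not Tweetdeck’s actual traffic.

```python
# What an unencrypted login request looks like on the wire.
# Everything here (host, path, credentials) is made up for illustration.
body = "username=alice&password=hunter2"
request = (
    "POST /login HTTP/1.1\r\n"
    "Host: api.example.com\r\n"
    "Content-Type: application/x-www-form-urlencoded\r\n"
    f"Content-Length: {len(body)}\r\n"
    "\r\n" + body
)

# Over plain HTTP, these exact bytes are visible to anyone on the same
# network segment (say, shared conference Wi-Fi) with an off-the-shelf
# packet sniffer. With SSL/TLS (HTTPS), the same request is encrypted
# before it ever leaves the machine.
print(request)
```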

That event scared the crap out of me. For me, a naturally competitive person who likes a challenge, that was a good thing: I quickly read up on encryption, attended a training, and learned how to use OTR. But that sort of strategy doesn’t work for everyone, and for some it can be counterproductive: There are many who understand the risks and choose to ignore them (just as some choose not to use condoms or get tested, out of fear).

And so I’ll say it again: We need to meet people where they are. This means, first, not scaring them. It also means helping them to understand the threats they face and respond appropriately.

Sure, I wish that everyone would be concerned enough about the NSA’s surveillance that they would adopt these technologies and fight back. But the truth is, there are a lot of people who just aren’t going to care about that to the same degree. There are people who aren’t concerned about their devices being searched at borders. In this age, that doesn’t mean they have nothing to hide, but it may mean they perceive the NSA to be less of a threat to them than I do.

If that doesn’t sound right, think about a different example: Coca-Cola has successfully protected their recipe for more than a hundred years. That’s a company that, for better or worse, takes information security seriously. Now, I suspect that the recipe is somewhere in a vault on a piece of paper, but disregard that for a moment. The point is: They have a vested interest in caring about security.

Now, there are other companies that care less. Converse is a great example: There are thousands of companies ripping off their brand, and Converse doesn’t give a fuck: They just keep making their iconic sneakers, and people keep buying them.

This is going to be true for individuals as well. And that’s why threat modeling is important in security: Just like a doctor shouldn’t prescribe antibiotics for every person who enters their office with a cough, neither should we prescribe the same solutions to every individual who expresses concern about their safety. Instead, we should guide them through asking themselves some questions, to understand their habits and their risks:

- What do you want to protect?
- Who do you want to protect it from?
- How likely is it that you will need to protect it?
- How bad are the consequences if you fail?
- How much trouble are you willing to go through in order to try to prevent those consequences?

Asking these questions helps us prescribe solutions that actually fit each person.
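To make that concrete, here is a rough sketch of how two people’s answers might be written down side by side. The structure and every value in it are purely illustrative, not a standard format or anyone’s real threat model.

```python
# A purely illustrative way to record someone's answers to the five
# questions above; no standard format is implied.
from dataclasses import dataclass

@dataclass
class ThreatModel:
    assets: str        # What do you want to protect?
    adversaries: str   # Who do you want to protect it from?
    likelihood: str    # How likely is it that you'll need to protect it?
    consequences: str  # How bad are the consequences if you fail?
    effort: str        # How much trouble are you willing to go through?

# Two people, two very different models:
journalist = ThreatModel(
    assets="sources' identities and communications",
    adversaries="state-level surveillance",
    likelihood="high",
    consequences="severe: sources exposed or jailed",
    effort="substantial: willing to learn PGP, OTR, Tor",
)
casual_user = ThreatModel(
    assets="everyday private conversations",
    adversaries="opportunistic snooping on shared networks",
    likelihood="moderate",
    consequences="embarrassment",
    effort="minimal: an easy encrypted messenger like TextSecure",
)
```

The point of writing it out this way is that the two records plainly call for different prescriptions, not the same stack of tools.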

But it’s also important to remember that those solutions aren’t always the best fit for everyone. PGP, for example, isn’t easy, and pretending it is serves no one. It took me two years of regular use to feel confident with it, and I’m still learning new tricks all the time. Instead of starting people with PGP (which is a bit like starting with calculus before algebra), we can start them with simpler tools like TextSecure and encourage them to have their conversations there.
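To give a sense of why I say PGP is hard, here is a rough sketch of the minimum ceremony a new user faces before sending a single encrypted message. It assumes the python-gnupg wrapper, and every path, address, and passphrase is hypothetical; this is an outline of the workflow, not a recommended setup.

```python
# A sketch of the bare-minimum PGP workflow, assuming the python-gnupg
# wrapper. All paths, addresses, and passphrases are hypothetical.
import os
import gnupg

os.makedirs("/tmp/demo-keyring", mode=0o700, exist_ok=True)
gpg = gnupg.GPG(gnupghome="/tmp/demo-keyring")  # throwaway keyring dir

# Step 1: generate a keypair, and pick a passphrase you can never
# forget, because there is no password reset.
key = gpg.gen_key(gpg.gen_key_input(
    name_email="alice@example.com",
    passphrase="a long passphrase chosen carefully",
))

# Step 2: obtain your correspondent's public key out of band
# (keyserver, email attachment, USB stick) and import it.
# gpg.import_keys(bobs_public_key_data)

# Step 3: verify the key's fingerprint over a trusted channel; skip
# this and you may be encrypting to an impostor.

# Step 4: finally, encrypt. Here we encrypt to our own key, since
# Bob's import above is only sketched; get any earlier step wrong
# and this quietly produces nothing usable.
encrypted = gpg.encrypt("meet at noon", "alice@example.com")
print(encrypted.ok)
```

Four steps, each with its own failure modes, before a single message goes out; compare that with installing TextSecure and just texting.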

One final note: Part of the problem is that, for years, the field of digital security has been dominated by technologists. Of course it has…but for many years, I felt that this community was rather exclusive. There used to be a feeling that if you didn’t protect yourself, you deserved to get owned. Just last summer I gave a talk in which I said “PGP is hard,” and was verbally attacked afterward by a young man who thought I was being too discouraging. He argued that PGP isn’t hard. I disagree.

Unlike the choice to protect ourselves, this part of the solution isn’t personal: In order to reach more people, we must, we absolutely must, make these technologies easier to use. There are a lot more folks thinking about this now than there were just a few years ago, which is great, but we have further to go. For those of us who have money to spare, we can put it toward this goal by donating to organizations like Open Whisper Systems. For the designers in the room: you can offer your help to technologists, making their products easier for users to understand. And technologists, you can do your part by remaining open to feedback, conducting rigorous user testing, and collaborating more. This way, we all win.
