Jillian C. York

Jillian C. York is a writer and activist.


On encryption, memory, and forgetting


This is what an encrypted chat looks like after the fact.

“[W]e must never allow the future to collapse under the burden of memory.” – Milan Kundera, The Book of Laughter and Forgetting

Rare is the interview that brings tears to my eyes. This is not what I was expecting when I sat down for an interview with my friend, the journalist Lina Attalah, in Cebu City, Philippines, a couple of weeks ago. She had told me about the interview the evening before over snacks and drinks at our hotel’s rooftop bar and I, always a fan, had eagerly agreed. We were to talk about surveillance, the Internet, infrastructure.

Memory is a tricky thing. Over the past few years, I’ve been asked for “my story” many times. How did I come to what I do? Why the Middle East? And every time, I tell it differently. The anecdote about the Moroccan professor, a harsh and hated teacher whose eyes lit up when he talked about his homeland. How I skipped my own graduation to study Arabic the summer after, because I couldn’t be arsed to get up at 8am during the school week. The first time Global Voices linked to my blog. The bloggers I met, knew, remember.

Memory is a tricky thing. I can tell that story plainly, or I can tell it through revelations I had later on, after I’d made the choice, after I sat on my kitchen counter that first day, peeling potatoes and wondering what had led me there. I can also tell the story of how I came to politics, to political thinking, but sadly, there are pieces missing, pieces of my own memory, pieces I can never retrieve.

I don’t remember who taught me how to use OTR, but I remember one of its biggest advocates. Alaa, a friend I met first in 2008 and who is now imprisoned in Cairo, was among a few friends who, early on, pushed me to use encryption. When I finally adopted it, our conversations opened up – not just with Alaa but with a number of people whose use of OTR—a way of holding ephemeral, encrypted chats that disappear when the window closes—made them feel safe. And it made me feel safe too: Not just from the watchful eyes of governments, but from those who might log, share, later embarrass me with my own naiveté.

Last April, in some other city, I sat with Lina on a bus, telling her about my sadness in conversations lost to OTR. During our interview, she brought that line of thinking back, asking me “Is the proliferation of a consciousness of privacy canceling out important narrative, important information?” She continued:

You mentioned to me once having had long encrypted chats with an activist over the years and now you have no access to these. That’s a record dropped. But there is an additional layer, which is that not only people use encryption but they actually end up not saying stuff altogether. The unsaid can be the product of privacy…

A record dropped. The unsaid as a product of privacy. Confronted with this idea, and the memories of conversations I will never recall, I found my eyes suddenly welling up with tears. Tears for the things I’ve learned, and all the things I’ve forgotten.

Ephemerality is a funny thing. Often, it frees us from the burden of memory; memory that can come back to haunt us. Unlike PGP, wherein encrypted emails can still be stored and unlocked later with a private key, OTR erases our memories, allowing our gabbing to be ephemera. This serves a purpose, but its unintended consequences, for me at least, often feel severe. I think back to when I first adopted the tool and wonder: Did I realize?
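(For the technically inclined, here is a minimal sketch of that asymmetry, using the third-party python-gnupg wrapper. It assumes you already have a GnuPG keypair in your local keyring; the recipient address and passphrase below are placeholders, not anything from a real setup. The point is simply that a PGP ciphertext is an ordinary file you can keep and decrypt years later, while an ended OTR session leaves nothing comparable behind.)

```python
# Minimal sketch of PGP's persistence, using the third-party python-gnupg
# wrapper (pip install python-gnupg). Assumes a keypair already exists in
# the default ~/.gnupg keyring; recipient and passphrase are placeholders.
import gnupg

gpg = gnupg.GPG()  # uses the default local keyring

# Encrypt a message to yourself; the result is ASCII-armored text...
encrypted = gpg.encrypt("an old conversation", recipients=["me@example.com"])
ciphertext = str(encrypted)

# ...which can be written to disk and kept indefinitely.
with open("archived_message.asc", "w") as f:
    f.write(ciphertext)

# Years later, whoever still holds the private key can read it again.
decrypted = gpg.decrypt(ciphertext, passphrase="placeholder-passphrase")
print(decrypted.data.decode())

# An OTR chat has no equivalent artifact: once the session ends and its
# ephemeral keys are discarded, there is nothing left to decrypt.
```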

Becoming conscious of our privacy is a good thing. Consciousness is always a good thing. But as we become more conscious of the weight of our secrets, our hidden lives, are we forgetting the value of the archive?

There are funding options other than the USG

Four years ago, Sami Ben Gharbia wrote a piece that I had the privilege of editing, entitled “The Internet Freedom Fallacy and the Arab Digital Activism.” I was still relatively new to digital rights activism, and although my politics told me that taking money from the US government for what was pretty clearly democracy promotion was wrong, I was faced with the conflicting opinions of activist friends elsewhere in the world who agreed, but saw no other options.

That is, somewhat unfortunately, where I still stand. I do fundamentally believe that the State Department’s “Internet freedom agenda” is at heart an agenda of regime change, and have made no secret of that opinion. And yet I also sit on the advisory committee of the Open Technology Fund because I believe that, if this money exists, then we have the obligation to guide it in the right direction, rather than allowing it to be funneled to snake oil projects, groups that don’t accept criticism of their potentially risky tools, and other bad actors (you know who you are). But back to that in a moment.

Now, my intended audience for this post is probably folks who work in this scene and are also conflicted about the funding, but for the rest of you, here’s the crux of the issue: If you are opposed to surveillance, then you understand that we’re up against a multi-billion dollar industry that’s colluding with governments who want the utmost control over their populations. The amount of money spent on surveillance in the United States alone is mind-boggling. And we’re fighting with peanuts.

I believe that fighting surveillance requires a number of different approaches, and that one of those is the use of privacy-enhancing technologies. And it’s no secret that many of the privacy-enhancing technologies on which we currently rely, such as Tor and TextSecure, are funded by the US government.

Some folks have taken issue with this, going so far as to call Tor employees “government contractors.” On the one hand, this is pretty sensational talk: In much of Europe, for example, public funding of advocacy isn’t uncommon. On the other hand, there are real issues with implicitly supporting what is ultimately an imperialist agenda by taking US government funds.

Until recently, however, the alternatives to government funding have been minimal. Private foundations provide some funding, but from my vantage point, it seems they’ve mainly been supporting programming, perhaps out of the idea that tools are already well-funded. The solicitation of donations from projects’ websites hasn’t brought in much.

Because of this simple fact, I’ve reconciled my views on the matter to conclude that the funding of free and open source technology by governments is relatively benign, in that these technologies are inherently neutral and used by individuals with a range of political views, despite the goals of funders. In other words, the State Department might fund Tor because it helps Iranians strike against their government, but there’s nothing they can do to stop it from being used by anarchists, American dissidents, etc. I give a lot of credit to friends like Amr Gharbeia for informing my views on this subject.

Still, though, the question nags at me: Isn’t there a better way? In the fall of 2012, I sat at a long table in a San Francisco restaurant as the Freedom of the Press Foundation was conceived (I contributed nothing to the idea, I just happened to be there). While the original idea was to create a method for crowdfunding WikiLeaks that was less likely to be meddled with by governments, the project has expanded to become a rather large funder (through crowdfunding) of privacy-enhancing technologies. This strikes me as the first step in an increasingly promising direction.

 


 

Yesterday, something pretty amazing happened. Journalist Julia Angwin wrote a piece about Werner Koch, the main developer behind GnuPG, the free and open source version of PGP, a program that enables email and file encryption. Koch, wrote Angwin, is “running out of money and patience with being underfunded,” despite considerable adoption of the tool after the Snowden revelations in 2013. I spotted the article in a tweet from my friend Trevor Timm, who posted it and urged his followers to donate to Koch’s ongoing fundraising campaign.

This morning, I awoke to find that Koch’s campaign had exceeded its goal of 120,000 €. Each time I refresh the page, the number continues to rise – from 166,000 € when I started writing this piece to 168,000 € now. Facebook and payment service Stripe have each pledged 50,000 € per year. The Core Infrastructure Initiative granted Koch $60,000. On the donations page, Koch wrote today:

As the main author of GnuPG, I like to thank everyone for supporting the project, be it small or large individual donations, helping users, providing corporate sponsorship, working on the software, and for all the encouraging words.

GnuPG does not stand alone: there are many other projects, often unknown to most people, which are essential to keep the free Internet running. Many of them are run by volunteers who spend a lot of unpaid time on them. They need our support as well.

This story is heartwarming, and I hope to see more like it in the near future. But it also brings to light one of the main problems with US government funding: It favors new and untested tools over those that have been around for some time, are used by experts in the field, have been audited, and are proven to work. For every Tor or TextSecure (both of which are audited and work demonstrably well), there are several other tools or projects that receive funding and either fail, or fail to keep people safe.

A friend who asked to remain anonymous has raised this issue with program officers at the State Department, and has been told that such projects should simply “submit a proposal.” The problem with that, of course, is that many of these developers are underfunded and/or unequipped to deal with the bureaucracy of the proposal system, let alone the budgeting required to apply for such a large sum (if I recall correctly, the Bureau of Democracy, Human Rights, and Labor only gives grants of $500,000 or more).

Although some projects, such as Radio Free Asia’s Open Technology Fund (full disclosure: I am an advisor), have sought to rectify part of this problem by providing smaller grants to smaller projects, no funder has entirely succeeded in bringing in older projects (like GnuPG, KeePassX, Pidgin, Adium, Enigmail) which have been in need of ongoing support.

I talked to my friend Samir Nassar, a security trainer, who told me that the “lack of funding for projects like GnuPG enforces a conservatism with a developer’s time. When we approach projects to point out usability problems that don’t easily fit into the traditional bug-fixing methods, we are asking more of the developers than they have time to give. It takes time to show them what the issue is, why it is an issue, and how to fix it—time that developers rarely have because they are unpaid.”

Samir’s comments demonstrate that there are not only political considerations regarding US government funding, but practical ones as well.

It pains me to say this, but this is not an ideal world that we live in, and therefore I cannot stand as strongly against the US Internet freedom agenda as I would like, lest it result in the defunding of all of these important projects. I do, however, think that it’s our duty to ensure that these projects and tools have alternative revenue streams, so that we can cease to be dependent on a pot of money that is most often in direct contradiction to our goals.

 


 

Despite receiving little attention, many of the projects mentioned in this piece accept donations. The following are a few that you can donate to:

  • Adium
  • Instant Messaging Freedom, Inc. (supports Adium, Pidgin, Finch, Vulture, libpurple)
  • KeePassX (see donate button on lower left side of page)
  • KDE
  • Update 2/7: A reader writes in to add OTR, saying: “People see and use the clients that integrate it, but often forget that it’s a separate project that also needs love.”
  • Update 2/8: @ageis suggests adding GPGTools.

Did I leave something out? Shoot me an email or let me know in the comments.

Notes from a talk at WerkstattB, Berlin

These are loose notes I wrote for a Women’s CryptoDinner at Thoughtworks’ WerkstattB in Berlin on February 4, 2015. I deviated considerably from them during the talk, but thought I would share them anyway.

I don’t believe that we’re doing enough right now to ensure that everyone out there has the ability to take responsibility for their online security and safety.

I’m guessing most people in this room understand why surveillance is terrifying and encryption is important. But just in case, I’m going to go over this for a few minutes.

Here’s the short version: We’ve reached a point in time where states and corporations are working together to spy on every one of us. What my government is doing, in cooperation with other governments, is mass, dragnet surveillance. Massive amounts of data are being collected on each and every one of us who uses the Internet or a mobile phone. And even if you’re among the rare few who don’t, the fact that your contacts use and carry these tools and devices means that you too are vulnerable.

Now, I am pessimistic about the state of governance in the world, but nevertheless, I believe that this is an issue that we have to tackle from a number of directions:

- Policy work
- Litigation
- Education
- Technology

It’s that last one that I want to talk about right now. Technology isn’t more important than those other fights. It’s not our sole savior. The amount of funding we have for privacy and security tools is nothing compared to the amount of funding our governments have put aside for spying on us.

But technology is the one element of the fight against surveillance that allows us to take personal responsibility for our own protection. We can’t engage in the policy fight alone, as individuals. Litigation takes an army. Education is a slow struggle. But by using privacy-enhancing technologies, each one of us can take responsibility for our own safety.

In a talk I gave with Jake Appelbaum last year, we likened this to using a condom for safer sex. The analogy is imperfect, but the idea behind it is the same: Harm reduction. In using condoms, we are minimizing the threats that sexual contact can pose. The same goes for privacy-enhancing technologies: We cannot protect ourselves perfectly, but we can minimize the threat of surveillance.

So let’s talk about harm reduction for a minute. Harm reduction is typically defined as a set of practical strategies and ideas aimed at reducing negative consequences. When we talk about HIV, this means educating the public on the risks of unprotected sex and how to mitigate those risks. But it also means being realistic about human behavior: we know people will continue to engage in sexual activity, so we must meet them where they are and offer them practical solutions to avoid infection.

This is also true for digital security, though it hasn’t always felt that way to me. In 2009, I was attending an event, a training, and was spacing out a little bit when a guy walked up behind me, showed me a piece of paper and asked “Is this your password?” It was, indeed, and not only that, but it was a rather embarrassing password made from someone’s name (they were in the room) and some numbers. The password was for my Tweetdeck installation, and at the time, Tweetdeck wasn’t using SSL, so my password was traveling over the network in plaintext.

That event scared the crap out of me. For me, a naturally competitive person who likes a challenge, that was a good thing: I quickly read up on encryption, attended a training, and learned how to use OTR. But that sort of strategy doesn’t work for everyone, and for some it can be counterproductive: There are many who understand the risks and choose to ignore them (just as some choose not to use condoms or get tested, out of fear).

And so I’ll say it again: We need to meet people where they are. This means, first, not scaring them. It also means helping them to understand the threats they face and respond appropriately.

Sure, I wish that everyone would be concerned enough about the NSA’s surveillance that they would adopt these technologies and fight back. But the truth is, there are a lot of people who just aren’t going to care about that to the same degree. There are people who aren’t concerned about their devices being searched at borders. In this age, that doesn’t mean they have nothing to hide, but it may mean they perceive the NSA to be less of a threat to them than I do.

If that doesn’t sound right, think about a different example: Coca-Cola has successfully protected their recipe for more than a hundred years. That’s a company that, for better or worse, takes information security seriously. Now, I suspect that the recipe is somewhere in a vault on a piece of paper, but disregard that for a moment. The point is: They have a vested interest in caring about security.

Now, there are other companies that care less. Converse is a great example: There are thousands of companies ripping off their brand, and Converse doesn’t give a fuck: They just keep making their iconic sneakers, and people keep buying them.

This is going to be true for individuals as well. And that’s why threat modeling is important in security: Just like a doctor shouldn’t prescribe antibiotics for every person who enters their office with a cough, neither should we prescribe the same solutions to every individual who expresses concern about their safety. Instead, we should guide them through asking themselves some questions, to understand their habits and their risks:

- What do you want to protect?
- Who do you want to protect it from?
- How likely is it that you will need to protect it?
- How bad are the consequences if you fail?
- How much trouble are you willing to go through in order to try to prevent those?

Asking these questions helps us prescribe solutions to people.

But it’s also important for us to remember that those solutions aren’t always the best fit for everyone. PGP, for example, isn’t easy, and pretending it is serves no one. It took me two years of regular use to feel confident with it, and I’m still learning new tricks all the time. Instead of starting people with PGP (which is a bit like starting with calculus before they’ve studied algebra), we can start them with simpler tools like TextSecure and encourage them to have their conversations there.

One final note: One problem in the equation is that, for years, the field of digital security has been dominated by technologists. I mean, of course it has…but for many years, I felt that this community was rather exclusive. There used to be this feeling that if you didn’t protect yourself, then you deserved to get owned. Just last summer I gave a talk and said “PGP is hard,” and got verbally attacked afterward by a young man who thought that was too discouraging of me. He argued that PGP isn’t hard. I disagree.

This part of the solution isn’t personal: In order to reach more people, we must, we absolutely must make these technologies easier to use. There are a lot more folks thinking about this now than there were just a few years ago, which is great, but we have further to go. Those of us who have the money can put it toward this goal by donating to organizations like Open Whisper Systems. For the designers in the room: you can offer your help to technologists, making their products more easily understood by users. And technologists, you can do your part by remaining open to feedback, conducting rigorous user testing, and increasing collaboration. This way, we all win.


Jillian C. York is licensed under a Creative Commons Attribution 4.0 International License.
