Podcast Episode 109
As humans, we’re constantly changing and evolving, always trying to become our best selves. But the internet never forgets us at our lowest. One simple search or purchase can cause online advertisements to haunt us with the same products forever, no matter how much we change.
On this episode of How to Fix the Internet, Ethan Zuckerman, a long-time friend and tech pioneer, joins EFF’s Cindy Cohn and Danny O’Brien to discuss ways to fix surveillance advertising and online speech to make the internet a better place for everyone.
You can listen to this episode on the Internet Archive and on YouTube.
One of the challenges with fixing the internet is that we don’t have enough information about what is wrong. Many big companies don’t allow researchers to gather information and data about their platforms. From cease-and-desist orders to private APIs, researchers don’t get the chance to find out precisely what is wrong with the internet, let alone how to fix it.
If we, as human beings, are allowed to change and evolve, we have to find some way to be able to outgrow our data doppelgängers.
But even without this research, we know that speech online needs to be re-imagined. Ethan helps walk us through a world where we can have better speech online, with less censorship from corporations, the government, and ourselves.
In this episode you’ll learn about:
- The challenges researchers face when gathering information and data about our relationship with social media platforms.
- Different ways to communicate with groups online and how these alternatives would improve online speech.
- Ways that third parties have tried to give users more control over social media platforms.
- How censorship, and who we worry is doing the censoring, has changed as the internet has evolved.
- The problems with surveillance advertising and alternative ideas for advertisements on the internet.
- How the Computer Fraud and Abuse Act blocks research and innovation, and how we can fix it.
- How communication on the internet has changed over time, why social media giants aren’t getting it right, and how to move forward.
Ethan Zuckerman is an associate professor of public policy, communication, and information at the University of Massachusetts Amherst. He is the founder of the Institute for Digital Public Infrastructure, a research group that is studying and building alternatives to the existing commercial internet. He is also a co-founder of Global Voices and works with social change nonprofits around the world. You can find Ethan on Twitter @EthanZ.
If you have any feedback on this episode, please email podcast@eff.org.
Below, you’ll find legal resources—including links to important cases, books, and briefs discussed in the podcast—as well as a full transcript of the audio.
This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:
Come Inside by Zep Hurme (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/zep_hurme/59681 Ft: snowflake
Perspectives by J.Lang (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/djlang59/60335 Ft: Sackjo22 and Admiral Bob
Xena's Kiss / Medea's Kiss by mwic (c) copyright 2018 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/mwic/58883
Drops of H2O ( The Filtered Water Treatment ) by J.Lang (c) copyright 2012 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/djlang59/37792 Ft: Airtone
reCreation by airtone (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/airtone/59721
Resources
Transparency and Fair Use:
- Victory for Fair Use: The Supreme Court Reverses the Federal Circuit in Oracle v. Google (EFF)
- Once Again, Facebook Is Using Privacy As A Sword To Kill Independent Innovation (EFF)
- Facebook's Attack on Research is Everyone's Problem (EFF)
Government Censorship and Intermediary Censorship:
- Courts to Government Officials: Stop Censoring on Social Media (EFF)
- Right or Left, You Should Be Worried About Big Tech Censorship (EFF)
- Content Moderation is Broken. Let Us Count the Ways (EFF)
Privacy of Personal Data:
- Facebook's New Privacy Changes: The Good, The Bad, and The Ugly (EFF)
- Clearview AI—Yet Another Example of Why We Need A Ban on Law Enforcement Use of Facial Recognition Now (EFF)
Surveillance Capitalism:
- Google Says It Doesn't "Sell" Your Data. Here's How the Company Shares, Monetizes, and Exploits It (EFF)
- Grindr and OKCupid Sell Your Data, but Twitter's MoPub Is the Real Problem (EFF)
- Behind the One-Way Mirror: A Deep Dive Into the Technology of Corporate Surveillance (EFF)
- Data Brokers are the Problem (EFF)
- Why Getting Paid for Your Data Is a Bad Deal (EFF)
Digital Public Infrastructure (EFF series):
- Introducing the Public Interest Internet (EFF)
- The Enclosure of the Public Interest Internet (EFF)
- Outliving Outrage on the Public Interest Internet: The CDDB Story (EFF)
- Organizing in the Public Interest: MusicBrainz (EFF)
- The Tower of Babel: How Public Interest Internet is Trying to Save Messaging and Banish Big Social Media (EFF)
- From Bangkok to Burlington—The Public Interest Social Internet (EFF)
Computer Fraud and Abuse Act (CFAA):
- Computer Fraud and Abuse Act Reform (EFF)
- Supreme Court Overturns Overbroad Interpretation of CFAA, Protecting Security Researchers and Everyday Users (EFF)
- Federal Judge Rules It Is Not a Crime to Violate a Website's Terms of Service (EFF)
Transcript
Ethan Zuckerman: I'm an alcoholic. I was a heavy drinker for more than 20 years. I got sober about four years ago, but the internet knows me as an alcoholic, and there is, in those many records out there, the fact that I have clicked on alcohol ads. I have bought alcohol online. The internet, in a very real way, doesn't want me to stop drinking. The fact that they know that I like to drink is actually very lucrative for them. When you think about this, this creates a really interesting ethical conundrum. It's not just that these things are creepy. It's that they're literally holding us to our worst selves, even when we try to change and work our way through the future.
Cindy: That's Ethan Zuckerman, a long-time friend and tech pioneer. From stopping social media that surveils us to ending creepy advertising, Ethan has been having and collecting ideas about how to fix the internet for a while now. And he's going to help us imagine a better future with better tools. I'm Cindy Cohn.
Danny: And I'm Danny O'Brien. Welcome to How to Fix the Internet. A podcast of the Electronic Frontier Foundation.
Cindy: Ethan, you teach about tech and social change at the University of Massachusetts Amherst, and since you have your own podcast called Reimagining the Internet, I am delighted that we are so in sync. So let's do both. Let's re-imagine the internet and fix it too while we're there. Welcome to the show.
Ethan Zuckerman: Oh, well thank you, Cindy and thanks, Danny. It's great to be with you and I'm so thrilled that we're both looking at this somewhat dark moment as an opportunity to actually make some changes and maybe not even fix, but really imagine ways in which things could be radically different. That's where I really want to push us.
Cindy: I certainly agree. I don't think you can build a better world unless you can imagine it, and so that has to be the first step. Let's start, though, by mapping the territory a little bit. What are the things that we need to fix right now about the internet?
Ethan Zuckerman: Well, one of the things that's challenging about that question is that we don't know enough about what's wrong to even have a very good answer to it. There's a lot of hypotheses out there right now that social media in particular is bad for individuals, that it may be bad or corrosive for society. We have everything from evidence that platforms like Instagram are very bad for body image for young women, to questions of whether Facebook is dividing society broadly into camps that are very angry and cannot talk to one another.
One of the tricks with this is it's really hard to verify those hypotheses because they're all happening on closed platforms that can actually restrict our access to them. Most of these platforms won't give us an API to do real research and experiments on them. When we've tried to work with platforms like Facebook to actually do these studies, they've both dragged their feet and ultimately given people bad data.
Then when people have actually tried to go out and scrape information and do studies on their own, or use data donation, companies like Facebook have gone ahead and hit them with cease-and-desist orders. We have this moment of hypothesis that the internet, particularly social media, may be doing bad things to us. But it's safe to assume at this point that these platforms may not be doing for the world what we, and what they, hope they might do.
Danny: It's super funny to me that at EFF we spent many years fighting the NSA and the government over their mass surveillance programs, and one of the tensions you have there in the courts is that it is locked away and claimed to be a state secret, and you have to depend on whistleblowers. Now I find it terribly ironic that we are in the same situation with these big tech companies, and everything has to depend on these whistleblowers, because there's no other way of finding out what's really going on.
Ethan Zuckerman: Well, let me suggest another irony, Danny. When you and I and Cindy first got involved with this work, we were really worried about state actors controlling speech. We were really worried about the Iranian government or the Chinese government censoring people. And of course, that still continues. Then we grew up a little bit and we started worrying about intermediary censorship. We worried about Facebook controlling speech. We worried about YouTube controlling speech. Then we got a little bit further, and then we worried about the intermediaries not controlling speech, right? That's the place we're at now. Now we worry about people censoring each other by being so abusive that people are driven offline. We worry about people shaping the political conversation through mis- and disinformation, which, one of the big things we're starting to learn, sometimes persuades people to go down a QAnon rabbit hole and sometimes persuades people just to get out of the space.
So we've gone from worrying about governments censoring the net, to worrying about platforms censoring the net, to now, in some cases, worrying about platforms not doing enough to censor the net. This is not how we should be running a digital public sphere.
Cindy: This is where we get to re-imagining, right? Because otherwise it ends with censor more or censor less as your two choices. As a free speech person who especially worries about marginalized voices, neither of those is a good option, right? Censor more: we know that's going to disproportionately affect marginalized voices. Censor less: we have the problem that people can't tolerate being on these platforms anymore, and that's not good either.
Ethan Zuckerman: Not only is it censor more or censor less, if we censor more, then the question is, is it the government who censors, or is it Facebook who censors?
Cindy: Yep.
Ethan Zuckerman: I'm not a big fan of the government censoring, but I'm not a big fan of Facebook censoring either. I don't particularly trust either of those entities to be involved with it. Once you have the critique, there's two things you can do. You can either try to fix it, or you can try to reimagine it. Let me give you two examples. If the problem is toxic speech online, a great fix for this is what Tracy Chou is doing with Block Party, right?
Danny: Right.
Ethan Zuckerman: She's built a whole company around adding features to Twitter that Twitter, frankly, should have. They let you do things like share block lists and block hundreds of people at a time. You can block people just because they liked or retweeted the hateful tweet about what a terrible person you are, because you can say, frankly, I don't want to engage with that person. I want to control my own space. That's a fix, right? And Twitter gets some credit. They maintain an ecosystem where she can come in as a third party developer and build that fix. Facebook won't let her do that. Facebook will go ahead and send the cease and desist if she tries to do that.
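The mechanics here are easy to picture in code. Below is a minimal sketch of the shared-block-list idea; the fetcher and blocker functions are hypothetical stand-ins, not Block Party's or Twitter's actual API:

```python
# Sketch: block everyone who engaged with a hateful tweet, then share the list.
# fetch_likers, fetch_retweeters, and block_account are hypothetical stand-ins
# for whatever real client library a tool like this would wrap.

def build_block_list(tweet_id, fetch_likers, fetch_retweeters):
    """Collect every account that liked or retweeted the given tweet."""
    engagers = set(fetch_likers(tweet_id)) | set(fetch_retweeters(tweet_id))
    return sorted(engagers)

def export_block_list(block_list, path):
    """Write the list to a plain-text file that others can import and apply."""
    with open(path, "w") as f:
        f.write("\n".join(block_list))

def import_block_list(path, block_account):
    """Apply a shared list by blocking each listed account."""
    with open(path) as f:
        for handle in (line.strip() for line in f):
            if handle:
                block_account(handle)
```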
Reimagining looks at this and says, "Wait a second, why are we trying to have a conversation in a space that's linking 300 million people? Maybe this isn't where I want to have my conversation. And you know what? I don't actually want my conversation moderated by poorly paid people in the Philippines who are flipping through a three-ring binder to figure out if speech is acceptable. What if we built social media around communities of people who want to interact with one another and want to take responsibility for governing those spaces?" Now that's huge and a very, very different way of looking at it.
Danny: So let's imagine that we could go back in time and grab the people who were hacking together Twitter, and we're able to show them all the consequences. Do you think it's possible to build a global social media system and have it work and have it not have everybody at each other's throats? Is it just a mistake that Twitter made, or is it a systemic mistake?
Ethan Zuckerman: First, I'm not sure you can build a global social media network and have it work. I think what you can do is have hundreds of thousands, or possibly even millions, of social media spaces run by communities around the world and have many of them work. I think in many ways that's a much better vision. Scale's hard. Having a set of speech rules that work for 10 people around a dinner table can be hard in and of itself. Everyone can think of a Christmas meal or a Thanksgiving meal with someone who's really politically out of line with everyone else, and the Thanksgiving rule of no politics around the table. But that's 10 people. Once you start trying to scale that to India/Kashmir, or Palestinians/Israelis, or Rohingya/Bamar, you're really wrestling with some questions that, frankly, most of the people involved with these companies are not qualified to address.
The only solution I've been able to come to out of that is a whole lot of small spaces. All of us moving between them and some of us taking responsibility for governing, at least some of the spaces that we interact in.
Cindy: I remember when EFF first criticized Mark Zuckerberg about reneging on Facebook's privacy promises. Remember, Facebook was going to be the privacy-friendly one compared to the alternatives, and then it slowly rolled back all those promises. One of the things that we demanded was: you need to give people a voice in this. What Facebook came up with was: okay, if a majority of Facebook users all around the world vote for a proposition, we'll do it. And of course, at scale, that's the kind of thing that sounds like democracy but really isn't. I agree with you, and I think there's a mix in here, because we both want things that are smaller and more manageable in size. But the other thing you said is really important: we want to be able to skip between them. We don't want to all be locked in small little walled gardens where we can't go from one place to another. Interoperability, interactions, protocols not platforms, all of these things that we think about and that we talk about on the network-y side of things become important in your world.
Danny: This has a feeling of nostalgia about it, because this is very much the idea and the feeling of the early internet. We had a lot of these small communities that did thrive, and people could move between them. But then they withered away. Faced with Facebook, they weren't able to compete. Why do you think that was? Why do you think there was a preference there, and why do you think that's not going to happen again?
Ethan Zuckerman: I think it has a lot to do with how capital markets have been structured. We've had three things happen simultaneously: we've moved from an open web where people start lots of small projects to one where it really feels like if you're not on a Facebook or a YouTube, you're not going to reach a billion users, and at that point, why is it worth doing this? Second, we've developed a financial model of surveillance capitalism, where the default model for all of these tools is we're going to collect as much information as we can about you and monetize your attention. Then we've developed a model for financing these, which is venture capital, where we basically say it is your job to grow as quickly as possible, to get to the point where you have a near monopoly on a space and you can charge monopoly rents.
Get rid of two aspects of that equation and things are quite different. Suppose you're no longer trying to build things based around advertising. Suppose you're building around subscription or, God forfend, you're actually asking people to support something on a voluntary basis or, and now I'm getting into really dangerous territory here, as a public good, where you might use taxpayer money to say, "Actually, it's really important that we have a public space in which we can have a debate and dialogue about what happens in our community." If you start opening up those models, maybe you don't need venture capital anymore to do this.
There are sustainable, long-lasting projects out there that have built very different looking communities. They're not big. They're things like Pinboard. They're things like Arena. They're things like Front Porch Forum. They operate on a very, very different scale. People don't want to give them venture capital funds. The problem, Danny, is that when a project needs to do something technically innovative, something really inventive, venture capital is a great way to get tens of millions of dollars without an immediate return. Part of what we need is some sort of mechanism so we can do technological innovation at scale and then make it possible for people to run their own community at the cost of $5 a month, rather than at the cost of tens of billions of dollars a month.
Cindy: Are there places where you see that developing? I know you're trying to develop a version of this, but I suspect there might be others as well. But tell us about that.
Ethan Zuckerman: So this is a hard one as well. This issue has really come into focus for people over the last two years or so. I started writing about this idea of digital public infrastructure, this idea that maybe our public spaces should actually be paid for with public dollars, about two years ago. In those two years, we've seen a new lab at MIT, a new lab at Harvard, the Civic Signals Project out of the University of Texas. We have the Liberty Project with 100 million dollars behind it to build a blockchain-based new social media system, with components at Georgetown and at Sciences Po in Paris. You've got interesting efforts coming out of public broadcasters in the UK and in the Netherlands. You have really interesting, what I would call the rise of the de-platformed, and in some ways that's happening in really great ways.
Assembly Four in Australia is a collective of sex workers who have built their own sex-worker-friendly Twitter called Switter, as an alternative to using American-based tools after SESTA-FOSTA, and they've gone on to build their own Backpage called Tryst, and they're amazing. They're doing co-design with sex workers, with sex workers throughout the company. But then there's the dark side of it. You have the folks at Gab who are trying to create their entire media ecosystem, including a Gab ad server and a whole component of Gab advertisers and Gab file storage, because they've been de-platformed. There's lots of things out there, and then there's the territory that I'm not quite sure how to deal with, which is a lot of the people who are trying to do this around the blockchain and around token-based networks, which is a really fascinating mix of hope and grift. I'm trying to figure out how to navigate that complex space, perhaps more on the hopeful side than on the grift side.
Danny: One thing you said earlier is we're worried about a world where these big platforms censor, and we're worried about a world where these big platforms don't censor enough.
But, one of the things that people do say is, "Well, don't we need someone who is able to put their foot down and stop the hate speech that you might see on one of these services?"
How would they do that in this new model?
Ethan Zuckerman: I don't think that hateful speech disappears in the model that I'm talking about.
Will people build horrible hateful spaces within the architecture that I'm trying to design? Yeah, absolutely. My hope is that most of us will avoid them. My hope is that those spaces will be less successful in recruiting and pulling people into those spaces.
If you're in a community of a couple hundred people and someone starts posting anti-vax propaganda, and you have house rules within your server that you don't allow anti-vax propaganda, you have options. You can confront that person. You can remove the person from the community. You can put those posts behind a click-through wall. You can do all sorts of things, because your community is the one making those speech rules. By making speech rules about what a group of people choose, rather than trying to have them be universal, not just for the U.S. but for the world, you suddenly give yourself a great deal more ability to deal with this sort of hateful speech.
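To make the contrast with universal rules concrete, here is a minimal sketch of per-community moderation; the rules, actions, and classifier are all hypothetical, not any real server's feature set:

```python
# Sketch: one small community applies its own house rules to incoming posts.
# HOUSE_RULES is local policy; a different community would write different rules.

HOUSE_RULES = {
    "anti-vax": "warn",      # put the post behind a click-through wall
    "harassment": "remove",  # remove the post and confront the author
}

def moderate(post_text, author, matches_topic):
    """Decide what to do with a post using this community's rules.

    matches_topic is a caller-supplied classifier: it takes (text, topic)
    and returns True if the post is about that topic.
    """
    for topic, action in HOUSE_RULES.items():
        if matches_topic(post_text, topic):
            return {"author": author, "action": action, "reason": topic}
    return {"author": author, "action": "allow", "reason": None}
```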
Danny: “How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.
Cindy: I wanted to talk a little bit about advertising, because I think you have some ideas about how do we get out of the surveillance advertising world we're in now?
Ethan Zuckerman: I've started thinking about this question because there are a lot of values-led organizations out there. Libraries are values-led organizations. Nonprofits are values-led organizations. Universities are values-led organizations. What if those producers of content started thinking about different ways to be able to monetize attention? I don't know that there's anything actually wrong with advertising as a particular form of monetizing content. It's a pretty good way of trying to make money from content that people have not chosen to buy, or from a service that people have not chosen to buy. What's problematic about it for me is the surveillance economy that goes with it. I'm working with a brilliant young man, Chand Rajendra-Nicolucci, on this idea of forgetful advertising, and forgetful advertising would be a voluntary opt-in system aimed at those values-driven organizations.
It would be able to do a lot of what advertising does right now, but in a single transaction. Danny, it could know what neighborhood you are in, in London, so that it could give you the right ad for Pret a Manger, but it can't know, the next time it sees you, that that's where you had lunch last time. It can't maintain the profile that says, "Well, he's the sort of person who looks at these websites, and therefore I think I want to sell him this thing." It's a voluntary system. It involves different ad servers passing different key-value pairs to figure out how to target an ad. Our guess is that it's probably less lucrative than pure, all-out targeted advertising. But again, a big part of what I'm trying to do right now is explore what's possible if you're willing to consider things beyond pure market forces.
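To make "forgetful" concrete: each ad request carries only key-value pairs describing the current context, targeting happens in that single transaction, and nothing about the user is stored afterward. A minimal sketch under those assumptions (the inventory and field names are hypothetical):

```python
# Sketch: a "forgetful" ad server. Targeting uses only the key-value pairs in
# this one request; there is no user ID, no history, and nothing is persisted.

AD_INVENTORY = [
    {"ad": "Pret a Manger lunch deal", "targets": {"city": "London", "daypart": "noon"}},
    {"ad": "Generic house ad", "targets": {}},  # fallback: matches any context
]

def serve_ad(context):
    """Pick the most specific ad whose targeting pairs all match this request."""
    matching = [a for a in AD_INVENTORY
                if all(context.get(k) == v for k, v in a["targets"].items())]
    # Prefer the ad with the most matched constraints; deliberately keep no
    # record of who asked, so the next request from this person starts from zero.
    return max(matching, key=lambda a: len(a["targets"]))["ad"]

print(serve_ad({"city": "London", "daypart": "noon"}))  # Pret a Manger lunch deal
print(serve_ad({"city": "Boston"}))                     # Generic house ad
```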
Cindy: How is this different from contextual advertising? We've talked a lot about the difference between targeted surveillance advertising, which tracks you over time, and contextual advertising, which is: if I type "I want shoes" into Google, Google can show me things about shoes, because that's where I am right now. That's what I want right now.
Ethan Zuckerman: You don't want to get rid of that context. That context is very useful. By the way, it's probably the majority of Google's business. They actually make less off of the surveillance stuff than they do off of intention-based targeting. There's another form of contextual advertising where you do an analysis of what webpage you're on. Gee, this person is reading about Facebook's metaverse, maybe I should target them with an ad for a VR rig. That to me seems legitimate as well. That's the same thing as: this person is reading the New Yorker, I bet they'd like my ad for a snooty fashion brand. Those things seem fine to me. What's not fine is this data doppelgänger we're all carrying around here.
If we, as human beings, are allowed to change and evolve, we have to find some way to be able to outgrow our data doppelgängers. It's not just that these things are creepy. It's that they're literally holding us to our worst selves, even when we try to change and work our way through the future.
Cindy: There are lots of ways in which people need to be able to grow and evolve, and also need to not be lumped in with a bunch of other people who, on some level, might look like them, but who aren't like them. This is how a lot of the bias and racial problems in some of this targeting come about.
Ethan Zuckerman: The model that I ended up suggesting two years ago, I think no one's even bothered commenting on, because it's so crazy and so improbable. But it's this idea of a tax on surveillance advertising that you use to create a fund that can support new technologies in this space. You look at that question of how you get that venture-like money to take something from 10 users to the possibility of hundreds of thousands of users on other nodes. That's not easy. Any of us who've built that understand that it's very time consuming. It requires an enormous amount of engineering. Is there a way that we can actually imagine creating pools of capital that allow innovation that doesn't have to scale to $10 billion and a billion users and cut every conceivable ethical corner along the way? Because that's where the market that we've got right now seems to be taking us.
Cindy: I want to back up just a little bit, because I love this new world and this vision, but it wouldn't be an EFF podcast if we didn't talk a little bit about law and policy.
Ethan Zuckerman: Of course.
Cindy: I wanted to ask your view of what kinds of laws and policies we need to start pointing to, to help build this world. And of course, one that I'm sure we agree on is the Computer Fraud and Abuse Act and the role it plays in blocking research and innovation. I'd love to hear a little more about that, but what are the other ones that come to mind?
Ethan Zuckerman: Well, so very specifically on the CFAA, what I have been pushing for is a safe harbor for researchers. And by researchers, I don't just mean academic researchers; I mean activists, I mean journalists, I mean citizen scientists: anyone who wants to understand what these platforms are. I think we need a safe harbor that isn't based purely on your behavior. It thinks about your intentions, and here's why. The same exercise that someone like Clearview AI went through to horrifically, unethically breach people's privacy and create a truly frightening piece of technology, a very similar project, could have been a responsibly done scraping of social media to answer some of these fundamental questions that we began the podcast with. Who's getting harmed? How would you mitigate those harms? You can't just look at the technical behavior. You have to look at the reasons behind it, and you have to have this notion of public interest and a public interest exemption. That's not the only thing I'd love to do with the CFAA. I'd love to wad the whole law up and throw it out.
Cindy: We got some good news from the Supreme Court this last term with the Van Buren case, which gave us a little bit of clarity about terms-of-service violations and disloyal-employee situations, which are fairly common. There's also Sandvig v. Barr, which protected researchers at the lower court level, but there's still so much more to do.
Ethan Zuckerman: Sandvig v. Barr is inspiring. It, I hope, is going to inspire a wave of professors trying to figure out how to take research projects and bring them in front of courts. I actually think that's a really terrific way to take advantage of having tenure, and to take advantage of trying to get clarity around those things. One of the areas where I'd really love to see some work get done is adversarial interoperability. The vision that I have been talking about as far as social media cannot work without adversarial interop. I have a tool called Gobo that we started building at MIT. We're now going through another generation of it at UMass. It's a social media browser. It's an aggregator. It pulls your social media together and it lets you sort it in different ways. If you want everything in straight chronology, you can have it. If you want it so that cute cats float to the top, you can get that too.
We can build that tool around Twitter, because they have a great API and they're pretty good about third party tools. We can build it around Mastodon. I even think we're going to be able to build it around YouTube, around Reddit, but we cannot touch Facebook. As soon as we do, they're going to send us a cease and desist, and I don't think they're within their rights to do that. One of the strategies I'm really thinking about is how we get a ruling on that. Is there a way to get a ruling where I don't have to spend five million dollars trying to figure out how to make my tool interact with Facebook, but can actually get a court to say, "No. No, you should have a right to this. You should have a right to be able to use a third party service to interact with Facebook"?
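The aggregator side of Gobo is straightforward to picture, whatever the legal hurdles around fetching the data. A minimal sketch of the merge-and-sort idea, with hypothetical fetchers standing in for real API clients:

```python
# Sketch: merge posts from several services into one feed the user controls.
# Each fetcher is a hypothetical stand-in for a real API client and returns
# dicts with at least "source", "text", and a comparable "timestamp".

def aggregate(fetchers):
    """Pull posts from every connected service into a single list."""
    return [post for fetch in fetchers for post in fetch()]

def sort_feed(posts, mode="chronological", score=None):
    """Order the merged feed the way the user asked for."""
    if mode == "chronological":
        return sorted(posts, key=lambda p: p["timestamp"], reverse=True)
    if mode == "filtered" and score is not None:
        return sorted(posts, key=score, reverse=True)
    raise ValueError(f"unknown mode: {mode}")

# Example ranking: cute cats float to the top.
def cat_score(post):
    return post["text"].lower().count("cat")
```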
Cindy: Let's talk, Ethan.
Ethan Zuckerman: Yes.
Cindy: Because I want that as well. And of course, EFF was deeply involved the last time somebody tried to build an aggregator and got sued out of existence by Facebook. But the law is changing, so let's talk. Maybe the first time a podcast launches a piece of impact litigation will be right here.
Danny: This is where we go to mute for a bit and then we come back and it's two weeks later, but yeah.
Ethan Zuckerman: I have not yet retained Cindy's counsel. You never know what happens in the future.
Cindy: Oh, I didn't know you were going to show up and drop this catnip on me.
Ethan Zuckerman: I'm pleased that you would think of it as intriguing.
Cindy: Oh, God yes.
Ethan Zuckerman: I will say, a lot of the debates that are going on right now, I don't find super interesting. I don't find the 230 debate particularly interesting, because I don't actually see a way through it that I feel particularly good about. Like I said, I can't really choose whether I would prefer having the government moderating Facebook or Facebook moderating Facebook. The answer is I would like users to be moderating users, but nothing within the 230 debate is putting that on the table. Where I'm focusing right now is on these quite technical arguments about being able to research the networks and then being able to interoperate with the networks, in the hopes that we build this future space.
Danny: I think the idea is that reform of the CFAA would give you lots of opportunity, but there would also be this baseline, so Facebook couldn't just say to you, "No, we're going to sue you from the get-go." They would actually be obliged to give you some connectivity into their network. Do you think that works?
Ethan Zuckerman: It's interesting. We saw the emergence of a consortium between people you would not think would be at the table together, Google, Facebook, Apple, Twitter, around exporting your data, so essentially leaving a service. That, I think, was done in good faith and in good spirit. The only problem with it is that when you leave Facebook, your data's really only useful if you're going to another Facebook. Leaving is not actually the interesting thing. It's being able to interoperate with each of the small blocks of content. You want to have that export path for each thing that someone posts on Facebook or posts on Twitter, and you want to be able to interoperate with those. The danger in doing this poorly is that you end up with some minimum interoperability standard that's so high that new networks can't connect to it, that it takes you so long to be able to interoperate with it. That's one of the things you have to fight against.
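One way to picture interoperating "with each of the small blocks of content": each post becomes a self-contained, portable record that another network can import on its own, rather than one bulk archive that is only useful to another Facebook. A hypothetical sketch of such a record (illustrative only, not any real standard, though protocols like ActivityPub tackle the same problem):

```python
# Sketch: a single post packaged as a self-contained record that any network
# could import one item at a time. The schema and field names are hypothetical.

import json
from datetime import datetime, timezone

def export_post(author, text, replies_to=None):
    """Package one post so it can move between networks on its own."""
    return {
        "schema": "portable-post/0.1",   # hypothetical schema tag
        "author": author,                # e.g. "alice@smallnetwork.example"
        "text": text,
        "published": datetime.now(timezone.utc).isoformat(),
        "replies_to": replies_to,        # reference to a post on any network
    }

record = export_post("alice@smallnetwork.example", "Hello from a small network!")
print(json.dumps(record, indent=2))
```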
The other thing that you have to think about, and this is an area that I think none of us really likes spending a lot of our time on: I don't think you can responsibly run a social network these days without some way of dealing with things like child sexual abuse imagery. Wherever you want to draw the lines, maybe it's around terrorism, certainly it's around child abuse imagery, there have to be some central resources where you can put up fingerprints of your images and say, "I need to take this down."
Even in the circumstances that I'm talking about, there's no way to deal with a community that decides that child porn's a great thing and we're going to trade it back and forth, without having some of these central resources that you can work against. If you really were working toward this decentralized world, you'd want some combination of the mandatory interop, without too high a bar, and some sort of those collective resources that we in the field all work together on maintaining and feeding. With auditability: I understand that all those resources need to be audited and checked, so you don't end up with an irresponsible blacklist. Those feel like areas in which some really good work could get done.
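The "central resource of fingerprints" is, at its simplest, a shared and auditable set of image hashes that every participating server, however small, checks uploads against. A minimal sketch of that flow; it uses exact SHA-256 hashes for illustration, whereas real systems use perceptual hashes (such as PhotoDNA) so that resized or re-encoded copies still match:

```python
# Sketch: screen uploads against a shared, auditable list of known-bad
# fingerprints. Exact SHA-256 is illustrative only; production systems use
# perceptual hashing so near-duplicate images still match.

import hashlib

SHARED_BLOCKLIST = set()  # fingerprints contributed by participating servers

def fingerprint(image_bytes):
    return hashlib.sha256(image_bytes).hexdigest()

def report_image(image_bytes):
    """A participating server contributes a fingerprint to the shared resource."""
    SHARED_BLOCKLIST.add(fingerprint(image_bytes))

def upload_allowed(image_bytes):
    """Every server can screen an upload against the shared list."""
    return fingerprint(image_bytes) not in SHARED_BLOCKLIST
```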
Cindy: I really think that we need to broaden our perspective about the problem, not just to the part where we find it, but to the part where we actually prosecute the people who are sharing it. Right now our resources are really disproportionately aimed at the former and not enough at the latter. I agree with you, we have to think about these problems, but I also feel like sometimes because there's tech, we only focus on the tech part of the problem and I just don't want us to fall into that.
Ethan Zuckerman: No, you're absolutely right. It doesn't solve the problem, which is that people continue to abuse children and produce this imagery. That's a space where the tech solution alone doesn't get you there.
Cindy: So you know, we've talked a lot about this world, this better re-imagined world where we've got smaller spaces, but people can move between them. What are the kinds of values that we're going to get if we get to this world?
Ethan Zuckerman: I don't know if we're going to get this, but it's a hypothesis of mine. Robert Putnam, in his maybe too-famous book Bowling Alone, mourns the death of democracy based on the fact that we don't join Elks Lodges and we don't go bowling together anymore. You can read this as an overblown anti-internet book arguing that we should all favor real-world interaction, but here's another way to read it. Those organizations, those community-driven organizations, were where a lot of us learned how to do small-d democracy. We learned how to run a meeting. We learned how to manage a budget. We learned how to host a group discussion. We learned how to get people to work together to do things. We've lost some of that.
I think one of the best ways that people are learning those small-d democratic skills is by doing things like being moderators on Reddit, trying to figure out how to run virtual spaces that they actually have control over. One of the consequences that I believe might come out of the networks that I'm talking about, which are small, independent, federated, interlocking, but ultimately self-governing, is that we might become better citizens and better neighbors. We might actually learn again what those skills are for having a conversation with people that's constructive, rather than outsourcing the problem to poorly paid moderators in another country.
Ethan Zuckerman: I really do think that the answer is not just finding the next greenfield; it's finding a very different way to do it. I guess I've just gotten very sick of the conversations that start with, "Oh, good. Well, first fix global capitalism." Well, yeah, actually do fix global capitalism, and maybe a simple start on that would be to say some part of the market can be public goods, and not everything needs to be a commodity and not everything needs to be privately provisioned. It's interesting, that doesn't carry at all in the U.S. When I say it in academic circles, everyone laughs it off. It's actually taken dead seriously in Europe. Honestly, for me, that's the most awkward thing about my life right now: I would really like to be in the Netherlands about half the time at the moment, because it's where I see traction for this.
Danny: Ethan Zuckerman, we could talk all day, but we have to finish. Thank you so much for joining us.
Ethan Zuckerman: Oh, my gosh. What a pleasure. It's so nice to talk to both of you. I miss you both terribly and I'm really thrilled we could have this time to be together.
Cindy: Me too.
Cindy: Oh, it is just so fun to talk to Ethan. He is so thoughtful about all of these things and the enthusiasm he brings for the possibilities of a smaller future internet, it's really infectious.
We need to think about building smaller, more manageable places online, but also making sure that people can move between them.
Danny: I think that it's all a question of scale. I think he makes this great point that everything has ballooned up really because of the unique business model and venture capital funding model of the early internet. I love that analogy with dinner parties: we can wrap our heads around how to host a dinner party that doesn't end in a flame war. Sometimes it doesn't work, but the damage is limited, and we learn and we get better at hosting dinner parties, or at least I hope I will.
Cindy: I think that one of the tensions here is that, if we go back to Larry Lessig's four levers of power, we haven't developed the kind of strong social norms online that will help people figure out how to behave and also be accountable when they misbehave. People are reaching for markets, and they're reaching for law, and they're reaching for other ways to try to control this, on both sides. Right now, we're in a moment where too much corporate censorship and not enough corporate censorship are both complaints that people have. I love how Ethan pulls this apart and helps us think about our history and where we are. But at the end of the day, humanity's never figured out who we should give this powerful censorship tool to, especially if it's going to reach across communities and across different societies. The best thing is to have communities that set up ways that they talk to each other, that everybody learns and can abide by, without somebody having to play the censor from on high.
Danny: Well, thanks for joining us on How to Fix the Internet. If you like what you hear, please tell a friend and follow our podcast. You can also follow Ethan's podcast. It's called Reimagining the Internet. And thanks to Nat Keefe and Reed Mathis of Beat Mower for making the music for this podcast. Additional music is used under a Creative Commons license from ccMixter. You can find the credits and links to the music in our episode notes.
Please visit eff.org/podcasts, where you'll find more episodes, learn about these issues, donate to become a member of EFF, and lots more. Members are the only reason we can do this work. Plus, you can get cool stuff like an EFF hat, an EFF hoodie, or an EFF camera cover for your laptop camera. How to Fix the Internet is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. I'm Danny O'Brien.
Cindy: And I'm Cindy Cohn. Thanks for listening.