Sandra (Sandy) Ordoñez is dedicated to protecting women being harassed online. Sandra is an experienced community engagement specialist, a proud NYC Latina resident of Sunset Park, Brooklyn, and a recipient of Fundación Carolina’s Hispanic Leadership Award. She is also a long-time diversity and inclusion advocate, with extensive experience incubating and creating FLOSS and Internet Freedom community tools.
These commitments and principles drive Sandra’s work as the co-founder and Director of the Internet Freedom Festival (IFF) at ARTICLE 19. Even before launching the Internet Freedom Festival, Sandra was helping to grow and diversify the global Internet Freedom community. As the Open Technology Fund’s (OTF) inaugural Director of Community and Outreach, she led the creation of OTF’s Community Lab. Before her time at OTF, Sandra was Head of Communications and Outreach at OpenITP, where she supported the community behind FLOSS anti-surveillance and anti-censorship tools. She also served as the first Communications Director for the Wikimedia Foundation.
As a researcher, Sandra has conducted over 400 expert interviews on the future of journalism, and carried out some of the first research on how Search Engine Optimization (SEO) reinforces stereotypes. She also provides consultation on privacy-respecting community marketing, community building, organizational communication, event management, program design, and digital strategy, all while serving on the boards of the Open Technology Fund, Trollbusters, and Equality Labs.
In recent months, Facebook and others have proposed the creation of oversight boards to set content moderation policies internationally. In the US, the fight to protect free expression has taken on a new urgency, with Senators Graham and Blumenthal introducing the EARN IT Act, a bill that, if enacted, would erode critical free speech protections and create a government commission with the power to codify best practices and impose criminal and civil liability on platforms that fail to meet them. With these bodies in mind, I was eager to speak with Sandy about how these proposals would impact the communities that are often the most directly affected, and the last consulted.
Nathan "nash" Sheard: What does free speech mean to you?
Oh, that's a good one. Free speech, to me, means the ability to share your thoughts, your analysis of things, your experience as a human being. Your experience can be anything from what you've lived through, to the challenges that you're facing, to your goals and hopes for the future.
The reason I'm wording it that way is that it really bothers me how free speech has been used recently. Hate speech, for me, is not free speech. That's why I'm phrasing it that way: I really think the idea of free speech is to not censor people, to let them express ideas and experiences. But it does not constitute an opportunity to direct hate at others or bring people down.
My partner is a philosophy professor, so I think of it in relation to critical thinking. I think of creating spaces that allow people to be honest and truthful, but towards improving society, not making it worse.
nash: What are your thoughts on responsible ways to navigate concerns around censorship and speech that is hateful or incites harm?
If I had that answer, I think I would be a billionaire right now. I think there's a very clear distinction between when folks are trying to debate or share information, and when they're attacking another group of people. From my perspective, when speech incites violence or hatred against another group of people, that's not free speech.
Nowadays, because of the context, because of the situation that we're living in, ensuring that speech doesn't lead to violence is really important. I think that a lot of times, cultivating healthy communities, whether they're local advocacy groups, parents, professional circles, or society in general, requires not just having these debates about what exactly free speech is, but really investing more resources and energy in creating tools that allow us to create safe spaces for folks. Once you have a safe space where people feel protected, and there are rules that each community is able to create for themselves, you know what's acceptable and what isn't.
You can have free speech without hurting others' rights, so I don't think it's a question of semantics. I think it's a question of shifting our culture to create safer spaces.
nash: What do you think about the control that corporations have over deciding what those parameters look like right now?
I think they're doing a horrible job. In fact, it gets me really angry, because a lot of the corporations that are dominating these conversations have more resources than others. So I think they really need to have a wake-up call. Communities have to start shifting resources into creating safe, healthy spaces, and these corporations have to do that as well. It's kind of like diversity and inclusion, right? Corporations may have diversity and inclusion initiatives, but that doesn't mean they really cause change or really care. It's the same in other areas. It feels as though the safety of people and community health is always the last thing they consider.
So I think that if they're going to be leaders, if they're creating these tools or spaces that they want so many people to use, they have a social responsibility to make sure that what they're offering is not going to harm society, that it's going to protect society. So it's really about them readjusting where and how they're spending resources. Obviously, it's a complex question and a complex problem, but it's not impossible. In fact, it's very, very possible. But it requires intent and resources.
nash: Are there steps that we as folks engaged with technology the way we are—and in the technology space with the intent to empower communities and users—should be taking to help reclaim that power for the users and for communities, rather than leaving those decisions to be made within the silos of corporate power?
I mean, more and more people, rightfully so, are pushing for more community-owned infrastructures. Ideas on what that will look like are really diverse, and it's really going to depend on the community. Some folks will advocate for mesh networks, others will advocate for alternatives to Facebook and Twitter.
I really think it's up to the community to decide what that looks like. We need to start brainstorming along with these communities, and finding ways to shift how we've done tech. In the past, a lot of folks had an idea, started working on it, and whether or not anybody used the resulting tool, that was the end of it. I think now we really have to, especially if we care about movements and users' right to privacy and security, start working more hand in hand with not just users, but specific communities, to empower them with options of what they can use, and to empower them to make decisions for themselves. That's a really hard shift. I feel like in some ways we're going in that direction, but it's going to take a kind of reprogramming of how we do things. There are so many things baked into that: power structures, how we relate to the world, and how others relate to us. I do think that investing more in brainstorming with communities about what the possibilities are is a really good first step.
nash: Some in the platform moderation community are looking to committees that would decide what is acceptable. We should obviously be exploring many different kinds of ideas. Still, I get concerned with the idea that a committee of folks who exist and move around the world in one context will be making decisions that affect folks in completely different contexts that they might not be able to relate to. Do you have thoughts on that strategy?
It's a really broad question, and it's hard because I think there are different situations that require different measures. But what I would say is 'localize, localize, localize'. If you have moderators that are looking over content, you have to make sure you have a variety of moderators from different cultural backgrounds, different languages, different socioeconomic backgrounds, so they really understand what's happening.
This problem requires a lot more investment, and an understanding that the world is very diverse. You can't just solve it with one thing, so that means investing in the actual communities that you're serving. Take India as an example: India is a country with multiple languages and multiple groups. That means that your moderation group, or the employees you hire to do moderation, are probably going to have to be just as plentiful and diverse. The bigger tech companies may realize how much money is actually required to solve the problem, and may not feel ready to spend it. But the problem is that while they're questioning what to do, or coming up with what they think is a simple solution to a very complex problem, it's going to impact more and more humans.
nash: I get concerned that we won't be able to move in that direction without creating a scenario where only Facebook or Twitter have the funds to execute these schemes effectively. That we'll set up guidelines that say you must do x or y, and in doing so inadvertently lock in Facebook and Twitter as the only platforms that can ever exist, because they're the only ones with the funds to comply.
Like I said, it's a very complex problem. You know, when the Internet first started, it was almost like tons and tons of small communities everywhere that weren't necessarily connected to other folks around the world. Then we moved to this phase where we are now, on these large platforms like Facebook and Twitter and Instagram, where we're connected globally. I can find anybody in the world through Facebook. I think that we're going to start seeing a shift back to more local groups and mesh networks, or whatever that may be for your community.
I think a lot of these companies are going to see a decrease in users. Lots of people in my life who don't work in tech are ready. They're just overwhelmed with the use of these platforms, because it really has impacted their real human interactions. They're looking for ways to really connect to the communities that they're part of. That doesn't mean that you won't be able to connect to folks you know across the globe, but communities can mean many different things. Communities can mean the people that live in your neighborhood, or it could be colleagues that are interested in the same topic that you're interested in.
The issue is that it's a more complex problem than just Facebook or Twitter, and honestly it really just requires rethinking how we are bringing people together. How are we creating safe spaces? How are we defining community? How do you bring them together? How do you make sure they're okay? It's a rethinking. The Internet's not that old, right? So it's not surprising that in 2020 we're starting to reconfigure how we actually want it to impact our society.
nash: I really appreciate your thoughtfulness here. Do you have any final words you would like to offer?
This is a really important time for everybody to get involved in their communities. I just see how tired people are, and we really need to build more capacity. So, whatever people can do. If they're interested in supporting an open Internet where people are secure and protected, they really, really need to start supporting the folks that are doing the work, because people are really tired and we need to build capacity, not just in existing communities, but where capacity doesn't exist. Going back to what you were saying before about platform accountability: creating a group is not going to solve it. We need to invest in people, and in people that can help us shift culture. That's it.
nash: Thank you so much.