Dr. J. Nathan Matias on leading technology research for better digital rights

At Mozilla, we know we can’t create a better future alone. That is why each year we highlight the work of 25 digital leaders using technology to amplify voices, effect change, and build new technologies globally through our Rise 25 Awards. These storytellers, innovators, activists, advocates, builders and artists are helping make the internet more diverse, ethical, responsible and inclusive.

This week, we chatted with winner Dr. J. Nathan Matias, a professor at Cornell University leading technology research to create change and impact digital rights. He leads the school’s Citizens and Technology Lab (CAT Lab) and is the co-founder of the Coalition for Independent Technology Research, a nonprofit defending the right to ethically study the impact of tech on society. We talked with Matias about his start in citizen science, his work advocating for researchers’ rights and more.

As a professor at Cornell, how would you gauge where students and Gen Z are at in terms of knowing the dangers of the internet?

As a researcher, I am very aware that my students are one narrow slice of Americans. I teach communication and technology. I teach this 500-student class, and I think the students I teach hear about people’s concerns about technology through media and through what they see online. And they’re really curious about whether that is true and what we can do about it. That’s one of the great joys of being a professor: I can introduce students to what we know, thanks to research and to all the advocacy and journalism, and also to what we don’t know, and encourage students to help create the answers for themselves, their communities and future generations.

To go a little bit further, as a professor, what are the things that you try to instill in them, or what are the core concepts that you think are really important for them to know, and that you try to hammer home, about the internet and the social impacts of all of these platforms?

If I’m known for one thing, it’s the idea that knowledge and power about digital technologies shouldn’t be confined within the walls of universities and tech companies. Throughout my classes and throughout my work, I actively collaborate with and engage the general public to understand what people’s fears are, to collect evidence and to inform accountability. And so my students have the opportunity to see how that works and to participate in it themselves. And I think that’s especially important because, yes, people come to a university to learn and grow and to learn from what scholars have said before, but if we come out of our degrees without an appreciation for the deeply held knowledge that people have outside of universities, I think that’s a missed opportunity.

Beyond the data you collect in your field, what other types of data collection out there create change and inspire you to continue the work that you do?

I’m often inspired by people who do environmental citizen science because of the contexts many of them live in. We all live in contexts where our lives and our health and our futures are shaped by systems and infrastructures that are invisible, and that we might not appear to have much power over, right? It could be air or water, or any number of other environmental issues. And it’s similar for our digital environments. I’m often inspired by people who do data collection and advocacy and science on the environment when thinking about what we could do for our digital worlds. Last summer, I spent a week with a friend traveling throughout the California Central Valley, talking to educators, activists, organizers, farmworkers and communities working to understand and use data to improve their physical environment. We spent a day with Cesar Aguirre at the Central California Environmental Justice Network. You have neighborhoods in central California that are surrounded by oil wells, and people are affected by the pollution that comes out of those wells — some of them have long been abandoned and are just leaking. And it’s sometimes hard to convince people that you’re experiencing a problem and to document the problem in a way that can get things to change. Cesar talked about the ways people used air sensors, told their stories, created media and worked in their local councils and at the state level to document the health impacts of these oil wells and actually get laws changed to improve safety across the state. Whenever I encounter a story like that, whether it’s people in Central California, or folks documenting oil spills in Louisiana, or people just around the corner from Cornell — Indigenous groups advocating for safe water and water rights in Onondaga Lake — I’m inspired by the work that people have to do, and do, to make their concerns and experiences legible to powerful institutions in order to create change. Sometimes it’s through the courts, sometimes it’s through basic science that finds new solutions, sometimes it’s mutual aid. And often at the heart of these efforts is some creative work to collect and share data that makes a difference.

Dr. J. Nathan Matias at Mozilla’s Rise25 award ceremony in October 2023.

When it pertains to citizen science and the work that you do, what do you think is the biggest challenge you and other researchers face? And by that I mean, is it kind of the inaction of tech companies and a lot of these institutions? Or is it maybe just the very cold online climate of the world today?

It’s always hard to point to one. I think the largest one is just that we have a lot more work to do to help people realize that they can participate in documenting problems and imagining solutions. We’re so used to the idea that tech companies will take care of things for us that when things go wrong, we might complain, but we don’t necessarily know how to organize or what to do next. And I think there’s a lot that those of us who are involved in these issues can do to make people aware and create pathways — and I know Mozilla has done a lot of work around awareness raising. Beyond that, we’ve kind of reached a point where I wish companies were indifferent, but the reality is that they’re actively working to hinder independent research and accountability. If you talk to anyone who’s behind the Coalition for Independent Tech Research, I think we would all say we kind of wish we didn’t have to create it, because spending years building a network to support and defend researchers when they come under attack by governments or tech companies for accountability and transparency work, for actually trying to solve problems, is not how you prefer to spend your time. But I think that, on the whole, the more people realize that we can do something, that our perspective and experience matters and that it can be part of the solution, the better off we are in our ability to document issues and imagine a better future. And as a result, when it involves organizing in the face of opposition, the more people we’ll have on that journey.

Just looking at this year in general with so much going on, what do you think is the biggest challenge that we face this year and in the world? How do we combat it?

Here’s the one I’ve been thinking about. Wherever you live, we don’t live in a world where a person who has experienced a very real harm from a digital technology — whether it’s social media or some kind of AI system — can record that information and seek some kind of redress, or even know who to turn to, to address or fix the problem or harm. And we see this problem on so many levels, right? If someone’s worried about discrimination from an algorithm in hiring, who do you turn to? If you’re worried about the performance of your self-driving car, or you have a concern about mental health and social media this year? We haven’t had those cases in court yet. We’re seeing some efforts by governments to create standards, and we’re seeing new laws proposed. But it’s still not possible, right? If you get a jar of food from the supermarket that has harmful bacteria, we kind of know what to do. There’s a way you can report it, and that problem can be solved for lots of people. But that doesn’t yet exist in these spaces. My hope for 2024 is that, on whatever issue people are worried about or focused on, we’ll be able to make some progress towards knowing how to create those pathways. Whether it’s going to be work so that courts know how to make sense of evidence about digital technologies — and I think there are going to be some big debates there — or whether it’s going to involve the standards conversations that are happening in Europe and the U.S. around how to report AI incidents and how to determine whether an AI system is safe, or safe for certain purposes, and any number of other issues. Will that happen and be solved this year? No, it’s a longer-term effort. But how could we possibly say that we have a tech ecosystem that respects people’s rights and treats them well and is safe if we don’t even have basic ways for people to be heard when things go wrong, whether it’s by courts or companies, or elsewhere? And so I think that’s the big question that I’m thinking about both in our citizen science work and our broader policy work at CAT Lab.

There’s also a bigger problem that so many of these apps and platforms are very much dependent on us having to do something, rather than them.

Absolutely. I think a lot of people have lost trust in companies to do things about those reports, because companies have a history of ignoring them. In fact, in my very first community participatory science project in this space, which started back in 2014, we pulled information from hundreds of women who faced online harassment. We looked at the kinds of things they experienced, and at whether Twitter, back then, was responding to people’s reports. It revealed a bunch of systemic problems in how the company handled them. I think we’ve reached the point where there’s some value in that reporting, sometimes for good, and sometimes those reports are exploited for censorship purposes as well: people report things they disagree with to try to get them taken down. But even more deeply, those reports don’t get at the deeper systemic issues. They don’t address how to prevent problems in the first place, how to change the underlying logics of those platforms, or how to incentivize companies differently so that they don’t create the conditions for those problems in the first place. I think we’re all looking for the right entities, some that currently exist and some we’re going to have to create, that will be able to take on what people experience and actually create change that matters.

We started Rise25 to celebrate Mozilla’s 25th anniversary. What do you hope people are celebrating in the next 25 years?

I love that question because my first true encounter with Mozilla would have been in 2012 at the Mozilla Festival, and I was so inspired to be surrounded by a room of people who cared about making the internet and our digital worlds better for people. And it was such a powerful statement that Mozilla convened people. Other tech institutions have these big events where the CEO stands on a stage and tells everyone why what they’re doing is revolutionary. And Mozilla did something radically different, which was to create a community and a space for people to envision the future together. I don’t know what the tech innovations or questions are going to be 25 years from now — there will probably be some enduring ones about access and equity and inclusion and safety for whatever the technologies are. My hope is that 25 years from now, Mozilla will continue to be an organization and a movement that listens to, amplifies and supports a broad and diverse community to envision that together. It’s one of the things that makes Mozilla so special, and I think it is one of the things that makes it so powerful.

What is one action you think that everybody can take to make the world and their lives online better?

I think the action is to believe yourself when you notice something unusual or have a question, and then to find other people who can corroborate it and build a collective picture, whether it’s by participating in a study at CAT Lab or something else. I have a respiratory disability, and it’s so easy to doubt your own experience and so hard to convince other people sometimes that what you’re experiencing is real. And so I think the biggest step we can take is to believe ourselves, and to believe others when they talk about things they’ve experienced and are worried about, and then to use that experience as the beginning of something larger, because it can be so powerful and make such a huge difference when people believe in each other and take each other seriously.

What gives you hope about the future of our world?

So many things. Perhaps unsurprisingly for someone who does citizen and community science, I find hope every time I meet someone who is making things work under whatever circumstances they have. I think about our conversations with Jasmine Walker, a community organizer who organizes these large spaces for Black communities online and has been doing it for ages, across many versions of technology and eras of time. And just to see the care and commitment that people have to their communities and families as it relates to technology, whether it’s our collaborators who are investigating hiring algorithms or communities we’ve talked to. We did a study that involved understanding the impact of smartphone design on people’s time use, and we met a bunch of people who are colorblind and advocates for accessibility. In each of those cases, there are people who care so deeply about those around them that they’re willing to do science to make a difference. I’m always inspired when we talk, and we find ways to support the work that they’re doing by creating evidence together that could make a difference. As scientists and researchers, we are sometimes only along for part of the journey. And so I’m always inspired when I see the commitment and dedication people have for a better world.
