We need to talk about the internet

By the year 2020, over 20 billion devices will be connected to the Internet — nearly three devices for every person on Earth — and most of them mobile. When we didn’t all have tiny supercomputers in our pockets, it felt a lot easier to draw the line between “irl” and “url.” Tomorrow, it’s not clear that line will even exist.

Over the past year on the IRL podcast, I’ve interviewed more than 50 people from across the internet and around the world, hoping to better understand what it means to make a life here, online, and in between realities. Before we started our first season, we almost took it for granted that there was a limit to how “real” the internet could get. But now, the issues changing the internet are the issues changing the country, and the world. We don’t have the luxury of not paying attention.

People: we need to talk about the internet. That’s exactly why we started IRL: to push this conversation forward. And I’m encouraged by the number of people — nearly 2 million so far — who’ve tuned in to IRL to listen and share their perspectives on life online. It’s hard to predict what the next year (or even the next day) holds for the internet, but looking back on our most downloaded episodes of 2018 feels like a good place to start.

We need to talk about information integrity.

The truth is out there. But online, it feels harder than ever to find. Consider this sobering statistic from a recent MIT study: on Twitter, lies are 70 percent more likely to be retweeted than facts. Somehow, the information age became the disinformation age. Where do we go from here? In IRL Episode 14, Mozilla Fellow Renee DiResta of Data for Democracy breaks down the issue: it’s not about “the truth” anymore. It’s about information integrity.

“We, for some reason, are abdicating the idea that it is even possible to ascertain what is true without YouTube saying, ‘Well, is the Earth flat? I don’t know. Maybe. Who are we to tell them what to think?’ There needs to be a pushback against this, because you can’t have a society that functions when the base assumption is that everything is fraud.”

Platforms need to start taking responsibility for information integrity. So do we, each one of us. Being wired together isn’t enough. We need human connections, philosophical connections, and intellectual connections on top of our fiber optic cables, wifi signals, and server rooms.

We need to talk about algorithmic bias.

Algorithms may be invisible. But when they misfire, the damage can be irreparable, particularly for the internet’s most vulnerable populations. In IRL Episode 12, we investigate the way algorithmic bias can marginalize minority communities, and even endanger young YouTube users with flawed — and sometimes graphic — video recommendations. According to researcher Safiya Noble, we need to start rethinking algorithms.

“We think of algorithms as objective mathematical formulas. And yet, math, computer science, computer programming — these are languages to express ideas and concepts, and as we know, language is highly subjective. And so computer language also is subjective, based on who’s writing, who’s coding, and how they want to express or point to particular ideas and concepts.”

When algorithms fail, they reveal something else besides intent: our own blind spots and biases. Now, it’s up to us to push for accountability. Because when bad code spreads disinformation and bias, it’s never something that “the algorithm did.” It’s something people did. That means we can still take charge of our algorithms. After all, they’re here to serve us, and not the other way around.

We need to talk about privacy.

For decades, the internet has come with its own cloak of invisibility. To quote the famous New Yorker cartoon, and one of the longest-running jokes online: “On the Internet, nobody knows you’re a dog.” Every day, in small and sometimes extraordinary ways, we benefit from the ability to choose our identities online; to share and explore things we might not be able to share and explore in public. We can reveal ourselves online. But it’s not always by choice, as countless users found out when they downloaded their personal data, including unsent messages and videos, from Facebook. As Tor Project’s Alison Macrina said in IRL Episode 11:

“Privacy is an essential human right. And the ability to use the internet privately is even more important than the way we’ve thought about privacy in the past, because there are so many ways that we are being surveilled and tracked when we use the internet. Too many to even know.”

A huge portion of our identity is formed in the internet’s public arena. We’ve grown so used to being monitored that we sometimes forget that there’s another self, a totally private one, that’s worth protecting. There was a dream when the internet was born, of a cartoon dog who could be anything he wanted to be when he barked online. That’s a dream worth protecting.

So, let’s talk more.

Season 3 of the IRL podcast returns this summer. I’ll see you online, until we catch up again… IRL.

Oh, and if you’ve enjoyed the IRL podcast and think others can benefit from tuning in, take a moment and vote for IRL in the Webby Awards!

