Twitter and Facebook: unfck the algorithms

Our socially distant reality is pretty damn weird, let’s be honest. Social networks shouldn’t make it any weirder — or more dangerous.

And yet they are making it more dangerous while promising to “bring the world closer together.” Extremists are finding each other in Facebook groups to plan insurrections and other threats to civic life. Facebook has to do better.

Over on Twitter, bots and organized mobs have all-too-easily hijacked trends to spread dangerous misinformation and hate speech. Like this and this. Twitter too has to do better.

Safe to say, if you’re reading this, people who spread misinformation and hate aren’t your favorites. That’s why we need to unfck Twitter and Facebook now, before they warp civic life any further.

Can we do it? We have to. Social networks have shown they will respond if we apply enough heat. They have taken action to deal with the problems they’ve helped cause, but too often it’s been too little, too late.

So let’s unfck some of the algorithms that use data to divide us and drive hate. We can start with the actions below, at least until the election is complete and certified. Here is what you can do:

For Twitter, head over here to tweet @CEO Jack Dorsey that trending topics need to stop for now. Just click and share. With any luck, there will be so many of us that we’ll trend ourselves. And the Mozilla Foundation is boosting the campaign with a full-page ad in the 10/20 edition of the Washington Post. Check it out.

For Facebook, go here to tell Mark Zuckerberg to stop recommending dangerous groups. You’d think he’d get the picture after so many groups have caused so many problems. But alas, some people just need to be told something over and over before they’ll act. If you want to go a step further with Facebook, you can install Facebook Container, which helps prevent Facebook from tracking you around the web.

Want to know more?

Trending Topics

Twitter helped give “viral” a new meaning: something so popular it becomes contagious across the service, something everybody feels they have to share.

Twitter captures the zeitgeist through Trending Topics, but misinformation experts have flagged this feature as one mechanism by which misinformation can go viral before fact-checkers even notice.

Trending topics are “tailored for you based on who you follow, your interests, and your location,” according to Twitter’s FAQ. But hashtags associated with conspiracy theories have made the lists. So have state propaganda and Covid-19 conspiracies.

The problem is so rampant, even Sacha Baron Cohen felt the need to weigh in: “Twitter is a super-spreader of lies and hate,” the comedian and filmmaker tweeted.

Twitter does appear to be trying. It has announced that it will add context around trends, relying on a combination of algorithmic input and its curation team to decide which tweets are reflective of a trend. Additional descriptions written by its curation team promise to be clearly sourced, but that is unlikely to limit the spread of misinformation from trending topics.

What’s trending on Twitter shouldn’t be able to distort the U.S. political debate. Twitter must take action.

Facebook group recommendations

Facebook is advertising the value of its groups, which let people join small, intimate conversations organized around a topic or cause. What sounds good in theory has become a breeding ground for conspiracies and even violence.

Automated recommendation systems have suggested groups supporting QAnon and other conspiracies. They are also where several people planning violence have met and coordinated, including the group now under investigation for planning to kidnap the Michigan governor.

Facebook has stopped recommending QAnon groups, but it has known about this problem for years while extremism grew on the platform. In fact, the company has heavily promoted Groups over the last several years, even though in 2016 researchers presented evidence to the company showing that “64% of all extremist group joins are due to [Facebook’s] recommendation tools” and that, in other words, “[Facebook’s] recommendation systems grow the problem.”

Facebook groups can’t be a breeding ground for extremism, especially this election season, when the temperature is already at a boil. That is why Mozilla is demanding that Facebook act now.

Check out more ways you can unfck the system

Share on Twitter