After all these years, I’m still not on Facebook. There are plenty of reasons for that, not the least of which is the company’s seeming inability to keep its fucking hands out of people’s timelines. Say what you want about Twitter, but for whatever faults it may have, it does get that right. You choose who you would like to see tweets from, those people tweet things, those things appear in order. It’s a completely reasonable adult way to treat customers, even if some of those customers have trouble with the whole reasonable adult thing. But Facebook just can’t help itself. It has to screw about with this and that, trying to solve the world’s problems regardless of whether or not those problems actually exist.
Its latest crusade is against fake news, which to be fair is a problem that does exist, and one that Facebook had a not-so-small hand in making worse, so it does arguably have a role to play in shutting it down. But the route it appears to be taking to that end, according to a recent issue of Broadcast Dialogue, is a pretty foolish and ultimately fruitless one, I think.
Facebook began tests in the U.S. this week prioritizing news from “publications that the community rates as trustworthy,” as part of an ongoing effort to reduce fake news and clickbait. Users can still decide which stories appear at the top of News Feed with the See First feature. For publishers, the move means publications deemed trustworthy by users may see an increase in their distribution, while publications that do not score highly could see a decrease.
What could possibly go wrong, right? I mean, it’s not like anyone could ever find a way to game that system through bots or well-coordinated campaigns. And the United States, naturally, is the perfect place to test this. It is, more than any other country on Earth, the very definition of a united front. Its government is as honest as the day is long and would never think of starting silly wars with the media, which in turn would never dream of bending facts or outright making shit up in order to push a corporate overlord’s preferred narrative.
Anybody with a clue should be able to figure out how this is going to end. When you make the news into a popularity contest painted up as some sort of online democracy, the only thing that’s going to suffer is actual democracy. People are going to go on believing and spreading whatever they like, facts be damned. It was that way before there was a Facebook. Facebook just made it easier. Sure, it would be nice if Facebook could find a way to make it harder, but doing that is going to require some very tough decisions that a lot of people aren’t going to like and that may hurt Facebook’s bottom line, at least in the short term. There will need to be fact checkers and ground rules and ways to distinguish satire from lies. Do Facebook’s users have a spot in this equation? Of course they do. But it’s so much more than just clicking true or false on a never-ending multiple choice test. It’s all about media literacy and critical thinking, and try as it might, Facebook can neither teach those things nor instill them into people at will.