I would recommend that Facebook simply let users attach tags to messages, pages, and sites. For example, a ‘politics’ tag that one could use to filter out political posts. People could define a ‘fake news’ tag. Clicking a small tag icon on a post would expose a tag cloud of the most popular tags associated with it. It could be self-policing: the most popular tags float to the top and carry more weight in filters and searches.
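For the curious, here is roughly what I have in mind, as a back-of-the-envelope Python sketch. To be clear, all the names (Post, add_tag, filter_feed, the threshold) are made up for illustration, not anything Facebook actually exposes:

    from collections import Counter

    class Post:
        """Hypothetical post object; Facebook's real data model is unknown."""
        def __init__(self, text):
            self.text = text
            self.tags = Counter()  # tag -> number of users who applied it

        def add_tag(self, tag):
            """A user applies (or upvotes) a tag on this post."""
            self.tags[tag.lower()] += 1

        def tag_cloud(self, n=5):
            """Most popular tags float to the top."""
            return self.tags.most_common(n)

    def filter_feed(posts, muted_tags, threshold=3):
        """Hide posts whose sufficiently popular tags intersect the
        user's muted set. The threshold keeps one drive-by tagger
        from hiding a post single-handedly."""
        muted = {t.lower() for t in muted_tags}
        return [p for p in posts
                if not any(tag in muted and count >= threshold
                           for tag, count in p.tags.items())]

    # Usage: mute 'politics' and keep the cat photos.
    feed = [Post("Look at my cat!"), Post("Why candidate X is a disaster")]
    for _ in range(3):
        feed[1].add_tag("politics")
    feed[0].add_tag("cats")
    print([p.text for p in filter_feed(feed, {"politics"})])
    # -> ['Look at my cat!']

The threshold is the self-policing part: a tag only affects filtering once enough people have piled on, so popularity does the moderation rather than an editor.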
I’d use the shit out of this. Foodie posts, cat photos, etc. And filtering out political sites, and news sites (fake and real, for that matter). I’d kinda like my Facebook feed to be about my friends and family, but not about their choice in games.
I know enough about the history of media to know that this isn’t a new problem. Civil War era newspapers weren’t exactly subtle about their bias, and Hearst was never shy about warping reality to suit his political aims.
What I don’t know enough about is how we got out of that situation. Mainstream media was never perfect, but there does seem to have been a high point of comparatively decent journalistic standards in the ’50s-’70s; Murrow, Watergate, etc.
We’ve got some historians here; anyone got the details on how that period of semi-objective, mostly-accurate journalism came about?
The problem with your method is that most people who know how to do that, or are capable of learning how to do that, are already doing it. The average person on Facebook today is not internet/technology fluent. They have a cell phone. It has Facebook. They use it. They don’t know how any of it works, and they don’t want to know. They can’t code, and have no interest in learning. They don’t know RSS from CSS, and have no interest in learning it. And the only apps they have installed are the ones their phone came with, and a few games, and a few apps that had ads in those games, many of which are malware. Facebook is basically a fancy tv for them. And I’m absolutely NOT saying these people are stupid. They just aren’t tech geeks, and their interests lie elsewhere. If Facebook and Google don’t do something to deal with the proliferation of fake news, these people will continue to believe any story they see that confirms biases they already have, regardless of their political leanings.
That being said, I have misgivings about Facebook and Google becoming editorial boards. I know that, to a certain extent, they already are, but this would be elevating their editorial function to a whole new level. Unless they can figure out a way to make it unprofitable to create fake news in the first place, Facebook’s and Google’s own biases will be reflected in the results, and I think that gives them too much power. And I haven’t seen anyone address the issue of legitimate fake news yet. The Onion produces fake news, and sometimes their stories have gone viral among certain circles who believed them to be true. The Onion is true satire, and I don’t think anyone wants them to disappear from our news feeds. Other sites which produce satire aren’t quite as good at it, however, and while their intent is genuine satire, the result reads as just fake news. There’s a spectrum here from well-crafted satire on one end to insidious fake news intended to misinform on the other. If you’re going to filter out the fake news, someone is going to have to decide where the line between fake news and satire is, and it’s a big fat blurry line.
So basically vote (on FB) on news so that you decrease the likelihood of them influencing another vote (on presidential elections)?
Those vote systems are prone to manipulation. Plus, in order to get a robust result, you would need to expose enough people to the news, which kinda defeats the purpose. People are not likely to vote based on the truthfulness of news (which they usually have no way of knowing) but based on whether they like the news. You are likely to get a 50/50 split, or close to it.
Speaking of which: if you don’t think that Zuckerberg will monetise the control of political propaganda, you don’t know Zuckerberg.
He’s got nearly absolute control of the majority of the electorate’s news, from both mainstream and alternate sources, as well as strong influence over their perception of the views of their friends and family.
If he wants to swing an election, he can. If he wants to sell an election to the highest bidder, he can do that too. And not just in the USA.
This whole idea is based on the assumption that the result of the vote was somehow changed by fake news influencing the opinion of a significant number of people.
It also assumes that people exposed to the absolute truth would always make a dispassionate, rational decision.
Unfortunately, this is not so, especially when it comes to populist politics. What you just had in the US was one very passionate round of elections. Too many people know perfectly well that their favourite candidate is lying or distorting the truth, and yet they cast their vote simply because they hate the other guy more. People believe what they want to believe. Populists usually win by riding on some sort of protest vote.
My exposure to this sort of crap on Facebook was a primary reason I was not shocked by the election results. For a person like me at least, my being “friends” with high school classmates and distant family members with views completely unlike my own is one of my few exposures to that belief ecosystem. At least until the election results hit, I might have been happier to have blocked that stuff out, but at least I could see what was about to hit.
It also lines up with the period of the highest voter turnout. I imagine the political reporting was so good because politics were much more important to people at the time.
As a product quality professional with 15 years in the field, I’ll just point out that Facebook as an organization is not incentivized to make the right decisions about what is fake news and what is real.
In fact they are incentivized to make decisions based on profits, external business or government relationships, and a notion of “fairness” that in the best case will be centered around retaining the maximum number of users, and not around the objective truth.
With strong support from company leadership this can, at best, be a temporary “band-aid” fix, but without fixing the incentives in some robust way, it will inevitably collapse.
I predict this will be a truly epic failure, and the people who will be most angry at the results will be exactly the people who are currently outraged that Facebook didn’t implement it sooner.
Bill Ford has also publicly complained about Trump continually misrepresenting the move to Mexico, but judging by the PR response to this, he’s not going to keep arguing.