You probably already know that Facebook's algorithms determine what you see in your newsfeed. But are these algorithms a neutral force that simply tallies votes for what's most popular, or are they – like their creators – biased?
An interesting thing happened recently. Several former Facebook workers claimed they were instructed to artificially insert selected stories into the trending news module, even when the algorithm that scans Facebook's billions of conversations didn't propose them – in other words, even when the stories weren't actually trending. They also said that Facebook routinely suppressed news of interest to conservative readers.
Facebook, of course, denied any suggestion of political bias.
So what can this tell us? Algorithms, driven by incorruptible data, should in principle be neutral and trustworthy. Yet even when algorithms do their jobs fairly, they still reflect human bias, simply because humans developed them: someone chose what to measure, how to weight it, and what to override.
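To make that concrete, here is a minimal, entirely hypothetical sketch of how human choices can sit on top of a data-driven ranking. All story IDs, scores, and the override lists are invented for illustration; this is not Facebook's actual system, just a toy model of the pattern the former workers described.

```python
# Hypothetical sketch: a data-driven trending score plus human-made
# override lists. Every name and number here is invented.

def trending_score(story):
    # The "neutral" part: score computed purely from engagement data.
    return story["shares"] + 2 * story["comments"]

# The human part: editorial choices baked in by developers,
# applied regardless of what the data says.
INJECTED = {"story-42"}    # stories forced into the module
SUPPRESSED = {"story-7"}   # stories kept out of the module

def rank_trending(stories, top_n=2):
    # Drop suppressed stories before ranking.
    visible = [s for s in stories if s["id"] not in SUPPRESSED]
    ranked = sorted(visible, key=trending_score, reverse=True)[:top_n]
    # Force injected stories to the top even if they didn't earn it.
    injected = [s for s in stories if s["id"] in INJECTED and s not in ranked]
    return (injected + ranked)[:top_n]

stories = [
    {"id": "story-1", "shares": 900, "comments": 50},
    {"id": "story-7", "shares": 1200, "comments": 300},  # highest data score
    {"id": "story-42", "shares": 10, "comments": 2},     # barely discussed
    {"id": "story-9", "shares": 400, "comments": 100},
]

print([s["id"] for s in rank_trending(stories)])
# The story with the best engagement data never appears, while a
# barely-discussed one lands on top -- yet the scoring math was "fair".
```

The point of the toy: the scoring function itself is fair and data-driven, but the lists around it encode someone's judgment, and a reader of the output has no way to tell the two apart.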
In the future, we will rely even more on algorithms because they will, like the rest of technology, get smarter and smarter. That is normal, and something we should be excited about – but it's worth paying attention to who their developers are.