We talk about filter bubbles on social networks (the algorithms that keep us connected to the people we feel comfortable with and the world we want to see) and their negative effects, but real-world filter bubbles, like the one in Silicon Valley, are perhaps more problematic. People become numbers, algorithms become the rules, and reality becomes whatever the data says. Facebook as a company makes these bubble blunders again and again. Its response to the ruckus over fake news is a perfect illustration of Silicon Valley's missing empathy gene. Mark Zuckerberg, one of the brightest founders and chief executives of the post-Internet era, initially took the stance that Facebook can't really play arbiter of what is real news and what is fake. It took the company a whole week to acknowledge that it could build better tools to help fight the scourge of fake news while still staying neutral.
One of the best articles I've read in the past month.