As more people than ever go online for news, chances are they've encountered a problem with social media and search websites — whether they know it or not. The machines that decide what news we see online still aren't reliable.
Algorithms determine the articles in your Facebook feed or your Google results. The exact inner workings of these computer programs are trade secrets. But they all decide what news to show you based on variables; those could include your interests, who posted an article and even the reach of the news outlet behind the story.
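The ranking idea described above can be sketched as a weighted score. This is a hypothetical toy model: the feature names and weights here are invented for illustration, since the companies' real ranking code is a trade secret.

```python
# Toy feed-ranking sketch. The features mirror the variables mentioned
# above (your interests, who posted, the outlet's reach); the weights
# are made up and are NOT how Facebook or Google actually rank news.
WEIGHTS = {"interest_match": 0.5, "poster_affinity": 0.3, "outlet_reach": 0.2}

def score(article):
    """Combine an article's features into one ranking score."""
    return sum(WEIGHTS[f] * article[f] for f in WEIGHTS)

def rank_feed(articles):
    """Order articles so the highest-scoring ones appear first."""
    return sorted(articles, key=score, reverse=True)

feed = rank_feed([
    {"title": "Local story", "interest_match": 0.9,
     "poster_affinity": 0.2, "outlet_reach": 0.1},
    {"title": "Viral story", "interest_match": 0.4,
     "poster_affinity": 0.9, "outlet_reach": 0.8},
])
```

Even in this simplified form, the tradeoff is visible: tuning the weights toward engagement-style signals pushes different stories to the top of the feed.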
And they don't always work. After the Las Vegas shooting, Google's news algorithm briefly highlighted a 4chan hoax, which misidentified the shooter. Google says its system saw a flurry of search activity for someone's name and followed its programming.
Other times, algorithms work too well. Facebook will show you content you already agree with because you're more likely to click on it. Researchers can now measure the echo chamber this creates.
And Congress is now leading a broad investigation into whether and how Russia abused automated, loosely monitored ad algorithms on Facebook and other sites to influence the 2016 U.S. election.
In the wake of these cases, some argue Facebook and Google should take more responsibility for their code — and make it less of a black box. And because those companies have such wide influence, U.S. officials want to pry deeper into their inner workings.
Until algorithms can replace the nuance of human judgment, we probably want to keep humans involved. After the investigation into Russian ads, Facebook seems to agree. The company is hiring more people to screen its advertising.