
Are Google and Facebook trustworthy?

What do we feel when we learn that we have been let down? It is certainly not a pleasant emotion. How would you react if you discovered that the applications you use most, run by some of the world's most reputed companies, aren't worth your trust? Recently, two articles from the online forum 4Chan appeared briefly in Google's "Top Stories" section after a search for the wrongly named Las Vegas shooter. The links, which appeared as two of the three top stories on Google's search page, turned out to be the product of a deep-web conspiracy theory that had taken hold on several 4Chan message boards.

Google, in a Twitter post, apologised for amplifying the misinformation and promised algorithmic improvements to its news filters to avoid a similar situation in the future. The incident came on the heels of a similar issue at Facebook. Last Monday, the social media site's "Trending Topics" page was returning articles from the Kremlin-sponsored media outlet Sputnik News. As with Google, Facebook soon apologised for the issue and took down the pages. At a time when people increasingly turn first to giant media sites for information, particularly after a violent incident such as the Las Vegas shooting, do these companies have a moral obligation to vet the sources that appear on their platforms?

John Basl, associate professor of Philosophy at Northeastern University, was asked whether big media companies like Google and Facebook have an ethical duty to root out misinformation on their sites. He responded: "They do have an obligation to do something about misinformation, but I’m not sure they should root it out if that means removing access to it or hiding results that are judged to be misinformation. That’s not only a difficult technical task—how do we ensure, for example, that satirical pieces are not hidden or removed as misinformation—but, depending on how it is done, might compromise important values such as transparency and openness."

In his view, a better approach would be to find ways to identify or flag the reliability of questionable search results and links.

Harminder Singh
