How to fix "fake news" for good? Web scraping!

20 Dec 2016

We live in a world where we consume bite-sized news and believe anything on the Internet that is “trending” and widely shared. Fake news (can we still call it news then?) has been a hot topic globally in the past few months, with Facebook’s scandals notably making the headlines. This major issue affects not only the business world but now politics as well.
So how can we discern fake from real news when even the most complex algorithms built by the biggest tech companies in the world cannot? A good first step would be simply to verify the so-called fake news and find out whether it is accurate. Journalists have been doing this for as long as the profession has existed: you verify your sources, double-check, triple-check and more if needed. With the technologies available today, this should be easier, more accurate and even cheaper.
During a 36-hour hackathon, four students from Princeton came up with a "news feed authenticity checker" which, in short:

  • classifies every post, whether a picture (such as a Twitter screenshot), adult content, or a fake, malware or fake-news link, as verified or non-verified using artificial intelligence
  • for links, it considers the website’s reputation, queries the URL against malware and phishing databases, searches the page’s content on Google/Bing, retrieves results with high confidence, and shows the user a summary of the link
  • for pictures such as Twitter screenshots, it converts the image to text, uses the usernames mentioned in the tweet to retrieve all of that user’s tweets, and checks whether the tweet in question was ever actually posted by the user

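The link-verification step above can be sketched in a few lines. This is a minimal illustration, not the students' actual code: the blocklist and the "search results" are supplied by the caller (in a real system they would come from a malware/phishing database and a Google/Bing query), and names such as `verify_link` are our own.

```python
# Sketch of the link-verification idea: block known-bad domains, then
# check whether independent search snippets corroborate the page's content.
from urllib.parse import urlparse
from difflib import SequenceMatcher

# Stand-in for a real malware/phishing database (hypothetical domains).
PHISHING_DOMAINS = {"phish.example", "malware-site.example"}

def domain_is_blocked(url, blocklist=PHISHING_DOMAINS):
    """Return True if the link's host appears in the blocklist."""
    return urlparse(url).netloc.lower() in blocklist

def content_matches_searches(article_text, search_snippets, threshold=0.6):
    """Crude confidence check: does any search snippet closely match the text?"""
    best = max(
        (SequenceMatcher(None, article_text.lower(), s.lower()).ratio()
         for s in search_snippets),
        default=0.0,
    )
    return best >= threshold

def verify_link(url, article_text, search_snippets):
    """Label a link non-verified if it is blocked or unsupported by searches."""
    if domain_is_blocked(url):
        return "non-verified"
    if content_matches_searches(article_text, search_snippets):
        return "verified"
    return "non-verified"
```

For example, `verify_link("http://phish.example/story", "Shocking claim", [])` returns `"non-verified"`, while a link whose text is echoed by high-confidence search results comes back `"verified"`. A production checker would of course use a proper search API and a more robust similarity measure than `SequenceMatcher`.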
In other words, this is simply web scraping! At InSwiGo, we offer web-scraping services to our clients for various purposes. If you would like to discuss this topic with us, or are interested in web-scraping tools for your company, please contact us here.