Reflection #6 – [09/13] – [Deepika Rama Subramanian]

  1. Sandvig, Christian, et al. “Auditing algorithms: Research methods for detecting discrimination on internet platforms.”
  2. Hannak, Aniko, et al. “Measuring personalization of web search.”
  3. These papers deal with the algorithms that, in some sense, govern our lives. Sandvig et al. discuss how biases are programmed into algorithms for the benefit of the algorithm’s owner. Even organizations that wish to keep their algorithms open to the public cannot do so in their entirety because of miscreants. The authors describe the various ways algorithms can be audited – code audit, non-invasive user audit, scraping audit, sock puppet audit, and collaborative crowdsourced audit. Each of these methods has its upsides and downsides, the worst of the downsides being legal issues. It does seem like good PR to have an external auditor come by and audit your algorithm for biases. O’Neil Risk Consulting & Algorithmic Auditing calls its logo the ‘organic’ sticker for algorithms.

    While I wonder why the larger tech giants tend not to do this, I realize we are already severely reliant on their services. For example, a Google search for online shopping redirects us through one of Google’s ad services to a website, or we search for a location and see it surfaced in Maps and News. They have made themselves so indispensable to our lives that the average user doesn’t seem to mind that their algorithms may be biased towards their own services. Google was also recently hit with a 5 billion dollar fine by the European Union for breaking antitrust laws – among other things, it was bundling its search engine and Chrome apps with the Android OS.

    While in the case of SABRE the bias was programmed into the system, many of today’s systems acquire bias from their environment and the kinds of conversations they are exposed to. Tay was a ‘teen girl’ Twitter bot set up by Microsoft’s Technology and Research division that went rogue in under 24 hours. Because people kept tweeting offensive material at her, she transformed into a Hitler-loving, racially abusive, profane bot that had to be taken offline. Auditing and controlling bias in systems such as this will require a different line of study.

    Hannak et al. examine the personalization of search engines. Search engines are such an integral part of our lives that there is a real possibility we lose some information simply because the engine is trying to personalize results for us. It seems one of the main reasons this study was carried out was to guard against filter bubble effects. However, my first reaction is that personalization in search engines is genuinely useful – especially with respect to our geographical location. When we’re looking for shops, outlets, or even a city, the search engine points us to the closest one, the one we’re most likely looking for. Correlating searches with our previous searches also makes a fair bit of sense. The study further shows that queries whose answers are more strictly right or wrong (medical pages, technology, etc.) exhibit a low degree of personalization. But as far as search engine results go, personalization should be the last thing factored into ranking pages, after trust, completeness of information, and popularity of the page.
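    The core of Hannak et al.’s method is comparing the result lists that different accounts receive for the same query. As a minimal sketch of that idea (the metric choice and the example URLs here are my own illustration, not the paper’s exact procedure), one can measure how much two ranked result lists overlap with a Jaccard index – 1.0 means the two users saw the same set of results, lower values suggest personalization:

    ```python
    def jaccard(results_a, results_b):
        """Jaccard overlap between two lists of result URLs.

        1.0 means both users saw exactly the same set of results;
        values below 1.0 indicate the sets diverge (possible personalization).
        """
        set_a, set_b = set(results_a), set(results_b)
        return len(set_a & set_b) / len(set_a | set_b)

    # Hypothetical top-4 results for the same query from two profiles:
    control   = ["nytimes.com", "bbc.com", "cnn.com", "reuters.com"]
    treatment = ["cnn.com", "bbc.com", "localnews.com", "nytimes.com"]

    print(jaccard(control, control))    # 1.0 -> identical result sets
    print(jaccard(control, treatment))  # 0.6 -> the sets partly diverge
    ```

    Note that a set-based overlap ignores ranking; to detect reordering of the same results (which the paper also cares about), one would additionally compare the lists with a rank-sensitive measure such as edit distance.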
