Readings assigned
[1] Sandvig, Christian et al. (2014) – “Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms” – Paper presented at the “Data and Discrimination: Converting Critical Concerns into Productive Inquiry,” a preconference at the 64th Annual Meeting of the International Communication Association.
[2] Hannak, Aniko et al. (2013) – “Measuring Personalization of Web Search” – Proceedings of the International World Wide Web Conference (WWW 2013), 527–537.
Summary
The assigned readings mainly discuss algorithmic audits. The first paper examined how algorithms can be rigged to favor certain parties or introduce bias. The authors discussed the example of SABRE, a computerised reservation system created by American Airlines and IBM to ease airline bookings. It was later found that the search results were often biased, with American Airlines’ flights given unfair priority. This led the government to intervene and require the system to be more transparent and accessible to other airlines when information was sought. Algorithms can also be gamed, which is how the “reply girls” of YouTube became popular. The authors proposed five different algorithmic audit methods and discussed their respective effectiveness and limitations. The second paper, on the other hand, focused mainly on the effect that personalisation of an account or profile has on web search results. The authors conducted an experiment that examined features such as basic cookie tracking, the browser user-agent, geolocation, and Google account attributes. They recruited Amazon Mechanical Turk workers and observed that about 11.7% of search results differed due to personalisation.
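To make the measurement concrete, here is a minimal sketch (my own illustration, not the authors’ code) of how the difference between a personalised result list and a control list can be quantified, in the spirit of the list-comparison metrics used in this kind of study: the Jaccard index for set overlap and the edit distance for reordering. The URLs below are hypothetical.

```python
# Minimal sketch: quantifying personalisation by comparing two ranked
# result lists for the same query. Not the paper's actual code.

def jaccard_index(results_a, results_b):
    """Overlap of the two result sets, ignoring rank (1.0 = identical sets)."""
    a, b = set(results_a), set(results_b)
    return len(a & b) / len(a | b)

def edit_distance(results_a, results_b):
    """Levenshtein distance between the ranked lists (0 = identical ranking)."""
    m, n = len(results_a), len(results_b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if results_a[i - 1] == results_b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return d[m][n]

# Hypothetical top-5 results for the same query, from a logged-in profile
# and from a fresh "control" browser with no cookies or account.
personalised = ["a.com", "b.com", "c.com", "d.com", "e.com"]
control      = ["a.com", "c.com", "b.com", "f.com", "e.com"]

print(jaccard_index(personalised, control))  # 0.666... (4 shared of 6 total)
print(edit_distance(personalised, control))  # 3 (swapped ranks + one new URL)
```

Identical lists would give a Jaccard index of 1.0 and an edit distance of 0; any divergence between such a pair signals personalisation at work.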
Reflection
The need to understand how algorithms are designed and how they affect our interactions on social media and the internet in general is immense. The paper by Motahhare Eslami et al. [3] discussed how algorithms are in place to produce a personalised news feed on Facebook. Algorithms are used in all possible ways, be it sorting search results, prioritising them, filtering them, and so on. However, to those outside a particular company, the term “algorithm” remains ambiguous. The non-disclosure policies of many companies keep their algorithms mysterious to the outside world.
The authors of the first paper highlighted several algorithmic audit methods, namely the code audit, the non-invasive user audit, the scraping audit, the sock puppet audit, and the crowdsourced audit, and discussed the challenges of each. In my opinion, relying on only one method is not optimal. If we combine two or more audit methods and assign a weight to each (a sort of weighted mean), we can compute an overall “audit score” at the end. The weight assigned to each audit would need to be adjusted depending on the priority and reliability of that method. The resulting audit score would give a comprehensive idea of whether the algorithm in place is a “rigged” one or not; a minimal sketch of this scheme follows below. Algorithmic audits can also be used to demonstrate the fairness of businesses [4].
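To illustrate the proposal, here is a minimal sketch of the weighted audit score. The five method names come from the first paper, but every weight and per-method score below is a hypothetical placeholder, not a value from either paper.

```python
# Minimal sketch of the weighted "audit score" idea proposed above.
# All numbers are hypothetical placeholders for illustration only.

# Per-method findings, each normalised to [0, 1] (1 = strong evidence of bias).
method_scores = {
    "code audit": 0.2,
    "non-invasive user audit": 0.5,
    "scraping audit": 0.7,
    "sock puppet audit": 0.6,
    "crowdsourced audit": 0.4,
}

# Weights reflecting how much priority we give each method; they need not
# sum to 1, since the mean below normalises by their total.
weights = {
    "code audit": 3.0,           # most direct, but code is rarely available
    "non-invasive user audit": 1.0,
    "scraping audit": 2.0,
    "sock puppet audit": 2.5,
    "crowdsourced audit": 1.5,
}

def audit_score(scores, weights):
    """Weighted mean of the per-method scores."""
    total_weight = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total_weight

print(f"audit score: {audit_score(method_scores, weights):.2f}")  # 0.46 here
```

In practice the hard parts would be calibrating the weights and normalising each method’s findings onto a common scale; the sketch only shows the arithmetic of the weighted mean.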
The second paper examines the factors that might cause search results to be altered. Although personalisation can alter search results, the study also found that the majority of the time the results were not altered, which I found quite surprising. I believe a globally sampled dataset might tell a completely different story, and comparing the two would give a better picture of exactly how personalisation is applied. Web search personalisation is an important and effective way to ensure a great user experience; however, users need to be aware of exactly how and where their data is being used. This is where companies need to be open and transparent.
[3] Eslami, Motahhare et al. (2015) – “‘I always assumed that I wasn’t really that close to [her]’: Reasoning about Invisible Algorithms in News Feeds” – Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI 2015).
[4] https://www.wired.com/story/want-to-prove-your-business-is-fair-audit-your-algorithm/