Reflection #6 – [09/13] – [Neelma Bhatti]

  1. Sandvig, Christian, et al. “Auditing algorithms: Research methods for detecting discrimination on internet platforms.” Data and Discrimination: Converting Critical Concerns into Productive Inquiry (2014): 1-23.
  2. Hannak, Aniko, et al. “Measuring personalization of web search.” Proceedings of the 22nd International Conference on World Wide Web. ACM, 2013.

Summary

Both papers set out to explore the invisibility of recommendation, search, and curation algorithms, and the personalisation (in some cases, discrimination) that results from it.

Sandvig et al. map traditional audit studies for detecting racial discrimination in housing onto methods for detecting the recommendation and search bias faced by users of e-commerce, social media, and search websites. Hannak et al. develop and apply a methodology for measuring the personalization of web search results presented to users, and for identifying the features driving that personalization.
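To make the measurement idea a bit more concrete, below is a minimal sketch of the kind of controlled comparison Hannak et al. describe: issue the same query from a logged-in “treatment” account and a fresh “control” account, then compare the two result lists. The metrics here (Jaccard index for content overlap, Kendall’s tau for ordering) and the placeholder result lists are my own choices for illustration, not necessarily the paper’s exact measures, and the paper’s additional controls for noise and carry-over effects are omitted.

```python
from itertools import combinations

def jaccard(list_a, list_b):
    """Content overlap between two result lists (1.0 = identical sets of URLs)."""
    a, b = set(list_a), set(list_b)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def kendall_tau(list_a, list_b):
    """Rank agreement over the URLs the two lists share (1.0 = same order, -1.0 = reversed)."""
    common = [u for u in list_a if u in list_b]
    if len(common) < 2:
        return 1.0
    concordant = discordant = 0
    for u, v in combinations(common, 2):
        same_order = (list_a.index(u) < list_a.index(v)) == (list_b.index(u) < list_b.index(v))
        concordant += same_order
        discordant += not same_order
    return (concordant - discordant) / (concordant + discordant)

# Hypothetical collection step: the same query is issued at the same time from a
# logged-in "treatment" account and a fresh, history-free "control" account.
treatment = ["url1", "url3", "url2", "url5"]   # placeholder result lists
control   = ["url1", "url2", "url3", "url4"]

print("Jaccard overlap :", round(jaccard(treatment, control), 3))
print("Kendall's tau   :", round(kendall_tau(treatment, control), 3))
```

A score well below 1.0 on either metric for a real treatment/control pair would point to personalization, provided the same gap does not also appear between two identical control accounts.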

Reflection

  • Having read the “Folk theories” [1] paper, one can’t help but wonder whether search engines also use “narcissism” as a personalisation feature, in addition to the ones examined by Hannak et al. The narcissism itself could be based on inferring similarities between the web traces, search histories, demographics, etc. of different users.
  • It would be interesting to quantitatively assess whether the level or type of personalisation differs based on the device used to log into a certain account (be it Amazon, Google, or Facebook); I know for a fact that it does. A minimal sketch of one way to run such a comparison appears after this list.
  • Algorithmic audits could also be used to investigate artificial shortages of products on e-commerce websites, created to generate false “hype” around certain products.
  • As the saying goes: “If you’re not paying for it, you become the product”. So are we (the products) even entitled to question what is being displayed to us, especially while using free social media platforms? Everything we consume in this world, from food to news to entertainment content, has a cost associated with it; perhaps in the case of these services, the cost is our personal data and our right to access the information the algorithm deems irrelevant.
  • A seemingly harmless, even beneficial, personalization strategy can result in a problem more serious than filter bubbles. Relevant products being shown in the news feed based on what we talk about with friends (I once mentioned going to Starbucks and had Starbucks-related offers all over my news feed) or what we look at (the exact products from an aisle I stood in front of for more than five minutes showed up in my news feed) invade user privacy. If algorithmic audits need to adhere to a website’s Terms and Conditions, I wonder whether any Terms and Conditions exist about not invading a user’s personal space to the point of creepiness.
  • A study to determine whether an algorithm is rigged¹ could have users tweak the publicly available settings that change the algorithm’s behavior and check whether the results still favor the platform’s owner.
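Following up on the device question above: one way to test it quantitatively would be to compute a per-query personalization score on each device (for instance, 1 minus the Jaccard overlap with a clean control account) and then check whether the difference between devices is larger than chance. Below is a rough sketch using a simple permutation test; the scores are made-up placeholders and the whole design is my own assumption, not something proposed in either paper.

```python
import random

def permutation_test(scores_a, scores_b, iterations=10_000, seed=0):
    """Two-sided permutation test on the difference of mean personalization scores."""
    rng = random.Random(seed)
    observed = abs(sum(scores_a) / len(scores_a) - sum(scores_b) / len(scores_b))
    pooled = scores_a + scores_b
    n_a = len(scores_a)
    hits = 0
    for _ in range(iterations):
        rng.shuffle(pooled)
        diff = abs(sum(pooled[:n_a]) / n_a - sum(pooled[n_a:]) / (len(pooled) - n_a))
        hits += diff >= observed
    return hits / iterations

# Placeholder per-query personalization scores (1 - Jaccard overlap with a control account),
# collected for the same queries on two devices logged into the same account.
phone_scores  = [0.35, 0.40, 0.28, 0.50, 0.33]
laptop_scores = [0.20, 0.25, 0.18, 0.30, 0.22]

p = permutation_test(phone_scores, laptop_scores)
print(f"p-value for 'personalization differs by device': {p:.3f}")
```

A small p-value would suggest the device genuinely changes how much the results are personalized, rather than the gap being measurement noise.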

¹ which probably all algorithms are, to a certain extent (quoting “Crandall’s complaint”: “Why would you build and operate an expensive algorithm if you can’t bias it in your favor?”)

[1] Eslami, M., Karahalios, K., Sandvig, C., Vaccaro, K., Rickman, A., Hamilton, K., & Kirlik, A. (2016, May). First I like it, then I hide it: Folk theories of social feeds. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 2371-2382). ACM.

