Summary
As technology has become pervasive, the mainstream programs behind its rise shape not only its technological impact but also the direction of news media and public opinion. As journalists turn to various outlets and adapt to the efficiencies technology provides, the tools they rely on may carry bias from their internal sources or optimizations and therefore introduce bias into their stories. The team evaluated multiple algorithms against four categories: prioritization, classification, association, and filtering. Combinations of these categories were then measured in a user survey to gauge how different autocomplete features bias users' opinions. From these measurements, the team also determined that popular search engines like Google specifically tailor results based on information the user has previously searched. For a typical user this makes sense; for an investigative journalist, however, these results may not accurately represent a source of truth.
Reflection
As the team noted, there is a strong tension around how much transparency an algorithm provides. These transparency gaps may stem from government concerns about protecting certain secrets. This creates a strong sense of resistance to, and distrust of, the use of certain algorithms. Though these secrets are claimed in the name of national security, the term may be misused or stretched beyond its definition for personal or political gain rather than appropriately applied. Such acts can occur at any level of government, from the lowest actors to the highest ranks.
One of the key remedies the team discussed for this potential bias in independent research is to teach journalists how to use computer systems more effectively. On its face, this merely bridges journalists to a new medium they are unfamiliar with. It could also be seen as an attempt to give journalists the footing to better understand a truly fragmented news system.
Questions
- Do you think introducing journalists to a computer science program would extend their capabilities, or would it only further channel their ideas while potentially stifling their creativity?
- Since there is a kind of monopolization throughout the software ecosystem, do you believe people are “forced” to use technologies that tailor their results?
- Given that so much technology collects user information in ways open to misuse, do you agree that such results should come with a small disclaimer acknowledging the potential preference being applied?
- Many services offer insights into cleaning up your internet trail and clearing the biases that internet services cache to deliver faster, more tailored search results. Have you personally used any of these programs or step-by-step guides to clean up your internet footprint?
- Many programs capture and record user activity, disclosing how they use the data only in a small disclaimer at the end. Many users likely never read these disclaimers for various reasons. If everyday consumers of technology could see how heavily corrected and automatically biased their results can be, do you think they would continue using these services?
To answer your first question, I was also skeptical about how computational journalism would affect the creative process. I was intrigued to learn about the computational backgrounds of people in this field. Yes, I worry that not having a thorough CS background might change the way they look at things. Understanding why an algorithm behaves a certain way is important. It would be helpful if they had a dual degree in CS and journalism (as a major or minor). In-depth CS knowledge might help them understand the technology while remaining creative, unbiased journalists.