Algorithms That Make You Think

Fourth Annual Virginia Tech Workshop on the Future of Human-Computer Interaction, April 11-12, 2019

Reading Group: Interacting with Predictions: Visual Inspection of Black-box Machine Learning Models

Adam Perer

Overview: This paper is a case study in explainable AI. The case study takes place in a hospital environment, where a predictive model estimates patients' diabetes risk. Do we use a predictive model that is enclosed in a black box? That raises the question of how its results are derived. The authors designed a system that adds visual transparency to the black-box model, highlighting how values of a feature such as ‘glucose’ drive a patient's predicted result high or low.
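One way to read that glucose view is as a partial-dependence plot: sweep a single feature across its range while holding everything else fixed, and watch how the black box's average prediction responds. Below is a minimal sketch of that idea; the dataset and gradient-boosted model are synthetic stand-ins, not the paper's hospital data.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic stand-in for the hospital data; column 0 plays the role of "glucose".
X, y = make_classification(n_samples=500, n_features=5, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X, y)  # the "black box"

# Partial dependence by hand: for each candidate value, overwrite the
# feature for every row and average the model's predicted risk.
grid = np.linspace(X[:, 0].min(), X[:, 0].max(), num=20)
for value in grid:
    X_swept = X.copy()
    X_swept[:, 0] = value
    mean_risk = model.predict_proba(X_swept)[:, 1].mean()
    print(f"feature value {value:6.2f} -> mean predicted risk {mean_risk:.3f}")
```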

Kickoff Question: a feature called ‘glucose’ is used to predict the diabetes status of patients, but Figure 3 shows a dip in predicted risk around the ‘105’ mark. Missing glucose values had been replaced by an average value to fill the gap, and this imputation changed the risk reported by the predictive model.
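To see how mean-imputation can carve such a dip into an otherwise rising risk curve, here is a toy reconstruction on synthetic data (not the paper's): all missing readings get the observed mean, which piles mixed-outcome patients onto one spot on the glucose axis, and the model learns a local dip there.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 4000
glucose = rng.uniform(60.0, 150.0, size=n)                 # observed mean lands near 105
true_risk = 1.0 / (1.0 + np.exp(-(glucose - 95.0) / 10.0)) # risk rises with glucose
y = rng.random(n) < true_risk                              # outcomes follow that risk

# Simulate the imputation: 30% of readings go missing and are replaced
# by the mean of the observed values, stacking patients whose outcomes
# reflect the whole population onto a single glucose value.
missing = rng.random(n) < 0.3
fill = glucose[~missing].mean()
glucose_imputed = np.where(missing, fill, glucose)

model = GradientBoostingClassifier(random_state=0)
model.fit(glucose_imputed.reshape(-1, 1), y)

# Predicted risk should rise with glucose but dip near the imputed value.
for g in [90, 95, 100, round(fill), 110, 115, 120]:
    p = model.predict_proba(np.array([[g]], dtype=float))[0, 1]
    print(f"glucose={g:>3} -> predicted risk {p:.3f}")
```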

It would be interesting to hear how the hospital would want to use this system. Was the system used in the hospital “because we could”, or were doctors and patients asked what they would want from the system?

Who are the users of this system? Doctors, patients, data scientists, and domain experts: each of these classes of users brings an important perspective on how to design the system, and even the algorithm.

Patients may not want to disclose information, and they could manipulate the input values, which would give bad results. However, giving patients some control could make this tool more beneficial, helping them make an informed decision about their health without having to be in the office.

Design issues: the tension between explanation and simplicity. A ‘feed-forward’ approach lets you manipulate a value BUT informs you of what the change would do to the prediction before you commit it (sketched below). How much information do we display? How many dimensions are needed to drive the visualizations (in the form of sliders)? The medical field has …
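A minimal sketch of that feed-forward preview, assuming a scikit-learn-style classifier; `preview_change` and its arguments are illustrative names, not the paper's API, and `patient` is assumed to be a 1-D NumPy feature vector.

```python
def preview_change(model, patient, feature_index, candidate_value):
    """Return (current_risk, candidate_risk) without mutating the record."""
    current = model.predict_proba(patient.reshape(1, -1))[0, 1]
    modified = patient.copy()
    modified[feature_index] = candidate_value
    candidate = model.predict_proba(modified.reshape(1, -1))[0, 1]
    return current, candidate

# A slider UI would call this on each drag and show something like:
#   "risk 0.42 -> 0.57 if glucose is set to 130"
# so the user sees the consequence before the change is actually applied.
```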

Explainable AI discussion: the goal is to understand what the black box is doing with the input, so we can understand how and why the output is produced. The black box needs to be understood, and the discussion yielded an analogy to dining out. A customer orders from a menu, waits, and is then served the meal; how different would it be to order the meal and then watch the chef create it step by step?
