‘Accountability’. There is a lot of discussion about it, but it remains elusive in many real-world situations.
In fact, there is an alarming trend of people using automation in systems as an excuse to justify unfair treatment of others. Example: “Oh, we can’t do anything, it’s in the system.” Or: “I don’t have the authority to change things in the system (hence I won’t rectify your mistreatment).” How many of us recall hearing such responses from customer service, and even from government organizations? I can recount many such cases in recent times.
On one hand, this is a symptom of bad process design: there is no backdoor through which humans can fix matters. The more alarming aspect, however, is the complacency. People accept being powerless and no longer think about what is fair or reasonable; they outsource this moral judgment to the system, even though the system was never designed with morality in mind, such as the need to make exceptions to rules in the name of fairness.
Of course, we cannot have people deny responsibility for fairness on the grounds of automation. We, human beings, need to be in charge. There will always be exceptional cases, arising for humane reasons, that require manual intervention. No “law” or “rule” can be applied blindly, without considering mitigating factors. This applies to the laws of humans and the laws of algorithmic systems alike.
How can we address this problem? As observed by Dr. Dena Al-Thani, these systems need to be developed with user-centered design in order to address scenarios where responsibility is shifted to algorithms. This involves designing feedback loops and mechanisms for shared control of outcomes. The matter is not only a ‘programming thing’; it requires awareness from business managers and service designers as well. In services marketing, scholars have been referring to the ‘empowerment’ of workers since at least the 1990s [1]. Yet today there are cases where a staff member literally cannot fix a customer’s problem because they lack access to some system. Such cases require a new form of design thinking that considers not only ‘algorithmic fairness’ but also human judgment and empowerment. Flexibility for judgment calls needs to be part of organizational policy, and systems need engineered features that support exceptions to the rule, as the sketch below illustrates.
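To make that last point concrete, here is a minimal sketch, in Python, of what an engineered exception path could look like. Everything in it is hypothetical and for illustration only: the refund rule, the names automated_refund_decision and human_override, and the field names are all assumptions, not anyone’s actual system. The idea is simply that the automated rule runs by default, while an empowered staff member has an explicit, audited way to reverse its outcome.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical automated rule: refunds are approved only within 30 days.
REFUND_WINDOW_DAYS = 30

@dataclass
class Decision:
    approved: bool
    reason: str
    overridden: bool = False
    audit_trail: list = field(default_factory=list)

def automated_refund_decision(days_since_purchase: int) -> Decision:
    """The rule the system applies by default."""
    if days_since_purchase <= REFUND_WINDOW_DAYS:
        return Decision(approved=True, reason="within refund window")
    return Decision(approved=False, reason="outside refund window")

def human_override(decision: Decision, staff_id: str, justification: str) -> Decision:
    """An explicit, logged escape hatch: an empowered staff member can
    reverse the automated outcome, but must record who did it and why."""
    decision.approved = not decision.approved
    decision.overridden = True
    decision.audit_trail.append({
        "staff_id": staff_id,
        "justification": justification,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return decision

# Usage: the system says "no", but a humane exception applies.
decision = automated_refund_decision(days_since_purchase=45)
if not decision.approved:
    decision = human_override(
        decision,
        staff_id="agent-042",
        justification="customer was hospitalized during the refund window",
    )
print(decision.approved, decision.audit_trail)
```

The design choice worth noting is that the override is a first-class, logged feature rather than an informal workaround: the staff member’s identity and justification become part of the record, which keeps the human judgment accountable as well.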
(Read my previous post about algorithmic scapegoating.)
References
[1] Bitner, M. J., Booms, B. H., & Mohr, L. A. (1994). Critical Service Encounters: The Employee’s Viewpoint. Journal of Marketing, 58(4), 95–106. https://doi.org/10.1177/002224299405800408