Power Dynamics and Value Conflicts in Designing and Maintaining Socio-Technical Algorithmic Processes
Recent participatory practices and value-sensitive algorithm design have contributed to building algorithmic sociotechnical systems that align with a community's values and needs. However, there is still little understanding of the conflicts that can arise between stakeholders when defining values in the early stages of algorithm design, or of the power dynamics that emerge when a community uses and maintains an algorithmic sociotechnical system over time. In this paper, we study the power dynamics and conflicts around one of our community's algorithmic sociotechnical systems, the SIGCHI student volunteer (SV) selection system, which uses a weighted semi-randomized algorithm to recruit a desired pool of volunteers. Interviews with 24 community members in various roles showed that the SV selection process, although seemingly algorithmic, is a complex sociotechnical process in which the algorithm's outputs are interpreted and adjusted by the conference organizers to reflect the values of the community while ensuring that the selected volunteers can aptly help organize the conference. This sociotechnical process, in turn, provided a stage on which the existing power hierarchy and value conflicts among the stakeholders played salient roles in determining how the process was perceived and envisioned. For instance, our participants viewed the algorithm used in the SV selection process as a power-balancer that places a check on the organizers who oversee the process. However, when we engaged participants in a participatory design exercise around the algorithm, prevalent value conflicts among them prevented a clear consensus about which values the SV selection process should prioritize, or even about what should count as a value in the first place.
Participants suggested that a potential solution might be value transparency, a form of transparency that focuses on explaining why a decision was made rather than how it was made, as a mechanism for resolving such conflicts. Based on our findings, we lay out recommendations that communities can adopt to design and sustain algorithmic systems over time, especially in the face of power issues and conflicts.