Algorithm Watchdog

Expert workshop on algorithm watchdogs and visualizing bias in search engines

Users put their trust in search engines. Biases in search results can therefore negatively affect users, from distorting their knowledge when they learn a new topic to fuelling political polarization in society. Given that it is difficult to remove bias completely, one approach is to improve user awareness of bias in search engines. Eight prototypes of bias-aware search engines were created using a participatory design methodology. These prototypes contain different features and different levels of control for the users. They have two underlying approaches: i) bias visualization, which informs users of biases found in the results, and ii) results reranking, which allows users to obtain a different set of results containing different levels of bias.
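To make the results-reranking idea concrete, the sketch below shows one possible way such a feature could work: each result carries an estimated bias score, and a user-controlled penalty determines how strongly biased results are pushed down the ranking. All names and the scoring scheme are illustrative assumptions, not the actual CyCAT prototypes.

```python
# Illustrative sketch of user-controlled results reranking (assumed design,
# not the CyCAT implementation). Each result has a relevance score from the
# engine and an estimated bias score in [0, 1]; a user-chosen penalty trades
# relevance against bias.

from dataclasses import dataclass

@dataclass
class Result:
    title: str
    relevance: float  # original engine score, higher is better
    bias: float       # estimated bias score in [0, 1]

def rerank(results, bias_penalty):
    """Order results by relevance minus a user-chosen bias penalty.

    bias_penalty = 0 reproduces the original ranking; larger values
    push more-biased results further down the list.
    """
    return sorted(
        results,
        key=lambda r: r.relevance - bias_penalty * r.bias,
        reverse=True,
    )

results = [
    Result("A", relevance=0.9, bias=0.8),
    Result("B", relevance=0.8, bias=0.1),
    Result("C", relevance=0.7, bias=0.5),
]

print([r.title for r in rerank(results, bias_penalty=0.0)])  # ['A', 'B', 'C']
print([r.title for r in rerank(results, bias_penalty=1.0)])  # ['B', 'C', 'A']
```

Exposing `bias_penalty` as a slider would give users the graded control over bias levels that the prototypes aim for.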

Through search engines and many other systems, algorithmic decisions affect our work and private lives and potentially influence societal events. However, effective oversight of such decisions is currently missing in most of these areas. As a first step towards filling this gap, we are conducting research on a potential algorithm watchdog.

Our first workshop, which focused on Algorithmic Decision Making in Recruitment and HR, produced a very stimulating discussion and helped us refine our thinking on how an algorithm watchdog could be established and might operate. As discussed at the workshop, one of our aims is to produce some published recommendations in collaboration with participants – we are currently working on a first draft of a methodology which we will share with you soon to elicit your input.

This second workshop will provide a summary of the draft watchdog methodology we have been working on since the previous workshop, along with a presentation on visualizing information bias in search engines, based on work done by colleagues in our CyCAT project. Participants will have the opportunity to interact with the prototypes of bias-aware search engines developed in the CyCAT project. The main purpose of the session will be to explore, through discussion with participants, how this methodology could be applied to monitoring bias in web search engines.

If you would like to participate, please use the following link to sign up by 13th May. 

If you would like to be added to our distribution list regarding upcoming events and news, please let us know by registering your interest on the following form:

If you have any questions, please do not hesitate to contact Dr Lena Podoletz or Prof. Michael Rovatsos.

19 May 2021, 5:00 pm (EET)

Duration: 2 hours