Sunshine as antiseptic – Group pays for coverage of opaque AI | Biometric Update

    Organizers of an AI research and advocacy group in Berlin say the world is still too passive about artificial intelligence. In a bid to change that trajectory, AlgorithmWatch has awarded fellowships for reporting on algorithmic accountability.

    Individuals, groups, companies and the occasional national government are working to shed light on automated decision-making, but most of it runs on black-box algorithms, and the efforts are diffuse. That opacity hinders public knowledge and trust, as well as efforts to regulate the code.

    AlgorithmWatch officials want to address part of that problem by awarding six people (up from the originally planned five) six-month, €7,200 ($7,647) fellowships.

    They want the program to result in more reporting on AI, the algorithms themselves and not just the topic, deployed specifically in the European Union. As part of the reporting, the organizers want coverage of people whose lives are affected now by the code.

    One of AlgorithmWatch’s organizing principles is that societies cannot leave their futures up to AI and the crafters of AI.

    The six fellows are:

    Naiara Bellio, a digital rights reporter who will investigate the indiscriminate use of AI by governments, including in Spain.

    Pierluigi Bizzini, a journalist with a computer science background who has covered the social implications of automated systems on the rights of migrants, the indigent and minorities.

    Nathalie Koubayová, a PhD student fascinated with the social implications of chatbots. She plans to use her fellowship to examine the use of chatbots in mental health, ag tech and automated fact-checking.

    Jennifer Krueckeberg, who has a recent PhD in anthropology that examined the effect digital media has on young people’s personal memory practices. She plans to use her fellowship to look at how AI affects surveillance, education and daily lives, which could potentially include biometric algorithms.

    Kave Noori, a human rights lawyer who will examine the views of people with hearing disabilities on the ethics of robot interpreters.

    Sonja Peteranderl, a journalist who has written for Der Spiegel and Wired Germany, who will look into AI’s effect on the visibility of marginalized communities.


