The ethical questions that haunt facial-recognition research

Coda magazine also noted that Springer Nature sponsored the conference; the company said its role was limited to publishing CCBR proceedings, and that, after concerns were raised about past content, it had strengthened its requirements for conference organizers to comply with the publisher’s editorial policies. And Jain challenged the critique, telling Nature that attending conferences in China “does not mean that … international conference participants, like me, condone these atrocities against minorities”. Growth in surveillance there shouldn’t be a reason to “curtail scientific exchange”, he said.

Jain remains on the advisory board for CCBR 2020–21; Springer Nature is still publishing the conference abstracts. And major international computer-vision conferences have continued to accept sponsorship from Chinese firms. Just after the blacklisting, SenseTime and Megvii sponsored the 2019 International Conference on Computer Vision, and Megvii sponsored the 2020 Conference on Computer Vision and Pattern Recognition, although its logo was removed from the conference’s website after the meeting. “Conferences should avoid sponsors who are accused of enabling abuses of human rights,” reiterates Walsh. However, he notes that last year, the non-governmental organization Human Rights Watch in New York City withdrew an initial allegation that Megvii’s facial-recognition technology was involved in an app used in Xinjiang. Conference organizers did not respond to a request for comment.

Ethical checkpoints

Questionable research projects have popped up in the United States, too. On 5 May, Harrisburg University in Pennsylvania posted a press release declaring that researchers there had developed facial-recognition software “capable of predicting whether someone is likely going to be a criminal”, with “80 percent accuracy and no racial bias”. The announcement triggered a wave of criticism, as had previous studies that hark back to the discredited work of nineteenth-century physiognomists. One notorious 2016 study reported that a machine-learning algorithm could spot the difference between images of non-criminals and images of convicted criminals, the latter supplied by a Chinese police department10.

