Tuesday, November 28

More than two dozen human rights groups call on Zoom to halt emotion tracking software plans

Twenty-eight human rights organizations penned a letter to Zoom Wednesday, calling on the company to halt any plans it has for emotion tracking software aimed at assessing users’ engagement and sentiment. The groups say the technology is discriminatory, manipulative, punitive, a data security risk and based on pseudoscience. 

“Adopting the junk science of emotion detection on the Zoom platform would be a huge mistake,” Tracy Rosenberg, of Oakland Privacy, said in a statement Wednesday. “There is zero reliable evidence that a machine can accurately assess someone’s emotional state and a lot of evidence that one-size-fits-all assumptions about ‘normality’ don’t mirror human diversity and punish out-groups for differences.” 

The letter, addressed to Zoom CEO Eric Yuan, comes in response to an article published last month by the technology publication Protocol, which reported that Zoom was developing technology aimed at evaluating a user’s sentiment or engagement level. 

According to Zoom, the system, called Q for Sales, would assess users’ talk-time ratio, response-time lag, and frequency of speaker changes to track how engaged a person is. Based on these factors, Zoom would assign scores between zero and 100, with higher scores indicating higher engagement or sentiment. 
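To make the reported scoring concrete, here is a minimal, purely hypothetical sketch of how three call metrics like those described could be normalized and combined into a zero-to-100 score. The feature names, weights, normalization caps, and formula are all illustrative assumptions; Zoom has not published how Q for Sales would actually compute its scores.

```python
# Hypothetical illustration only: the metrics, caps, and equal weighting
# below are assumptions, not Zoom's actual Q for Sales method.

def engagement_score(talk_time_ratio, response_lag_s, speaker_changes_per_min,
                     max_lag_s=10.0, max_changes_per_min=6.0):
    """Combine three call metrics into a single score from 0 to 100."""
    # Clamp each raw metric into [0, 1]. More talk time and more frequent
    # speaker changes raise the score; a longer response lag lowers it.
    talk = min(max(talk_time_ratio, 0.0), 1.0)
    lag = 1.0 - min(max(response_lag_s / max_lag_s, 0.0), 1.0)
    changes = min(max(speaker_changes_per_min / max_changes_per_min, 0.0), 1.0)
    # Average the three normalized features with equal weight (an assumption)
    # and scale to the 0-100 range described in the report.
    return round(100 * (talk + lag + changes) / 3)

print(engagement_score(0.5, 2.0, 3.0))  # prints 60
```

Even this toy version shows the critics’ point: the score depends entirely on which behaviors the designer decides count as “engaged,” and anyone whose communication style differs from those baked-in assumptions is scored lower by construction.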

It’s unclear whether the company intends to follow through with its plan, and the software is not currently implemented. But groups including the American Civil Liberties Union, Fight for the Future and the Muslim Justice League are urging Zoom to “stop its exploration” of the technology entirely. 

“Our emotional states and our innermost thoughts should be free from surveillance,” senior policy analyst Daniel Leufer of Access Now said in a statement. 

The groups claim that the technology paves the way for discrimination against people with disabilities or of certain ethnicities by assuming that everyone uses the same facial expressions, voice patterns and body language to communicate. The groups also say the software could be a potential data security risk for users, making their information vulnerable to “snooping government authorities and malicious hackers.”

“It’s not hard to imagine employers and academic institutions using emotion analysis to discipline workers and students perceived to be ‘expressing the wrong emotions’ based on faulty AI,” Fight for the Future’s director of campaigns and operations Caitlin Seeley George said. 

The groups have asked Zoom to publicly respond to their request by the end of the month. 

“You can make it clear that this technology has no place in video communications,” the letter to Yuan stated. 

A Zoom spokesperson told CBS News that the company does not currently have a statement in response to the joint letter. 
