
On January 21, Drake performed at the Apollo Theater in Harlem, New York, where departing attendees were shocked to find an NYPD camera pointed at their faces.

New York Times music writer Jon Caramanica tweeted Saturday night that the NYPD was filming people leaving the show at the Apollo Theater.

He included video showing an officer recording those exiting.

The NYPD said on Monday that the video would be used only for social media posting, but online privacy advocates immediately raised concerns about why departing concertgoers were targeted by the intrusive footage and called for the material to be deleted.

The five-second clip has gone viral, with 20 million views and counting.

In a statement, the department said the officer was part of the local precinct’s social media team and was recording video for a social media post about local events.

“The video will not be used for any other reason,” the department said.

Still, according to CBS News, the Surveillance Technology Oversight Project, a New York-based group that focuses on privacy and civil rights, called the video recording “deeply concerning” because concertgoers were being recorded without their consent.

The group demanded that the video be destroyed and that the NYPD reveal “if it was used for facial recognition.”

It also reiterated its call for states and cities to ban police use of facial recognition software, and the technology more broadly, at sporting events, concerts, and other public gatherings.

In a statement to Eyewitness News, the NYPD said the officer seen in the blue jacket holding the camera is part of the 28th Precinct’s social media team and was taking video for an upcoming Twitter post highlighting local community events.

According to ABC NY-7, the 28th Precinct posts highlights of local events in an effort to promote a positive relationship between the community and the NYPD.

Mayor Adams backs NYPD, dismisses privacy concerns of those ‘sitting at home’

“When you have those who are sitting at home in the corner of the room trying to find a reason to separate the NYPD from ordinary New Yorkers, then they’re going to say that,” Mayor Eric Adams said. “Bravo to this great captain in the 28th Precinct. I know this neighborhood. I know this captain. He is very community-minded and community-focused, and I commend him for that.”

While Mayor Adams may applaud well-intentioned efforts, critics continue to argue that arbitrary police surveillance sets a dangerous precedent.

“Experts believe facial recognition is particularly dangerous, more akin to nuclear or biological weapons, in that it is so deeply harmful, it has enormous potential to harm our basic human rights [and] people’s safety,” says Evan Greer, director of Fight for the Future, a digital rights organization.

Some versions of the technology have proven less accurate at distinguishing between people with darker skin tones.

And Greer says traditional law enforcement has also historically led to over-surveillance of communities of color, and fears that combining the two will amplify that effect.

Facial recognition tech is criminally flawed against darker skin tones

“Facial recognition technology tends to misidentify people of color, and especially women of color,” says Hannah Bloch-Wehba, an associate professor of law at Texas A&M who specializes in privacy, technology, and democratic governance. “And so I could see a serious concern about the kind of racial and gender implications of this kind of technology being used to screen people.”

In recent years, a number of black men have been falsely identified as suspects in criminal investigations using facial recognition software, in some cases resulting in arrests and wrongful charges.

According to NPR, facial recognition technology is currently legal in New York City, and there are no federal laws specifically addressing it.

“You can change your name, you can change your Social Security number, you can change almost anything, but you can’t change your face,” said Albert Fox Cahn, executive director of the New York-based Surveillance Technology Oversight Project (STOP). “So if your biometrics are compromised once, they’re compromised for life.”
