Review a story about fighting crime


Problem: Read the passage below. Then, answer the question.

Here's a story about fighting crime: Police in Oregon identified crime suspects from hundreds of thousands of photo records. That's not easy. But they had help from facial recognition software called Rekognition. It's made by Amazon.

Facial recognition can help police solve crimes faster. Sounds good. What could possibly go wrong?

A lot, as it turns out. Researchers at MIT and the University of Toronto tested Rekognition. The test didn't go too well. It turns out that the software can sometimes identify the wrong people. And some of it has to do with race.

The technology made quite a few mistakes. It labeled darker-skinned women as men. It did this 31 percent of the time. Lighter-skinned women were misidentified 7 percent of the time. And darker-skinned men had a 1 percent error rate. But lighter-skinned men had no errors.

What's going on here? Experts think they know. They say artificial intelligence (AI) often has the same biases as its human creators. And the researchers say Rekognition appears to favor males, especially white males.

But Amazon says the study wasn't done fairly. Matt Wood works on AI at Amazon. He said the study used "facial analysis." It did not use "facial recognition" technology. There's a difference. Wood said facial analysis can spot faces. It can also identify things like whether a person wears glasses. But facial recognition is more precise. It can match individual faces to faces in videos and images. Wood also said Amazon has updated its software. He claimed the newer version makes fewer errors.

Jacob Snow is an attorney. He works for the American Civil Liberties Union (ACLU). Snow says it's not enough to update the software. Who's to say it's bias-free? Snow said Amazon isn't taking the "really [major] concerns revealed by this study seriously."

Privacy and civil rights supporters are worried, too. They want Amazon to stop selling Rekognition to police. They're worried about discrimination against minorities. They also say officials could use the software to observe people without their permission. They're not the only ones objecting. Some Amazon investors are, too. They have also asked the company to stop selling Rekognition. They fear that errors could lead to lawsuits.

The Associated Press contributed to this story.

Which BEST describes the overall tone of the passage?

  • alarmist and exaggerated
  • highly supportive of Amazon
  • neutral but cautiously concerned
  • mocking and sarcastic
