News 

Facial Recognition Systems Are Even More Biased Than We Thought

“Yet it can be hard to tell when a data set is biased, especially when these systems are built by homogenous teams mostly consisting of white men,” writes the author for fastcodesign.com. Even the existing tools meant to test algorithms can themselves be biased. Take what’s known as a “benchmark data set”: a collection of data used to assess an AI’s accuracy. The researcher was able to test major commercial facial recognition systems by creating a new benchmark face data set rather than a whole new algorithm. The people in the data set come from the national parliaments of the African countries of Rwanda, Senegal, and South Africa, and of the European countries of Iceland, Finland, and Sweden.
 
Source: fastcodesign.com


