The Basque Country is implementing Quantus Skin in its health clinics after an investment of 1.6 million euros. Specialists criticise the artificial intelligence developed by the Asisa subsidiary due to its “poor” and “dangerous” results. The algorithm has been trained only with data from white patients.
Again, no.
There are perfectly mundane reasons that can explain this. Don't assume malice when stupidity (or in this case, physics) is a sufficient explanation. Darker patches on darker skin are harder to detect, just as facial features on dark skin are harder to detect in the dark, because there is literally less light to work with.
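A toy sketch of the "less light to work with" argument, using made-up pixel values rather than measured data: if a lesion darkens the skin by the same relative amount in both cases, the absolute difference the camera records is smaller on darker skin, so a fixed noise floor eats more of the signal.

```python
# Hypothetical 8-bit luminance values; the lesion reflects 30% less light
# than the surrounding skin in both cases. With a fixed sensor noise floor,
# the darker background leaves a much weaker signal for a detector.

SENSOR_NOISE = 5.0  # assumed noise floor in 8-bit pixel values

for tone, skin in [("light skin", 200.0), ("dark skin", 60.0)]:
    lesion = skin * 0.7          # same *relative* darkening in both cases
    signal = skin - lesion       # absolute pixel difference the detector sees
    print(f"{tone}: background={skin:.0f}, lesion={lesion:.0f}, "
          f"signal={signal:.0f}, signal-to-noise={signal / SENSOR_NOISE:.1f}")

# light skin: signal=60, signal-to-noise=12.0
# dark skin:  signal=18, signal-to-noise=3.6
```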
Scream racism all you want, but you're cheapening the meaning of the word, and you're not doing anyone a favor.
I didn’t, though? I think that perhaps you missed the “I don’t think necessarily that people who perpetuate this problem are doing so out of malice” part.
I didn’t invent this term.
Computers don’t see things the way we do. That’s why steganography can be imperceptible to the human eye, and why adversarial examples work even when the perturbations are invisible to humans.
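A minimal sketch of the steganography point, using classic least-significant-bit (LSB) embedding: every pixel changes by at most 1 out of 255, far below what the eye can notice, yet a program recovers the payload perfectly.

```python
# LSB steganography over a list of hypothetical 8-bit pixel values.

def embed(pixels: list[int], bits: list[int]) -> list[int]:
    """Overwrite the least significant bit of each pixel with one payload bit."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)]

def extract(pixels: list[int]) -> list[int]:
    """Read the payload back out of the LSBs."""
    return [p & 1 for p in pixels]

cover = [200, 143, 87, 90, 12, 255, 34, 66]   # made-up cover image pixels
payload = [1, 0, 1, 1, 0, 0, 1, 0]

stego = embed(cover, payload)
print(stego)                      # [201, 142, 87, 91, 12, 254, 35, 66]
print(extract(stego) == payload)  # True: invisible to us, trivial for the machine
```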
If a model is struggling at its job, it's because the data is bad, be it the input data or the training data. Historically, one significant contributor has been that datasets aren't particularly diverse, so white men end up as the default. It's why all the "AI" companies injected "ethnically ambiguous" and similar words into their prompts to coax their image generators into producing people who weren't white, and subsequently why those image generators gave us "ethnically ambiguous" memes and Black German Nazi soldiers.
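A rough sketch of that kind of blind prompt rewriting (hypothetical, not any vendor's actual code): a demographic descriptor gets spliced into the prompt to offset a skewed training set, with no check on whether it makes sense in context, which is exactly how you end up with racially diverse Wehrmacht soldiers.

```python
# Naive prompt "diversification": blindly attach a random demographic
# descriptor to the first person-noun found, ignoring the rest of the prompt.

import random

DESCRIPTORS = ["ethnically ambiguous", "Black", "South Asian", "East Asian"]

def diversify(prompt: str) -> str:
    """Prepend a random descriptor to the first mention of a person."""
    for noun in ("man", "woman", "person", "soldier"):
        if noun in prompt:
            return prompt.replace(noun, f"{random.choice(DESCRIPTORS)} {noun}", 1)
    return prompt

print(diversify("a portrait of a 1943 German soldier"))
# e.g. "a portrait of a 1943 German Black soldier" -- historically incoherent,
# because the rewrite never looks at the surrounding context
```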